YouTube could use AI to make you sound like your favorite music artist

Sure, you idolize the music artists who fill streaming services, but wouldn't you like to sound like them? That's something YouTube wants to help you do with an AI tool it's working on. However, the company is facing some hurdles along the way.

We're all familiar with AI voice cloning by now: artificial intelligence can convincingly emulate a person's voice. It's another technology we first saw in sci-fi movies, and now that it's a reality, its effects are already being felt. We all remember the fake song imitating Drake and The Weeknd, and CD Projekt Red used AI to recreate the voice of one of its deceased voice actors.

YouTube is working on an AI tool to make you sound like your favorite artists

At the moment, we have no idea if this tool will actually see the light of day. Sources close to the matter, who have chosen to remain anonymous, told Bloomberg (via Phone Arena) that the company has been reaching out to different record labels to discuss the tool.

So, the company is asking record labels whether they'd be happy with YouTube using their artists' voices to train its AI models. That sounds like a steep ask, and as of yet, none of the labels has given YouTube the thumbs-up.

This is a controversial move

AI is nothing if not controversial. At this point, we're starting to see its effects on the tech, music, and art industries, among others. Not everyone is happy with the course AI is taking, and for good reason.

It's easy to see why this tool is an iffy move. YouTube is contacting the record labels and music companies, but we also have to consider how the individual artists feel about it. If you've built a career on your voice, you wouldn't want to hand it over for everyone to use. Yet it seems that if a music company accepts YouTube's offer, its artists won't have a say in the matter.

In any case, we should hear more about this feature as time goes on. For all we know, the tool may never happen at all.
