

You make AI voice generation sound like it’s a one-step process, “clone voice X.” While you can do that, here’s where it’s heading in reality:
“Generate a voice that sounds like a male version of Scarlett Johansson.”
“That sounds good, but I want it to sound smoother.”
“Ooh, that’s close! Make it slightly higher-pitched.”
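For a sense of what that loop looks like as code, here's a minimal sketch. The VoiceDesignClient and VoiceParams names, fields, and methods are all hypothetical, not any real service's API; the point is only the describe-then-tweak pattern.

```python
# Hypothetical sketch of prompt-driven voice design. Nothing here is a real
# service's API; it only illustrates the "describe, listen, tweak" loop.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class VoiceParams:
    description: str          # free-text prompt, e.g. "warm male voice"
    smoothness: float = 0.5   # 0..1, higher = smoother delivery
    pitch_shift: float = 0.0  # semitones relative to the base render


class VoiceDesignClient:
    """Placeholder for a text-to-voice-design service."""

    def render(self, params: VoiceParams) -> bytes:
        # A real client would return synthesized audio for these settings.
        return b""


client = VoiceDesignClient()

# Start from a description, then nudge it one small step at a time.
voice = VoiceParams(description="warm male voice, mid-Atlantic accent")
voice = replace(voice, smoothness=0.8)    # "I want it to sound smoother"
voice = replace(voice, pitch_shift=+1.0)  # "make it slightly higher-pitched"

audio = client.render(voice)
```

After enough of those small nudges, no single step ever referenced a real person, which is exactly what makes the legal question below so murky.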
In a process like that, do you think Scarlett Johansson would have legal standing to sue?
What if you started by cloning your own voice, but after many tweaks the end result sounds similar to Taylor Swift? Does she have standing?
In court, you’d have expert witnesses saying they don’t sound the same. “They don’t even have the same inflection or accent!” You’d have voice analysis experts saying their voice patterns don’t match. Not even a little bit.
But about half the jury would be like, “yeah, that does sound similar.” And a completely innocent person could be found liable.


Like I said initially, how do we legally define “cloning”? I don’t think it’s possible to write a law that prevents it without also creating vastly more unintended consequences and problems.
Let’s take a step back for a moment and ask a more fundamental question: do people even have the right to NOT have their voice cloned? To me, cloning is impersonation, which is perfectly legal in the US, as long as you don’t claim it’s actually that person. Claiming it’s really them would be fraud.
In the US—as far as I know—it’s perfectly legal to clone someone’s voice and use it however TF you want. What you can’t do is claim that it’s actually that person because that would be akin to a false endorsement.
Realistically, from what I know about human voices, this is probably fine. Voice clones aren’t that good. The most effective method is to clone a voice and run it through a voice changer driven by a voice actor who can mimic the original person’s accent and inflection. But even that has flaws that a trained ear will pick up.
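To be concrete about that voice-changer approach, the pipeline is roughly the sketch below. ContentEncoder and TimbreModel are stubs I made up for illustration, not a real library's API; the idea is that the actor's performance supplies the accent and inflection while a model trained on the cloned voice supplies the timbre.

```python
# Rough sketch of the voice-actor-plus-voice-changer pipeline described above.
# ContentEncoder and TimbreModel are hypothetical stubs, not a real library.
import numpy as np


class ContentEncoder:
    """Stub: extracts speaker-independent content/prosody features."""

    def encode(self, audio: np.ndarray) -> np.ndarray:
        return audio  # a real encoder would return feature frames


class TimbreModel:
    """Stub: a conversion model trained on the cloned (target) voice."""

    def apply(self, features: np.ndarray) -> np.ndarray:
        return features  # a real model would resynthesize audio in the target timbre


def convert(actor_performance: np.ndarray, target: TimbreModel) -> np.ndarray:
    # The actor's delivery carries the inflection; the model swaps in the timbre.
    features = ContentEncoder().encode(actor_performance)
    return target.apply(features)
```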
Ethically speaking, there’s really nothing wrong with cloning a voice. From an ethics standpoint it’s a non-issue: there’s no impact. It’s just a different way of speaking or singing.
It feels like it might be bad to sing a song in something like Taylor Swift’s voice, but in reality it’ll have no impact on her or her music-related business.