How will upcoming AI legislation around the world, like voice-cloning prevention and disclosure requirements for technical details of models, affect open-source or self-hosted models?

  • Riskable@programming.dev · 4 days ago

    How do you implement voice cloning prevention? Human voices aren’t that unique. Also, AI voice cloning isn’t perfect. So… at what threshold is a voice considered “cloned” from a legal perspective?

    I mean, people couldn’t tell the difference between Scarlett Johansson and OpenAI’s “Sky” voice, which was not cloned.
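
    For what it’s worth, any legal “threshold” would in practice get reduced to something like a similarity score between speaker embeddings, and the cutoff is arbitrary. A toy sketch (the 4-dimensional embedding vectors and the 0.85 threshold are completely made up for illustration; real systems use learned embeddings with hundreds of dimensions):

    ```python
    import math

    def cosine_similarity(a, b):
        """Cosine similarity between two (hypothetical) speaker-embedding vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    def sounds_cloned(emb_a, emb_b, threshold=0.85):
        """Arbitrary cutoff: nothing principled separates 'cloned'
        from 'coincidentally similar' at any particular value."""
        return cosine_similarity(emb_a, emb_b) >= threshold

    # Made-up embeddings for illustration only.
    original = [0.9, 0.1, 0.4, 0.2]
    similar  = [0.88, 0.15, 0.38, 0.25]
    distinct = [0.1, 0.9, 0.2, 0.7]

    print(sounds_cloned(original, similar))   # True at this threshold
    print(sounds_cloned(original, distinct))  # False at this threshold
    ```

    The point is that “cloned” becomes whatever number you pick for `threshold` — move it and innocent lookalike voices flip from legal to illegal.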

    • Kissaki@beehaw.org · 2 days ago

      If you’re selling or publishing a voice in a way that impersonates another person without their consent, that may be identifiable and prosecutable: “Generate with X’s voice.” “Talk to X.” Etc. The exact wording isn’t necessary if the intent is evident from pictures or evasive descriptions that make an obvious implication.

      If prosecution can find evidence of cloning/training, that can also serve as a basis.

      In these ways it doesn’t have to be about the similarity or quality of the produced voice, or about other people who happen to sound alike, at all.

      • Riskable@programming.dev · 9 hours ago

        You make AI voice generation sound like it’s a one-step process, “clone voice X.” While you can do that, here’s where it’s heading in reality:

        “Generate a voice that sounds like a male version of Scarlett Johansson.”

        “That sounds good, but I want it to sound smoother.”

        “Ooh that’s close! Make it slightly higher pitch.”

        In a process like that, do you think Scarlett Johansson would have legal standing to sue?

        What if you started with cloning your own voice but after many tweaks the end result ends up sounding similar to Taylor Swift? Does she have standing?

        In court, you’d have expert witnesses saying they don’t sound the same. “They don’t even have the same inflection or accent!” You’d have voice analysis experts saying their voice patterns don’t match. Not even a little bit.

        But about half the jury would be like, “yeah, that does sound similar.” And you could convict a completely innocent person.

    • kayzeekayzee@lemmy.blahaj.zone · 4 days ago

      I think the main idea is to codify the act as illegal, so if it’s discovered that someone used voice cloning (for, like, a telephone scam or something), then they can be charged for that too. But yeah, it might be hard to prove without a lot of evidence.

    • BCsven@lemmy.ca · 4 days ago

      One country was already setting up copyright protection for your voice so AI services can be served takedown notices. Voices are quite unique; it’s how my bank verifies who I am. If somebody clones my voice via AI, it could fool that login system.
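
      That fooling risk is structural, not a bug: a voice login is typically just a similarity check between an enrolled voiceprint and the caller’s audio, so anything that lands close enough passes. A minimal sketch, with made-up embedding vectors and a made-up 0.8 acceptance threshold (real systems use learned speaker embeddings, but the decision step looks like this):

      ```python
      import math

      def cosine_similarity(a, b):
          """Cosine similarity between two (hypothetical) voiceprint vectors."""
          dot = sum(x * y for x, y in zip(a, b))
          return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

      def verify_caller(enrolled, sample, threshold=0.8):
          """The check can't tell a live caller from a good clone:
          both are just audio that maps to an embedding near the
          enrolled voiceprint."""
          return cosine_similarity(enrolled, sample) >= threshold

      enrolled = [0.7, 0.2, 0.5, 0.1]   # stored voiceprint (made up)
      live     = [0.68, 0.22, 0.52, 0.12]  # genuine caller
      clone    = [0.69, 0.21, 0.51, 0.11]  # AI clone of the same voice
      impostor = [0.1, 0.8, 0.1, 0.6]      # unrelated voice

      print(verify_caller(enrolled, live))      # accepted
      print(verify_caller(enrolled, clone))     # also accepted
      print(verify_caller(enrolled, impostor))  # rejected
      ```

      The system only ever sees the score, so a clone that scores like the real voice is indistinguishable from the real voice.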

      • krooklochurm@lemmy.ca · 4 days ago

        Your bank is run by fucking morons if they’ve allowed voice verification at any point after ~2 years ago.

        It’s a kind of profound, logarithmic stupidity that increases exponentially every day as voice cloning technology gets better and better.

        They are fucking stupid and don’t give one minuscule fuck about the security of your account.