this post was submitted on 20 May 2024
304 points (97.2% liked)

[–] AbouBenAdhem@lemmy.world 21 points 1 year ago* (last edited 1 year ago) (5 children)

The ChatGPT case aside, what are the copyright laws on impersonating the voice of an actor portraying a particular film character? If someone imitates the voice of Johnny Depp playing Jack Sparrow, or Andy Serkis playing Gollum, but makes no reference to the character apart from the voice performance, does that infringe on the copyright to the character?

[–] kbin_space_program@kbin.run 12 points 1 year ago (1 children)

I believe SAG (the Screen Actors Guild) just had a big strike about AI likenesses being used without proper compensation.

[–] jivandabeast@lemmy.browntown.dev 8 points 1 year ago (1 children)

They did, but I'm not sure that applies in this case unless OpenAI for some reason signed a deal with SAG. Otherwise, they aren't beholden to any protections not afforded by the law.

At least, AFAIK. Someone with more legal knowledge should probably chime in.

[–] kbin_space_program@kbin.run 1 points 1 year ago

OpenAI seems to have dealt with its massive legal issues so far by signing agreements after getting caught red-handed.

[–] Talaraine@kbin.social 9 points 1 year ago (1 children)

There are no applicable laws that I know about, and as a voice actor I am similarly concerned. There is a lot of focus on this in the industry atm, but we all know how glacially slow government moves. SAG-AFTRA and NAVA have this as a focus currently, and I'm watching with interest.

[–] towerful@programming.dev 5 points 1 year ago* (last edited 1 year ago) (1 children)

Apparently Amelia Tyler - the narrator for BG3 - checked in on some random Twitch stream, and they had an AI voice trained on her narration, controlled by Twitch chat - and it was saying some fucking horrendous stuff.

Scary as fuck.

Remember to talk to everyone you know about voice scams. Scammers are absolutely leveraging this tech, piling it on top of the usual "I've flushed my phone down the toilet, I'm texting from a mate's phone, and I need money to buy a new one for my job interview tomorrow" kind of scams.
Agree on a password or something, so that if "you" ever call (edit: or text) and put them under pressure, they ask for the password. Scammers will instantly divert or bail.

[–] assassin_aragorn@lemmy.world 2 points 1 year ago

Aw man, Amelia Tyler is so amazing too. Her narration is incredible, and she's got a lot of funny TikTok videos.

[–] SlopppyEngineer@lemmy.world 4 points 1 year ago (1 children)

Apparently the answer is no. Someone is even making a KITT app using an LLM, but with an AI voice based on a voice imitator, since they couldn't get the rights from William Daniels - and that's legal.


[–] devfuuu@lemmy.world 1 points 1 year ago

Let's just make the worst possible thing and then wait to see what happens. It's the only way to wake up from the AI nightmare. If we survive long enough.

[–] erwan@lemmy.ml 0 points 1 year ago

IANAL but probably not