this post was submitted on 26 Jan 2024
290 points (87.4% liked)


Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

[–] GilgameshCatBeard@lemmy.ca -4 points 10 months ago* (last edited 10 months ago) (2 children)

So she’s less a victim because she’s wealthy? My god you people can justify anything, can’t you?

[–] Tangent5280@lemmy.world 8 points 10 months ago (1 children)

That is exactly it. She will suffer less compared to someone else this might have happened to, and if you define victimhood on a spectrum, she's less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

[–] MJKee9@lemmy.world 7 points 10 months ago (1 children)

You just keep shifting your argument to create some sort of sympathy, I guess. No one says a rich person isn't a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different than being a victim in a working-class context. If you disagree with that, then you're either being intellectually dishonest or living in a dream world.

Even the law agrees. It's a lot harder for a celebrity to win a defamation lawsuit than it is for a normal person. You typically have to show actual malice. Frankly, that's the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

[–] GilgameshCatBeard@lemmy.ca -2 points 10 months ago (1 children)

The wealth of the victim doesn’t change the crime.

[–] MJKee9@lemmy.world 0 points 9 months ago (1 children)
[–] GilgameshCatBeard@lemmy.ca 1 points 9 months ago (1 children)

So, creating nude AI deepfakes isn’t a crime? Then there’s no victims at all. What’s everyone talking about then?

[–] MJKee9@lemmy.world 1 points 9 months ago (1 children)

It can't be a crime unless there is a criminal statute that applies. See if you can find one that applies.

[–] GilgameshCatBeard@lemmy.ca 0 points 9 months ago (1 children)

So there’s no victims. Rich or poor. Why is this a problem?

[–] MJKee9@lemmy.world 0 points 9 months ago

Your response doesn't logically respond to my comment. It attempts to reframe the argument by setting up a strawman, and it shows that you fail to understand (or are choosing to ignore, because it doesn't support your reframed argument) the difference between civil and criminal law in the United States.