Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads previously, many ads that explicitly invited users to create nudes, and some of the ad buyers, remained up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

sugar_in_your_tea@sh.itjust.works · 1 point · 7 months ago

Right, they're orthogonal concepts. If something is protected by fair use laws, then privacy laws don't apply. If privacy laws apply, then it's not fair use.

The proper discussion in this area is around libel law. That's what tabloids are usually sued under, not fair use or "privacy violations." For a libel suit to succeed, the plaintiff must prove that the defendant made false statements that caused actual harm to the plaintiff's reputation. There are a bunch of lawsuits going on right now examining deepfakes and similar allegedly libelous uses of an individual's likeness. For a specific example, look at the Taylor Swift lawsuit around deepfake porn.

But the crux of the matter is that you don't have a right to your likeness, generally speaking, and fair use law protects even creepy uses of legally acquired representations of your likeness.

Couldbealeotard@lemmy.world · 1 point · 7 months ago

The deepfake stuff is interesting from a legal standpoint, and that is essentially the topic of this thread. When deepfakes first became a thing, many companies (like Reddit) chose to ban the content; they did so voluntarily, perhaps from a mix of moral and liability concerns.

What I was referring to regarding tabloids is that there have been many cases of paparazzi being fined for breaches of privacy, not libel. From my studies I recall a good example being "if you need a ladder to see over someone's fence, you are invading the expectation of privacy." This was before drones were a thing, so I don't know how it applies these days.

I agree that you should have control of your likeness, but I don't think it is as protected as your comment suggests.

Privacy violations fall under illegally acquiring content, which is not subject to fair use.

But if I obtain it legally, I either own it (e.g. I took the picture) or am subject to fair use restrictions (e.g. I got it from your website or something). In that case, your only real way to stop me is with libel law (and similar). If I manipulate the video in such a way as to make false, damaging statements about you, you can sue for libel.

That's my point. So if I only use legally obtained pictures, I can do whatever I want with those for my own personal use. If I share them, I may be subject to libel laws, fair use laws (e.g. if I took it from your website), etc.

So I don't see a realistic defense on privacy grounds for deepfakes; it's going to be fraud and libel laws, assuming the source pictures were obtained legally. The pictures being creepy isn't the issue, at least as far as the courts are concerned.