this post was submitted on 23 Apr 2024
907 points (97.1% liked)


Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads. Even so, many ads that explicitly invited users to create nudes, and some of the ad buyers, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

[–] jupiter_jazz@lemmy.dbzer0.com 6 points 7 months ago (2 children)

I don't think you're accounting for ease of use. It used to take time and skill to photoshop someone. This is just an app. It takes more effort to prove the truth than it does to create a lie. Not to mention, the other article explains that people are using this to bait children. :/

[–] Drewelite@lemmynsfw.com 3 points 7 months ago

It takes more effort to prove the truth than it does to create a lie.

And this universal truth, which has existed since the dawn of time, will now have to be reckoned with. The ease of use is exactly what undoes its power over us. When anyone can do it, it all just becomes background noise.

[–] CaptainEffort@sh.itjust.works -5 points 7 months ago (1 children)

You don’t think it’s easy for someone to simply imagine another person naked? That’s no different than this - it’s all a fantasy. None of it’s real.

[–] inb4_FoundTheVegan@lemmy.world 6 points 7 months ago (1 children)

There is a huge ass difference between your imagination and REAL MEDIA using my face. This is an absolute bullshit justification.

[–] CaptainEffort@sh.itjust.works -5 points 7 months ago (1 children)

How? In my mind, for this scenario, I can picture your face literally perfectly. It is, for all intents and purposes, your real face. In this case what I imagine in my head is identical to what some ai model would churn out.

[–] inb4_FoundTheVegan@lemmy.world 3 points 7 months ago* (last edited 7 months ago) (1 children)

Is a picture no different than something imagined? Is CGI not a picture because it was made with a computer?

Of course not, because REAL MEDIA is being produced. Your thoughts never leave your head, but these images are being used to harass and blackmail women. Your argument is completely asinine.

[–] CaptainEffort@sh.itjust.works -2 points 7 months ago

Hold up, that’s a separate issue. Revenge porn is flat out illegal, so using nudes of people, real or not, as blackmail, isn’t up for debate here. Whether or not it’s obvious, and I’m sorry if it’s not, I’m 100% with you that that’s completely disgusting and shouldn’t be tolerated.

Back to the first part though: is the problem literally just that it exists outside of the person’s head? If they don’t share it with anyone, how is it really different from them imagining it? In both scenarios they’re effectively getting the same result, and nobody else is affected.