this post was submitted on 23 Apr 2024
907 points (97.1% liked)

Technology


Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives the ads that run on its platforms along with who paid for them and where and when they were posted, shows that the company has taken down several of these ads before. Even so, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

[–] dream_weasel@sh.itjust.works 47 points 7 months ago (6 children)

Is there such a thing as a consensual undressing app? Seems redundant

[–] chatokun@lemmy.dbzer0.com 27 points 7 months ago (1 children)

There isn't, but emphasis on why it's an issue is always a good thing to do. Same reason people get upset when some articles say "had sex with a minor" or "involved in a relationship with a minor" when the accurate crime is "raped a minor."

[–] HelloHotel@lemmy.world 3 points 7 months ago* (last edited 7 months ago)

If you (the news) are going to use flowery language, at least imply it's a crime!

  • "Sexually coerced a minor"
  • or "groomed a minor for sex"
  • or "had a relationship where the power dynamics were so one-sided that the child could not give consent"
  • or maybe just say "raped a minor"

It's not that hard!

[–] ehxor@lemmy.ca 15 points 7 months ago

BRB. Got an idea for a start-up

[–] lengau@midwest.social 14 points 7 months ago (2 children)

Theoretically any of these apps could be used with consent.

In practice I can't imagine that would be a particularly large part of their market...

[–] Schadrach@lemmy.sdf.org 7 points 7 months ago (1 children)

Now I have this image of an OnlyFans girl who just fake nudes all her pictures. Would make doing public nudity style pictures a lot easier.

[–] lengau@midwest.social 4 points 7 months ago

Y'know, as long as she's open about it that would be a great use of the tech.

[–] dream_weasel@sh.itjust.works 6 points 7 months ago

"hey send me some nudes!"

"Ugh... I'm already on the couch in my pajamas. Here's a pic of me at the coffee shop today, just use the app, it's close enough."

[–] reverendsteveii@lemm.ee 9 points 7 months ago (1 children)

I guess it really comes down to whether you use it with consent. I used one on my own picture just to see how it worked. It gave me huge tits, but other than that it was scarily accurate.

[–] dream_weasel@sh.itjust.works 3 points 7 months ago (1 children)

Neat. Ladies only or does it do dudes too?

[–] reverendsteveii@lemm.ee 7 points 7 months ago (1 children)

I'm a dude, it's just a clever name. It'll do dudes, it's just gonna give you huge tits. What you're into is, of course, your business.

[–] evranch@lemmy.ca 5 points 7 months ago (1 children)

My interest in this topic just went from 0 to 10 upon realizing the humour potential of passing it around to see all my bros with huge tits, but only if it worked like a Snapchat filter.

Also, I have a friend who already has huge tits, and I've seen them IRL, so I'm curious what it would do.

[–] Schadrach@lemmy.sdf.org 2 points 7 months ago (1 children)

Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do

Being serious for a moment, it depends on the source image. If it can tell where the contours of the tits are in the source image, they'll be closer to the right size and shape. Otherwise it's going to find something it thinks are the contours, map out tits that match those, then generate a generic torso that matches the shape of where it thinks the torso is and the skin tone of the face. It's not magic; it's just automating what a horndog with Photoshop, a photo of you, and a big enough porn collection to find someone with a similar body type could do back in the '90s.

[–] evranch@lemmy.ca 1 points 7 months ago

I'm familiar with how ML works so it's not magic to me either, but the actual result is what would intrigue me. Since she has big naturals obviously they hang pretty heavy when they're set free.

But if I fed it a picture of her wearing a tight push-up bra, which could easily give off the impression that she had implants, would I get a pair of bolt-ons back? Or would it be able to pick up on the signs of real tits and add some sag?

Seeing how it'll put tits on men, it's obviously not an exact science lol

[–] UnderpantsWeevil@lemmy.world 9 points 7 months ago* (last edited 7 months ago) (2 children)

I assume that's what you'd call OnlyFans.

That said, the irony of these apps is that it's not the nudity that's the problem, strictly speaking. It's taking someone's likeness and plastering it on a digital mannequin. Social media has become the online equivalent of going through a girl's trash to find an old comb, pulling the hair off, and putting it on a Barbie doll that you then use to jerk/jill off.

What was the domain of 1980s perverts from comedies about awkward high schoolers has now become a commodity we're supposed to treat as normal.

[–] CaptainEffort@sh.itjust.works 5 points 7 months ago (1 children)

Idk how many people are viewing this as normal, I think most of us recognize all of this as being incredibly weird and creepy.

[–] UnderpantsWeevil@lemmy.world 1 points 7 months ago (1 children)

Idk how many people are viewing this as normal

Maybe not "Lemmy" us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.

[–] CaptainEffort@sh.itjust.works 2 points 7 months ago

Yeah damn, that’s true.

An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize are bad because they were taught correctly.

Unfortunately, due to how puritan our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.

[–] creditCrazy@lemmy.world 4 points 7 months ago

You just took my feeling on this issue and put it to words

[–] melpomenesclevage@lemm.ee 3 points 7 months ago

that would just be instructions, wouldn't it?