this post was submitted on 29 Mar 2024
341 points (93.4% liked)

Technology

59605 readers
3434 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
MODERATORS
 

A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

top 50 comments
[–] RobotToaster@mander.xyz 122 points 8 months ago (11 children)

This is only going to get easier. The djinn is out of the bottle.

[–] goldteeth@lemmy.dbzer0.com 78 points 8 months ago* (last edited 8 months ago) (1 children)

"Djinn", specifically, being the correct word choice. We're way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We're back into fuckin'... shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

[–] BrokenGlepnir@lemmy.world 14 points 8 months ago (1 children)

Have you ever seen the wishmaster movies?

[–] conciselyverbose@sh.itjust.works 42 points 8 months ago (4 children)

Doesn't mean distribution should be legal.

People are going to do what they're going to do, and the existence of this isn't an argument to put spyware on everyone's computer to catch it or whatever crazy extreme you can take it to.

But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

[–] treadful@lemmy.zip 16 points 8 months ago

Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

[–] guyrocket@kbin.social 86 points 8 months ago (20 children)

This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me. The result is the same: fake porn/nudes.

And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

[–] kent_eh@lemmy.ca 104 points 8 months ago (9 children)

People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me

Because now it's faster, can be generated in bulk and requires no skill from the person doing it.

[–] ArmokGoB@lemmy.dbzer0.com 26 points 8 months ago (3 children)

I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

[–] 0x0@programming.dev 13 points 8 months ago

Those were the days...

[–] echo64@lemmy.world 58 points 8 months ago (13 children)

I hate this: "Just accept it women of the world, accept the abuse because it's the new normal" techbro logic so much. It's absolutely hateful towards women.

We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

[–] AquaTofana@lemmy.world 50 points 8 months ago

I don't know why you're being down voted. Sure, it's unfortunately been happening for a while, but we're just supposed to keep quiet about it and let it go?

I'm sorry, putting my face on a naked body that's not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn and it's actually believable because it's AI generated. That is SO much worse/psychologically damaging if they find out about it.

[–] brbposting@sh.itjust.works 15 points 8 months ago

It’s unacceptable.

We have legal and justice systems to deal with this.

For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

Telegram got right on it (not). Fuckers.

[–] EatATaco@lemm.ee 21 points 7 months ago (1 children)

I suck at Photoshop, and I've tried many times to get good at it over the years. I was able to train a local stable diffusion model on my and my family's faces and create numerous images of us in all kinds of situations in 2 nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.

I agree there is nothing to be done, but it's painfully obvious to me that it's the scale and ease of it that makes it much more concerning.

[–] Assman@sh.itjust.works 13 points 8 months ago

The same reason AR15 rifles are different than muskets

[–] GrymEdm@lemmy.world 71 points 7 months ago* (last edited 7 months ago) (4 children)

To people who aren't sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There's also extreme risk of depression, anger, anxiety, etc. The analogy given is that it's like watching video the next day of yourself undergoing sex without consent, as if you'd been drugged.

I'll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately and avoid it as much as I possibly can. I believe porn can be done correctly with participant protection and respect. Regarding deepfakes/revenge porn though that statistic about suicidal ideation puts it outside of healthy or ethical. Obviously I can't make that decision for others or purge the internet, but the fact that there's such regular and extreme harm for the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society but because I want my entertainment to be at minimum consensual and hopefully fun and exciting, not killing people or ruining their happiness.

I get that people say this is the new normal, but it's already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.

[–] lud@lemm.ee 15 points 7 months ago (2 children)

once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves.

Not saying that they are justified or anything but wouldn't people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.

[–] eatthecake@lemmy.world 21 points 7 months ago (1 children)

The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you'll get a whole lot of complex PTSD instead.

[–] stephen01king@lemmy.zip 21 points 7 months ago (2 children)

People used to think their lives were over if they were caught alone with someone of the opposite sex they weren't married to. That is no longer the case in western countries due to normalisation.

The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

[–] Drewelite@lemmynsfw.com 18 points 7 months ago (2 children)

I think this is realistically the only way forward: to delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it'll still have an effect. Like social media: though it's normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

[–] JackGreenEarth@lemm.ee 56 points 8 months ago (15 children)

That's a ripoff. It costs them at most $0.10 to do simple stable diffusion img2img. And most people could do it themselves; they're purposely exploiting people who aren't tech savvy.

[–] Khrux@ttrpg.network 56 points 8 months ago* (last edited 8 months ago) (1 children)

I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.

I wish everyone involved in this use of AI a very awful day.

[–] sentient_loom@sh.itjust.works 14 points 8 months ago (1 children)

Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.

[–] istanbullu@lemmy.ml 45 points 8 months ago

It's an "I don't know tech" tax

[–] echo64@lemmy.world 44 points 8 months ago (9 children)

The people being exploited are the ones who are the victims of this, not people who paid for it.

[–] oce@jlai.lu 29 points 8 months ago

That's like 80% of the IT industry.

[–] sugar_in_your_tea@sh.itjust.works 25 points 8 months ago

IDK, $10 seems pretty reasonable to run a script for someone who doesn't want to. A lot of people have that type of arrangement for a job...

That said, I would absolutely never do this for someone, I'm not making nudes of a real person.

[–] IsThisAnAI@lemmy.world 14 points 8 months ago* (last edited 8 months ago)

Scam is another thing. Fuck these people selling.

But fuck dude, they aren't taking advantage of anyone buying the service. That's not how the fucking world works. It turns out that if you have money you can pay people to do shit like clean your house or do an oil change.

NOBODY on that side of the equation is being exploited 🤣

[–] flower3@feddit.de 53 points 8 months ago (10 children)

I doubt tbh that this is the most severe harm of generative AI tools lol

[–] Sanctus@lemmy.world 23 points 8 months ago (2 children)

Pretty sure we will see fake political candidates that actually garner votes soon here.

[–] General_Effort@lemmy.world 40 points 8 months ago* (last edited 8 months ago) (4 children)

Porn of Normal People

Why did they feel the need to add that "normal" to the headline?

[–] sentient_loom@sh.itjust.works 62 points 8 months ago

To differentiate from celebrities.

[–] TheGrandNagus@lemmy.world 15 points 8 months ago

Because it's different to somebody going online and finding a stock picture of Taylor Swift

[–] curiousaur@reddthat.com 22 points 8 months ago

You can get 300 tokens in pornx dot ai for $9.99. This guy is ripping people off.

[–] anticurrent@sh.itjust.works 18 points 7 months ago

We are acting as if, throughout history, we managed to orient technology so as to only keep the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.

[–] SendMePhotos@lemmy.world 18 points 8 months ago (14 children)

I'd like to share my initial opinion here. "Non-consensual AI-generated nudes" is technically a freedom, no? Like, we can bastardize our presidents, paste people's photos on devils or other characters, so why is AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

[–] Maggoty@lemmy.world 18 points 7 months ago (2 children)

It's a far cry from making weird memes to making actual porn. Especially when it's not easily seen as fake.

[–] LadyAutumn@lemmy.blahaj.zone 17 points 7 months ago* (last edited 7 months ago) (7 children)

They're making pornography of women who are not consenting to it, when that is an extremely invasive thing to do that has massive social consequences for women and girls. This could (and almost certainly will) be used on kids too, right? This can literally be a tool for the production of child pornography.

Even with regards to adults, do you think this will be used exclusively on public figures? Do you think people aren't taking pictures of their classmates, of their co-workers, of women and girls they personally know and having this done to pictures of them? It's fucking disgusting, and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It's terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.

[–] antlion@lemmy.dbzer0.com 15 points 7 months ago

Seems to fall under any other form of legal public humiliation to me, UNLESS it is purported to be true or genuine. I think if there’s a clear AI watermark or artists signature that’s free speech. If not, it falls under Libel - false and defamatory statements or facts, published as truth. Any harmful deep fake released as truth should be prosecuted as Libel or Slander, whether it’s sexual or not.
