this post was submitted on 04 Apr 2024
181 points (91.0% liked)

Technology

[–] Just_Pizza_Crust@lemmy.world 106 points 7 months ago* (last edited 7 months ago) (2 children)

Softcore gilf porn created by an AI to sell state lottery tickets wasn't on my cards for 2024, but here we are.

[–] RealFknNito@lemmy.world 11 points 7 months ago

If it was on anyone's cards, they should win a cash prize. Like a lottery of some sort.

[–] BuryMyHorse@lemmy.world 6 points 7 months ago

Disgusting! Where?

[–] themeatbridge@lemmy.world 103 points 7 months ago (2 children)

Megan feared that the image could potentially get out

So here it is in a news article...

[–] conciselyverbose@sh.itjust.works 21 points 7 months ago (1 children)

I mean, they covered her face which is the only part that's actually her. I'm assuming she provided it for demonstration purposes, and her underlying point was that it absolutely can do harm.

[–] themeatbridge@lemmy.world 9 points 7 months ago

You're right. I just thought it was a funny juxtaposition.

[–] aseriesoftubes@lemmy.world 17 points 7 months ago

To be fair, KTTH is barely news.

[–] tal@lemmy.today 49 points 7 months ago (1 children)

“Our tax dollars are paying for that! I was completely shocked. It’s disturbing to say the least,” Megan explained to the Jason Rantz Show on KTTH.

I mean, I'd assume that the state lottery is revenue-positive. It's more like lottery players are paying for it.

Well, the lottery is known as the poor tax, so I guess they're not completely wrong.

[–] Nobody@lemmy.world 40 points 7 months ago (1 children)

AI hallucinates a request for a topless photo. Nothing fundamentally wrong with this technology at all. Keep pouring billions into it.

[–] MadBob@feddit.nl 8 points 7 months ago

It must be intelligent after all!

[–] Norgur@kbin.social 34 points 7 months ago (1 children)

Can we talk less about AI inevitably doing what AI always does and fucking up, and instead talk about the website that uses AI resources to dangle an even juicier carrot in front of desperate people throwing away their money on the lottery?

[–] conciselyverbose@sh.itjust.works 7 points 7 months ago

“I also think whoever was responsible for it should be fired,” she added

Seriously, fire everyone in the chain who had direct knowledge it was going to happen and can't provide documented proof that they said how absolutely insanely fucking awful the idea was.

Image generation is cool technology, and while it (and LLMs even more) is limited and super oversold in terms of what it can do, I'm not anti-AI. But this is fucked up even without the sex part.

[–] Greg@lemmy.ca 32 points 7 months ago (4 children)

I can't verify this story with any reputable sources. Is this real or just boomerbait?

[–] stoly@lemmy.world 29 points 7 months ago

This site is a complete right-wing boomerbait rag, never pay it any attention.

People think of WA and think of Seattle, then extrapolate. Seattle is really no different from places like Omaha: a more liberal, educated populace surrounded by a state full of angry, ignorant people. This "newspaper" is for the angry types.

[–] Fubarberry@sopuli.xyz 18 points 7 months ago

The "test drive a win" where it would generate AI images of people as lottery winners was a real thing, and they have taken it down.

The only larger news outlet I see covering it is Fox News. They cite mynorthwest.com as their main source, but they do say that they received a statement from the lottery confirming that it was shut down for that reason:

Washington's Lottery confirmed to Fox News Digital that it shut down the site after being made aware of the purported image.

Obviously a lot of people don't like Fox News, but I don't see a political angle that would make that particular statement untrustworthy.

[–] BearOfaTime@lemm.ee 5 points 7 months ago

My Gen Z and millennial friends are shockingly gullible and ignorant.

[–] Pissnpink@feddit.uk 4 points 7 months ago* (last edited 7 months ago)

Idk, Mynorthwest is a real source, but it's mostly dull local news fare with some good event coverage. KIRO is that branch and it's okay, center-right; it certainly isn't the Sinclair Broadcasting station, that's KOMO 4. 710 Sports is more center-left, but it's sports. 770 KTTH, where this article seems to be coming from, is obviously garbage reactionary conservative radio, but that's what makes money in radio.

[–] webghost0101@sopuli.xyz 24 points 7 months ago

Lol, they didn't even try to test the system if this is the result. AI isn't intelligent, but humans still take the cake for stupidity by having brains and not using them.

Many public Stable Diffusion models have a bias, with porn often being overrepresented, but all it takes is a "nude, naked, erotic, sex, nsfw" in the negative prompt, and unless the model is built to only generate porn, this will never happen. Or better yet, use some of that corporate money to build their own SD model that is verified not to include any nudity in its training data.
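
For reference, that negative-prompt trick is a one-liner in the standard tooling. A minimal sketch using Hugging Face's diffusers library (the model ID and prompts are placeholders, not whatever the lottery's vendor actually ran):

```python
# Minimal sketch: steering a Stable Diffusion model away from NSFW output
# with a negative prompt. Model ID and prompts are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    prompt="person on a swim-with-the-sharks dream vacation, photo",
    # Concepts the sampler is pushed away from at every denoising step:
    negative_prompt="nude, naked, erotic, sex, nsfw",
).images[0]
image.save("vacation.png")
```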

[–] TORFdot0@lemmy.world 19 points 7 months ago (2 children)

Is it sad that the first thing I noticed is that she has six fingers on her right hand?

[–] IggyTheSmidge@kbin.social 12 points 7 months ago (1 children)

It depends - are you Inigo Montoya?

But only four knuckles, so that's nice.

[–] Neato@ttrpg.network 12 points 7 months ago (4 children)

When Megan, a 50-year-old mother based in Tumwater, visited the new AI-powered mobile site from Washington’s Lottery on March 30, she thought she was in for some frivolous fun. Test Drive A Win allows users to digitally throw a dart at a dartboard featuring dream vacations you can pay for with the money you win in the lottery. Depending on where the dart lands, you can either upload a headshot or take one on your phone to upload, and the AI superimposes your image into the vacation spot.

Megan landed on a “swim with the sharks” dream vacation option. She was shocked at one of the AI photos Washington’s Lottery spit out. It was softcore porn.

So I can totally see this happening. The government contracts with a genAI company, and the company drops the ball and erroneously leaves the capability for pornography in, or doesn't select correctly curated training data (I'm unsure how exactly these work). It may be quite difficult for the Washington government to spot this error if the occurrence rate is very low or if none of their test prompts produced pornography. Perhaps it was only keyed to make porn (when not specifically prompted to) on certain subsets of matched facial features? I'm not suggesting this, but perhaps the affected user looks a lot like a popular porn star? It could also totally be the government's fault for quickly selecting an AI package and not looking into what it could do; but with government bureaucracy there could've been quite a few people with oversight.

My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it? Why wouldn't you just take the money and buy your own? Maaaaybe if it heavily discounts the vacations or something. Seems like an unnecessary step in the lottery process.

[–] Kbin_space_program@kbin.social 18 points 7 months ago (2 children)

It's a core problem with image generation models. For some fucking reason they seem to have been fed content from sites that had a lot of porn. Guessing Imgur and DeviantArt.

Literally the first time I tried to use MS's image generator, I was out with some friends trying a new fried chicken place and we were discussing fake Tinder profiles.

So I thought to try it and make a fake image of "woman sensuously eating fried chicken".
Content warning, blah blah blah.

Try "Man sensuously eating fried chicken". Works fine.

We were all mystified by that. I went back a few days later to play around. Tried seeing what it didn't like. Tried generating "woman relaxing at park".
Again, content warning. Switch to a man, no problem. Eventually got it to generate with "woman enjoying sunset in a park." Got a very dark image, because it generated a completely nude woman T-posing in the dark.

So, with that in hand I went back and started specifying "fully clothed" for a prompt involving the word "woman". All of a sudden all of the prompts worked. They fed the bot so much porn that it defaulted women to being nude.

[–] Neato@ttrpg.network 4 points 7 months ago (1 children)

Lol at t-posing pornography.

I find the same problem when searching for D&D portraits. Men? Easy and varied. Women? Hypersexualized and mostly naked. I usually have to specify old women to prevent that.

To be fair, D&D was historically a game for neckbeards (at least that was the stigma/stereotype), so hypersexualized women fit the bill.

[–] Taako_Tuesday@lemmy.ca 2 points 7 months ago (1 children)

Doesn't it also have to do with the previous requests the LLM has received? In order for this thing to "learn" it has to know what people are looking for, so I've always imagined the porn problem as being a result of the fact that people are using these things to generate porn at a much greater volume than anything else, especially porn of women, so it defaults to nude because that's what most requests were looking for.

[–] TheRealKuni@lemmy.world 3 points 7 months ago

Nah, most of these generative models don’t account for previous requests. There would be some problems if they did. I read somewhere that including generative AI data in generative AI training has a feedback effect that can ruin models.

It’s just running a bunch of complicated math against previously trained weights.
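
That statelessness is easy to demonstrate: with the weights and seed fixed, a prompt yields the same image regardless of what was requested before it. A sketch, under the same assumed diffusers setup as the earlier snippet:

```python
# Sketch: generation is a pure function of (weights, prompt, seed).
# Nothing from earlier requests is remembered between calls.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def seeded():
    return torch.Generator("cuda").manual_seed(42)

first = pipe("woman relaxing at a park", generator=seeded()).images[0]
pipe("man eating fried chicken", generator=seeded())  # unrelated request in between
second = pipe("woman relaxing at a park", generator=seeded()).images[0]
# first and second come out identical (up to GPU nondeterminism):
# the unrelated request in between changed nothing.
```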

[–] orclev@lemmy.world 16 points 7 months ago

My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it?

No, it's advertising. They're trying to convince people to play the lottery so they have you roll a (virtual) wheel and upload a head shot then it generates a theoretical video of what it might look like if you went on that vacation (using your theoretical future winnings). It's absolutely idiotic, but their target demographic isn't exactly the sharpest tools in the shed to begin with.

[–] chrash0@lemmy.world 6 points 7 months ago (1 children)

they likely aren’t creating the model themselves. the faces are probably all the same AI girl you see everywhere. you gotta be careful with open weight models because the open source image gen community has a… proclivity for porn. there’s not a “function” per se for porn. they may be doing some preprompting, or maybe “swim with the sharks” is just too vague of a prompt and the model was just tuned on this kind of stuff. you can add an evaluation network to the end to basically ask “is this porn/violent/disturbing”, but that needs to be tuned as well. most likely it’s even dumber than that, where the contractor just subcontracted the whole AI piece and packaged it for this use case
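
For what it's worth, the stock Stable Diffusion pipeline ships with exactly that kind of evaluation network bolted onto the end, and it reports a per-image flag. A sketch of honoring that flag rather than trusting the generator (fallback_stock_photo is a hypothetical helper):

```python
# Sketch: checking the "is this porn?" flag from the safety checker that
# ships with StableDiffusionPipeline, instead of trusting the generator.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")  # safety checker left enabled (the default)

out = pipe(prompt="swim with the sharks dream vacation")
if out.nsfw_content_detected and out.nsfw_content_detected[0]:
    image = fallback_stock_photo()  # hypothetical helper: serve a safe stock image
else:
    image = out.images[0]
```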

[–] Sabata11792@kbin.social 6 points 7 months ago

The fun part is that image detection models need to be trained on a lot of porn to be able to identify and filter porn.

[–] bane_killgrind@kbin.social 3 points 7 months ago

Prior to launch, we agreed to a comprehensive set of rules to govern image creation, including that people in images be fully clothed.

Apparently they thought about it, but neglected to consider that some "vacation" images in the training data might not be tagged with the clothing worn, or that the model might sometimes consider only pants to be "fully clothed," because some of the training data might show topless women in public without being tagged as such. Or topless men.

Why wouldn't they just generate a couple hundred images and manually review them? It's pretty easy to automate putting someone's face onto an existing image, so that should be totally fine.

They could cycle the images every so often with the insane amounts of money the lottery generates.

That's hilarious.
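
That pre-generate-and-review workflow would be a few lines of scripting. A sketch, with made-up prompts, counts, and paths, under the same assumed diffusers setup as the earlier snippets:

```python
# Sketch: pre-generate a fixed batch for human review instead of generating
# live per user. Prompts, counts, and paths are illustrative.
from pathlib import Path

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

PROMPTS = ["swim with the sharks", "beach resort at sunset", "ski lodge getaway"]
OUT = Path("review_queue")
OUT.mkdir(exist_ok=True)

for prompt in PROMPTS:
    for i in range(100):
        image = pipe(f"{prompt}, dream vacation, fully clothed").images[0]
        image.save(OUT / f"{prompt.replace(' ', '_')}_{i:03d}.png")

# A human approves files out of review_queue/; only approved images ever reach
# the site, with the user's face composited on afterwards.
```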

[–] pete_the_cat@lemmy.world 6 points 7 months ago

This is what happens when companies use AI for dumb shit.

[–] TypicalHog@lemm.ee 5 points 7 months ago

Imagine playing lottery...

[–] downpunxx@fedia.io 5 points 7 months ago
[–] boatsnhos931@lemmy.world -3 points 7 months ago

Ahh I love technology.. however I need the uncensored image to investigate further...in private..for 30-45 seconds..