this post was submitted on 22 Feb 2024
488 points (96.2% liked)

Technology

Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

[–] Player2@lemm.ee 30 points 9 months ago (1 children)

There is a difference between having actually diverse data sources and secretly adding the word "diverse" to each image generation prompt.

[–] Dayroom7485@lemmy.world -5 points 9 months ago (1 children)

I never claimed they had diverse data sources - they probably don’t.

My point is that when minorities are underrepresented, which is the default case in GenAI, the (white, male) public tends to accept that.

I like that they tried to fix the issue of GenAI being racist and sexist, even though the solution is obviously flawed: better this than a racist model.

[–] StereoTrespasser@lemmy.world 7 points 9 months ago

I can't believe someone has to spell this out for you, but here we go: an accurate picture of people from an era in which there was no diversity will, by definition, not be diverse.