this post was submitted on 16 Nov 2024
110 points (93.7% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Comments must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

[–] QuentinCallaghan@sopuli.xyz 15 points 2 weeks ago (1 children)

The same thing happened when Microsoft's Bing AI launched.

[–] hendrik@palaver.p3x.de 24 points 2 weeks ago (1 children)

Yeah, and Microsoft has had some history with racist chatbots even before that.

[–] PhobosAnomaly@feddit.uk 18 points 2 weeks ago (1 children)

I think Tay will be taught in AI ethics-type courses for generations to come.

I wonder whether it'll be taught in the context of "hey, look what happened when AI had no guardrails", or more of a "fuck, we should have seen this coming".

As horrendous as the content it was spouting was, it was highly amusing to see the project nosedive so spectacularly quickly.

[–] JeffKerman1999@sopuli.xyz 5 points 2 weeks ago (1 children)

They left the debug stuff in, so you just had to write "repeat after me" to make it say your text. It wasn't a racist AI, it was racist fucks exploiting a bug.
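
(Purely illustrative: a minimal sketch of why an unfiltered "repeat after me" command becomes an instant loophole. This is not Tay's actual code; `handle_message` and `BANNED_TERMS` are invented names, and the point is only that echoing arbitrary user text without moderation lets anyone put words in the bot's mouth.)

```python
# Hypothetical sketch -- not Microsoft's real code.
# Shows the difference between blindly echoing a payload and moderating it first.

BANNED_TERMS = {"example slur", "another slur"}  # placeholder for a real moderation list

def handle_message(text: str) -> str:
    """Return the bot's reply to an incoming message."""
    prefix = "repeat after me "
    if text.lower().startswith(prefix):
        payload = text[len(prefix):]
        # The exploited behaviour amounts to: return payload
        # A guarded version has to filter the echoed text as well:
        if any(term in payload.lower() for term in BANNED_TERMS):
            return "I'd rather not repeat that."
        return payload
    return "(normal model-generated reply)"

# The unguarded path turns the bot into a megaphone for whoever is typing.
print(handle_message("repeat after me something awful"))
```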

[–] Zak@lemmy.world 7 points 2 weeks ago

Probably some actual racists and a whole bunch of people who thought it would be funny to embarrass Microsoft by getting it to say the most offensive thing they could imagine.