this post was submitted on 31 Dec 2023

Not The Onion


Michael Cohen, the former lawyer for Donald Trump, admitted to citing fake, AI-generated court cases in a legal document that wound up in front of a federal judge, as reported earlier by The New York Times. A filing unsealed on Friday says Cohen used Google’s Bard to perform research after mistaking it for “a super-charged search engine” rather than an AI chatbot.

I... don't even. I lack the words.

[–] rsuri@lemmy.world 18 points 10 months ago* (last edited 10 months ago)

The problem is that breathless AI news stories have led people to misunderstand LLMs. The capabilities get a lot of attention; the limitations, not so much.

And one important limitation of LLMs: they're really bad at being exactly right, while being really good at looking right. So if you ask one to do an arithmetic problem you can't do in your head, it'll give you an answer that looks right. But if you check it with a calculator, you find the only thing right about the answer is how it sounds.
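The arithmetic point can be made concrete in a few lines of Python. Note the "claimed" value here is invented for this illustration (it doesn't come from any real model); it just stands in for the kind of answer an LLM might produce:

```python
# A toy version of the "check it with a calculator" step. The claimed
# value below is made up for this example -- it stands in for a
# plausible-looking LLM answer (right magnitude, wrong digits).
claimed = 3_492_871_203            # hypothetical LLM output: looks right
actual = 58_231 * 59_983           # exact arithmetic, i.e. the calculator check
print(actual)                      # 3492870073
print(claimed == actual)           # False: only the shape of the answer was right
```

The gap between "looks right" and "is right" is exactly what trips people up: the wrong answer has the correct number of digits and a believable leading sequence.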

So if you use it to find cases, it's gonna be really good at finding cases that look exactly like what you need. The only problem is, they're not exactly what you need, because they're not real cases.