this post was submitted on 25 May 2024
819 points (97.7% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
Can we swap out the word "hallucinations" for the word "bullshit"?
I think all AI/LLM stuff should be prefaced with "someone down the pub said..."
So: "someone down the pub said you can eat rocks," or "someone down the pub said you should put glue on your pizza."
Hallucinations are cool; shit like this is worthless.
Google search isn't a hallucination now, though.
It instead proves that LLMs just reproduce content from the training data they're supplied with. For example, the "glue on pizza" suggestion traces back to a comment by a Reddit user called FuckSmith from roughly 11 years ago.
What do you mean by that? This isn't some secret, it's literally how LLMs work. lol What people mean by "hallucinating" is when LLMs create "facts" that aren't facts at all, be it this genius glue-pizza recipe or any other wild combination of the model's source material. The cooking thing is actually a great analogy: all the information the model is fed is the ingredients, and it just spits out various recipes based on those ingredients, with no guarantee that the result is actually edible.
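To make the "ingredients" analogy concrete, here's a toy sketch in Python. Big caveat: a real LLM is a huge neural network, not a word-lookup table, and the corpus and variable names here are all made up. But even this tiny Markov chain shows the mechanism the comment above describes: it stitches words together based purely on what followed what in its "training data", with no step anywhere that checks whether the output is true or edible.

```python
import random
from collections import defaultdict

# Toy "training data": the ingredients. One entry is a joke post,
# but the model has no way of knowing that.
corpus = [
    "you should add cheese to your pizza",
    "you should add glue to your pizza",  # the joke ingredient
    "you should bake your pizza at high heat",
]

# Build a bigram table: for each word, record which words followed it.
follows = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

# Generate: repeatedly pick a statistically plausible next word.
# Nothing in this loop evaluates truth, safety, or edibility.
word = "you"
out = [word]
while word in follows:
    word = random.choice(follows[word])
    out.append(word)

print(" ".join(out))
# One possible output: "you should add glue to your pizza at high heat"
```

Every word it emits genuinely followed some word in its "training data", yet it can happily splice the joke into otherwise sensible advice. Scale the same idea up by a few billion parameters and you get the same failure mode.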
Yeah, but John Q. Public reads "AI" and thinks HAL 9000 and Skynet, and no amount of additional explanation will convince them otherwise.