this post was submitted on 25 Feb 2026
15 points (61.9% liked)
Technology
you are viewing a single comment's thread
And "hallucination" is also an inaccurate humanization of what's actually happening: "a statistical relationship that we AI folks don't like."
"Hallucinations" even include accurate data.
It is a trash marketing buzzword.
Did you know that there is no sex going on in a breeder reactor?
https://en.wikipedia.org/wiki/Breeder_reactor
They're analogies to help us communicate ideas.
A breeder reactor is creating something, which is like the outcome of breeding. That name fits.
A hallucination is seeing something that's not there, which also fits.
In AI, a "hallucination" is just as much "there" as a non-"hallucination." It's a way for scientists to stomp their foot and say that the wrong output is the computer's fault and not a natural consequence of how LLMs work.
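That point can be illustrated with a toy sketch (illustrative only: the prompt, tokens, and probabilities are made up, and this is a stand-in for an LLM's softmax output, not a real model). Whether the sampled answer happens to be true or false, it comes from the exact same sampling step:

```python
import random

# Toy next-token distribution, standing in for an LLM's softmax output
# for a prompt like "The capital of Australia is". Hypothetical numbers.
next_token_probs = {
    "Canberra": 0.55,    # factually correct
    "Sydney": 0.40,      # factually wrong, but statistically plausible
    "Melbourne": 0.05,   # also wrong
}

def sample_next_token(probs, rng):
    """Sample one token -- the same mechanism regardless of truth value."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
draws = [sample_next_token(next_token_probs, rng) for _ in range(1000)]

# Nothing in the mechanism distinguishes the "correct" answer from the
# "hallucinated" one; both are ordinary samples from the same distribution.
print(draws.count("Canberra"), draws.count("Sydney"), draws.count("Melbourne"))
```

The "hallucinations" are not a malfunction bolted onto an otherwise-correct process; they are the process working as designed on a distribution that happens to assign weight to false statements.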
Hallucination requires perception. LLMs are just statistical models and do not have perception.
It was a cute name early on, now it is used to deflect when the output is just plain wrong.