this post was submitted on 15 May 2025
305 points (98.4% liked)
Technology
The word “hallucination” has zero implication of intent whatsoever. Last time I checked, a hallucination is an entirely involuntary experience, regardless of the context the word is used in.
They are called hallucinations in computer science not “to romanticize” them. They are called that because the output is essentially random from the perspective of the input: when there is no logical path from input to output, it is similar to a human hallucinating. A human hallucinating a dragon isn't seeing any actual visual stimulus that would produce one, so the input from their eyes has no bearing on what they imagine is there.
This is different from “fabrication” in that an AI intentionally creating fake info based on your request would not be a hallucination, because there would be a relationship between input and output.
While you say you prefer “fabrication”, that word actually implies an intent that is absent from what we are referring to as AI hallucinations.
I meant that "fabrication" doesn't imply intent the way "lie" does.
It seems like you're using the term "hallucination" correctly, for when the output has no relation to the input.
In this case, as in numerous others, the AI took the input "cite a source" and, as requested, output a citation, but it invented the content of the source. It fabricated, which means to make up, to create.
"Fabricate" does not imply an intent to deceive, whereas "lie" does.
I will agree that if the output is purely unrelated to the input, "hallucination" is still fine, but it is absolutely a romanticized term when we're referring to this computer-generated output. It's literally personification.
Everything an LLM outputs is hallucinated. That's how it works. Sometimes the hallucination matches reality, sometimes it doesn't.
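To make that concrete, here is a minimal sketch of the decoding loop at the heart of an LLM. The toy vocabulary and the `fake_logits` function are invented stand-ins (a real model computes logits with a neural network conditioned on the whole context), but the sampling step is the point: every token, whether it ends up being true or false, is drawn from the same probability distribution, and nothing in the loop checks facts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary; a real LLM has tens of thousands of tokens.
vocab = ["Paris", "London", "Atlantis", "is", "the", "capital", "."]

def fake_logits(context):
    # Stand-in for a model's forward pass: one score per vocabulary token.
    # A real model conditions these scores on the context; random scores
    # are enough to show the mechanism.
    return rng.normal(size=len(vocab))

def sample_next_token(context, temperature=0.8):
    logits = fake_logits(context)
    # Softmax with temperature turns scores into a probability distribution.
    probs = np.exp(logits / temperature)
    probs /= probs.sum()
    # The next token is sampled from that distribution. "Paris" and
    # "Atlantis" are produced by exactly the same mechanism; there is no
    # separate "hallucination mode".
    return str(rng.choice(vocab, p=probs))

context = ["The", "capital", "of", "France", "is"]
for _ in range(5):
    context.append(sample_next_token(context))
print(" ".join(context))
```

Greedy decoding (always taking the highest-probability token) removes the randomness but not the underlying point: the model is still predicting plausible continuations, not verifying claims.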