Hallucinating is a fancy term for BEING WRONG.
An unreliable bullshit generator is still unreliable. Imagine that!
AI doesn't know what's wrong or correct. It hallucinates every answer. It's up to the supervisor to determine whether it's wrong or correct.
Mathematically verifying the correctness of these algorithms is a hard problem. That's intentional: it's the trade-off for their incredible efficiency.
Besides, it can only "know" what it has been trained on. It shouldn't be surprising that it cannot answer questions about the Trump shooting. Anyone who thinks otherwise simply doesn't know how to use these models.
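To make the "it hallucinates every answer" point concrete, here's a toy sketch in Python. Everything in it is invented for illustration (the prompts, the probabilities, the whole "model"); the point is only that right and wrong answers fall out of the exact same sampling step, with no truth check anywhere:

    import random

    # A toy "language model": a lookup table of invented next-token
    # probabilities, standing in for whatever a real model absorbed
    # from its training data.
    NEXT_TOKEN = {
        "The capital of France is": [("Paris", 0.92), ("Lyon", 0.05), ("Nice", 0.03)],
        "The capital of Australia is": [("Sydney", 0.60), ("Canberra", 0.35), ("Melbourne", 0.05)],
    }

    def sample_next(prompt: str) -> str:
        # Sample one continuation, weighted by probability. There is no
        # notion of true or false here: "Paris" (correct) and "Sydney"
        # (wrong) are produced by the exact same mechanism.
        tokens, weights = zip(*NEXT_TOKEN[prompt])
        return random.choices(tokens, weights=weights, k=1)[0]

    for prompt in NEXT_TOKEN:
        print(prompt, sample_next(prompt))

Run it a few times: the second prompt mostly comes back "Sydney", because that's the most popular continuation in the made-up training data, not because it's right.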
It is impossible to mathematically determine if something is correct. Literally impossible.
At best, the most popular answer, even if it's narrowed down to reliable sources, is what it can spit out. Even that isn't the same thing as consensus, because AI is not intelligent.
If the 'supervisor' has to determine if it is right and wrong, what is the point of AI as a source of knowledge?
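Following on from the toy sketch above (same invented numbers, nothing measured from any real model): greedy decoding literally returns the most popular continuation, and the only way to grade it is an external answer key, i.e. the supervisor.

    # Invented next-token probabilities, purely for illustration.
    NEXT_TOKEN = {
        "The capital of Australia is": [("Sydney", 0.60), ("Canberra", 0.35), ("Melbourne", 0.05)],
    }

    # The supervisor's external answer key. Nothing inside the "model"
    # itself can play this role.
    GROUND_TRUTH = {
        "The capital of Australia is": "Canberra",
    }

    def most_popular(prompt: str) -> str:
        # Greedy decoding: take the single highest-probability token.
        return max(NEXT_TOKEN[prompt], key=lambda pair: pair[1])[0]

    for prompt, expected in GROUND_TRUTH.items():
        answer = most_popular(prompt)
        verdict = "correct" if answer == expected else "wrong"
        print(f"{prompt} {answer} ({verdict})")  # -> ... Sydney (wrong)

Popularity picked the answer; only the external key could tell us it was wrong. That's exactly the supervisor's job, and exactly why the model alone isn't a source of knowledge.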
We should understand that 99.9% of what we say, think, and believe is whatever feels good to us, which we then rationalize with very faulty reasoning, and that's only when we're really challenged! You know how I came up with these words? I hallucinated them. It's just a guided hallucination. People with certain mental illnesses are less guided by their senses.

We aren't magic, and I don't get why it's so hard for humans to accept that any individual is nearly useless for figuring anything out alone. We have to work as agents too, so why do we expect an early-days LLM to be perfect? It's so odd to me. A computer is trying to understand our made-up bullshit. A logic machine trying to comprehend bullshit. It's amazing it even appears to understand anything at all.
The word "hallucination" means literally anything you want it to. Cool, cool. Very valiant of you.