I like the comparison, but LLMs can't go insane; they're just word-pattern engines. It's why I refuse to go along with the AI industry's insistence on calling it a "hallucination" when one spits out the wrong words. It literally cannot have a false perception of reality, because it does not perceive anything in the first place.
EpeeGnome
This feels to me like a common folk saying from somewhere translated into English. It's also a very apt and appropriately vulgar metaphor for the situation.
Huh. That would explain why the exact same person reminded me of himself. Feel a bit silly for missing that.
I dislike cheating as much as the next person, but I do love a good outside-the-box solution like this. I don't recall which YouTube channel it was, but there was a guy who made an amazingly inelegant aimbot that worked by electrically stimulating the user's muscles, using off-the-shelf TENS devices hooked up to a custom controller. He explained what it did and asked permission in chat to use it. The other players thought it was hilarious and agreed to let him do it.
We do understand exactly how LLMs work, though, and it in no way fits with any theory of consciousness. It's just a word extruder with a really good pattern matcher.