Google promised a better search experience — now it’s telling us to put glue on our pizza
(www.theverge.com)
Every AI "answer" I've gotten from Google is factually incorrect, often ludicrously so.
Yep, same here. Whereas ChatGPT and Perplexity would tell me they didn't know the answer to my question, Bard/Gemini would confidently hallucinate some bullshit.
Really? Like what? I've always had ChatGPT give confident answers. I haven't tried to stump it with anything really technical though.
I asked about a plot point that I didn't understand in a TV series old enough to be in an LLM's training data. ChatGPT and Perplexity both said they couldn't find any discussions or explanations online for my particular question.
Bard/Gemini gave several explanations, all of them featuring characters, locations, and situations from the show, but all of it confident bullshit that was definitely impossible in the story's world.