this post was submitted on 05 Sep 2024
925 points (96.7% liked)
Technology
Here is what kagi delivers with the same prompt:
NB: quick answer is only generated when ending your search with a question mark
Problem is, you can't trust that it's not hallucinating these stats
And even if it's showing the correct number, you can't be sure how trustworthy the source is.
This applies to any information though, it’s got nothing to do with LLMs specifically.
Not really, no. Sources of information gain a reputation as time goes on. So, even though you should still check with multiple sources, you can sort of know whether a certain bit of information is likely to be correct.
On the other hand, LLMs will quote different sources, and sometimes only if you ask them to. Even then, an LLM can hallucinate and cite a source that doesn't actually exist, so there's that as well.
At least it's citing sources, and you can check to make sure. From my anecdotal experience it has been pretty good so far. On some occasions it also told me the queried information was not found in its sources instead of just making something up. It's not perfect, for sure, and it's always better to do manual research, but for a first impression and for finding entry points, I've found it useful so far.
The problem is that you need to check those sources to make sure it's not just making up bullshit, and at that point you haven't gained anything from the GenAI.
As I said, the links provide entry points for further research. It's useful to me because I don't need to check every search result. But to each their own, and I understand the general scepticism of generative "AI".
If you don't check every source, it might just be bullshitting you. There are people who followed your approach and got into hot shit with their bosses and judges.
There is absolutely value in something compiling sources for you to personally review. Anyone who cannot use AI efficiently is analogous to someone who can't see the utility in a graphing calculator. It's not magic, it's a tool. And tools need to be used precisely, and for appropriate purposes.
If my plumber fucks up, I don't blame his wrench. If my lawyers don't vet their casework, I blame them.
It's an LLM. Odds are it's hallucinating the sources and they don't even exist.
Know what does compile sources for you that are guaranteed to exist and be related to what you're looking for..? A good old non-LLM-infected search engine.
If my plumber replaces their wrench for a rabid gerbil claiming it'll be just as good I'm definitely changing plumbers.
Spoken like someone who never even tried to use an LLM and just parrots the bad things they hear online.
Lemmy is full of LLM haters, I get where they're coming from but they take it to the extreme every single time.
I'm not an LLM hater. I run one of the biggest FOSS GenAI services. It's because of that that I know their limitations.
You said that you're not going to check every search result, which implies you're not checking every cited source either, which will eventually lead you to believe some LLM bullshit. And if you're using an LLM just to compile sources that you then check yourself, it's no different from a search engine without an LLM.
I'm a different person than the one you were replying to. I know that LLMs do hallucinate/lie/make up sources, but you can quickly check whether the sources are real.
The problem is that even if the sources are real, there's no guarantee the LLM actually used them.
I guess? I would also guess your GenAI is more of the NSFW variety than an LLM, given your alt and what I've seen federated from your server xD
So I definitely treat LLMs like enhanced search. I've gotten wrong info, but it pointed me in the right direction; in those instances it's still helpful, as long as you know it may well provide confidently inaccurate info.
my alt? what alt?
I assumed leftzero@lemmynsfw was you, since you replied to my reply to them as if it was you.
Maybe not tho lol Idk, both have zero in the name so it made sense xD
Nah but it's a discussion that spawned from my reply so I'm invested 😁
No, I'm not db0, but thanks for the compliment. 😅
I'm way too lazy (and broke) to set up and admin a Lemmy instance.
Aren't the sources the same as the search results? Or at least the top results?
When I query an AI, I always end with "provide sources and bibliography for your reply". That seems to get better replies.
That being said, I can't trust MKBHD is not hallucinating either.