this post was submitted on 28 Jan 2026
1136 points (99.0% liked)

Technology

[–] gerryflap@feddit.nl 1 points 10 hours ago (2 children)

For some issues, especially ones related to programming and Linux, I feel like I kinda have to at this point. Google seems to have become useless, and DDG was never great to begin with but is arguably better than Google now. I've had some very obscure issues that I spent quite some time searching for, only to drop them into ChatGPT and get a link to some random forum post that discusses them. The biggest one was a Linux kernel regression that had been posted about that same day somewhere in the Arch Linux forums. Despite having a hunch about what it could be and searching/struggling for over an hour, I couldn't find anything. ChatGPT then managed to link me the post (and a suggested fix: switching to the LTS kernel) in less than a minute.

For general purpose search tho, hell no. If I want to know factual data that's easy to find I'll rely on the good old search engine. And even if I have to use an LLM, I don't really trust it unless it gives me links to the information or I can verify that what it says is true.

[–] A_norny_mousse@feddit.org 6 points 8 hours ago (2 children)

> programming and Linux

Almost daily I see the fuck-ups that result from somebody trying to fix something with ChatGPT and then coming to the forums because it didn't work.

[–] NewNewAugustEast@lemmy.zip 1 points 4 hours ago

I agree that happens, but it has nothing to do with what OP said. They didn't want a solution; they wanted a link to where the problem was being discussed so they could work out a solution themselves.

People seem to really confuse the difference between asking an LLM how to patch a boat and asking it where people discussed ways to patch a boat.

[–] Honytawk@feddit.nl 1 points 4 hours ago

Most likely because if they had come directly to whatever platform you're on with their problem, they would have been scolded for not trying hard enough to solve it on their own. Or the post would have been closed because the question had already been asked.

[–] Cherry@piefed.social 2 points 9 hours ago

Yup, this is a great example. LLMs are fine for non-opinion-based stuff, or for stuff that's not essential for life. They're great for finding a recipe, but if you're gonna rely on the internet or an LLM to help you form an opinion on something that requires objective thinking, then no. If I asked the internet or an LLM "is humour good or bad?", it would give me a swayed view.

It simply can't be trusted. I can't even trust it to return shopping links, so I have retreated back to real life. If it can't play fair, I no longer use it as a tool.