this post was submitted on 17 Jan 2024
669 points (98.3% liked)

Technology

[–] northendtrooper@lemmy.ca 19 points 2 years ago (4 children)

I'm finding that LLMs do a better job of searching for new things. If I have a question, instead of going to Google or Bing I'll go to ChatGPT, ask the question, and have it include some sources for further reading.

I never thought I'd need to use AI to answer a simple search, and yet here we are, because the sole purpose of a search engine doesn't really exist anymore.

[–] chaogomu@kbin.social 79 points 2 years ago (4 children)

The problem is, you can't trust ChatGPT to not lie to you.

And since generative AI is now being used all over the place, you just can't trust anything unless you know damn well that a human entered the info, and even then it's a coin flip.

[–] lolcatnip@reddthat.com 19 points 2 years ago (1 children)

OTOH, you also can't trust humans not to lie to you.

[–] chaogomu@kbin.social 8 points 2 years ago

That's the coin flip.

[–] Lmaydev@programming.dev 9 points 2 years ago (1 children)

The newer ones search the internet, generate answers from the results rather than from their training data, and provide sources.

So that's not such a worry now.

Anyone who used ChatGPT for information and not text generation was always using it wrong.
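
For anyone curious what that pattern looks like in practice, here's a rough sketch of "search first, then generate with citations" (retrieval-augmented generation). The `web_search()` helper, the prompt wording, and the model name are placeholders rather than what any particular product actually does; only the OpenAI chat-completions call itself is a real API:

```python
# Sketch of the "search the internet, then generate from the results" pattern.
# web_search() is a hypothetical stand-in for whatever search API you have.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def web_search(query: str) -> list[dict]:
    """Hypothetical helper: return a list of {"url": ..., "snippet": ...} results."""
    raise NotImplementedError("plug in your search provider here")


def answer_with_sources(question: str) -> str:
    results = web_search(question)
    # Number each snippet so the model can cite it as [1], [2], ...
    context = "\n".join(
        f"[{i + 1}] {r['url']}\n{r['snippet']}" for i, r in enumerate(results)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": "Answer using only the numbered sources below and cite "
                           "them like [1]. Say so if they don't contain the answer.",
            },
            {"role": "user", "content": f"Sources:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```

The model still generates the final text, so the citations are what make it checkable: you can click through and verify instead of trusting the summary.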

[–] BakerBagel@midwest.social 13 points 2 years ago (1 children)

Except people are using LLMs to generate web pages about anything that gets clicks. Which means LLMs are training on information generated by other LLMs. It's an ouroboros of fake information.

[–] Lmaydev@programming.dev 3 points 2 years ago* (last edited 2 years ago)

But again, if you're using an LLM's ability to understand and generate text via a search engine, that doesn't matter.

LLMs are not supposed to give factual answers. That's not their purpose at all.

[–] _number8_@lemmy.world 4 points 2 years ago (1 children)

plus search engines don't lecture me as much for typing naughty sex words

[–] Speculater@lemmy.world 2 points 2 years ago

Get on the unfiltered LLM train: they'll do anything GPT does and won't filter anything. Bonus if you run it locally and share with the community.
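
If you want to go the local route, a minimal sketch with llama-cpp-python looks something like this (the model path is a placeholder for whatever GGUF file you've downloaded; tools like Ollama get you to the same place):

```python
# Minimal local-inference sketch using llama-cpp-python.
# The model path is a placeholder for a GGUF file you've downloaded yourself.
from llama_cpp import Llama

llm = Llama(model_path="./models/some-model.Q4_K_M.gguf", n_ctx=4096)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain how search engines rank pages."}],
    max_tokens=256,
)
print(output["choices"][0]["message"]["content"])
```

Everything runs on your own machine, so there's no server-side filter sitting between you and the model.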

[–] notapantsday@feddit.de 1 points 2 years ago

However, I find it much easier to check if the given answer is correct, instead of having to find the answer myself.

[–] LWD@lemm.ee 22 points 2 years ago* (last edited 1 year ago) (1 children)
[–] Odelay42@lemmy.world 0 points 2 years ago (1 children)

How much do you think it should cost to use?

[–] LWD@lemm.ee 12 points 2 years ago* (last edited 1 year ago)
[–] Landless2029@lemmy.world 6 points 2 years ago

Oddly I prefer Bing because it'll cite the source!

[–] centof@lemm.ee 2 points 2 years ago

Perplexity is great for this. It gives like 5 links in addition to the text answer, so it's imo the best of both worlds.