this post was submitted on 05 Feb 2025
174 points (82.0% liked)
Technology
"You're holding it wrong"
This but actually. Don't use an LLM for things LLMs are known to be bad at. Companies would do well to list out specifically what their tools are bad at, so that using them doesn't require background knowledge, not unlike needing to somehow know that one corner of those old iPhones was an antenna and that you shouldn't bridge it.
Yup, the problem with that iPhone (the 4?) wasn't that it sucked, but that it had limitations. Put a case on it and the problem went away.
LLMs are pretty good at a number of tasks, and they're also pretty bad at a number of tasks. They're pretty good at summarizing, but don't trust the summary to be accurate, just to give you a decent idea of what something is about. They're pretty good at generating code, just don't trust the code to be perfect.
You wouldn't use a chainsaw to build a table, but it's pretty good at making big things into small things, and cleaning up the details later with a more refined tool is the way to go.
That is called being terrible at summarizing.
That depends on how you use it. If you need the information from an article but don't want to read it, I agree, an LLM is probably the wrong tool. If you have several articles and want to decide which one has the information you need, an LLM is a pretty good option.
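That triage pattern is simple enough to sketch. Here's a minimal Python illustration; `ask_llm` is a hypothetical stand-in for whatever model call you'd actually make (the stub below just keyword-matches so the sketch runs without any API):

```python
def ask_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM call. A crude keyword
    # match keeps this sketch runnable without an API key.
    return "yes" if "battery" in prompt.lower() else "no"

def pick_relevant(articles: dict[str, str], question: str) -> list[str]:
    """Ask the model, per article, whether it likely answers the question.

    The model only points you at candidates; you still read and
    verify the winning article yourself.
    """
    relevant = []
    for title, text in articles.items():
        prompt = (
            f"Question: {question}\n"
            f"Article: {text}\n"
            "Does this article likely answer the question? Reply yes or no."
        )
        if ask_llm(prompt).strip().lower().startswith("yes"):
            relevant.append(title)
    return relevant

articles = {
    "Phone teardown": "We measured battery capacity and charge cycles.",
    "Launch recap": "The keynote covered pricing and release dates.",
}
print(pick_relevant(articles, "How long does a full charge last?"))
# → ['Phone teardown']
```

The point of the design is that a wrong answer is cheap: a false positive costs you one skim, and a false negative just means you fall back to reading more articles, which is where you started anyway.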
I think there's a fundamental difference between someone saying "you're holding your phone wrong, of course you're not getting a signal" to millions of people and someone saying "LLMs aren't good at that task you're asking it to perform, but they are good for XYZ."
If someone is using a hammer to cut down a tree, they're going to have a bad time. A hammer is not a useful tool for that job.