this post was submitted on 17 Feb 2026
435 points (98.2% liked)

Technology


Dutch lawyers increasingly have to convince clients that they can’t rely on AI-generated legal advice because chatbots are often inaccurate, the Financieele Dagblad (FD) found when speaking to several law firms. A recent survey by Deloitte showed that 60 percent of law firms see clients trying to perform simple legal tasks with AI tools, hoping to achieve a faster turnaround or lower fees.

[–] WeavingSpider@lemmy.world 3 points 7 hours ago (1 children)

I understand what you mean, but... *looks at Birgenair 301 and Aeroperu 603* *looks at Qantas 72* *looks at the 737 Max 8 crashes* Planes have spat out false data, and of the five cases mentioned, only one avoided disaster.

It is down to the humans in the cockpit to filter through the data and know what can be trusted. That's not unlike working with LLMs, except cockpits have a two-person crew to catch errors and keep things safe.

[–] ToTheGraveMyLove@sh.itjust.works 3 points 6 hours ago (1 children)

So you found five examples in the history of human aviation. How often do you think AI hallucinates information? Because I can guarantee you it's a hell of a lot more frequently than that.

[–] WeavingSpider@lemmy.world 2 points 6 hours ago (1 children)

You should check out Air Crash Investigation, amigo, all 26 seasons. You'd be surprised what humans in metal life support machines can cause when systems break down.

[–] ToTheGraveMyLove@sh.itjust.works

I'm not watching 26 seasons of a TV show, ffs; I've got better things to do with my time. Skimming IMDb, though, I'm seeing a lot of different causes for the crashes: bad weather, machine failure, running out of fuel, improper maintenance, pilot error, etc. Remember, my point had nothing to do with mechanical failure. Any machine can fail. My point was that airplanes don't routinely spit out false information in the day-to-day operation of the machine the way AI does. You're getting into strawman territory, mate.