this post was submitted on 24 Jul 2025
282 points (98.6% liked)

Technology

TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.

Q1 2025 was 2.5% worse than Q1 2024.

Q2 2025 was 2.8% worse than Q2 2024.

Not a great look.
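
For context on where numbers like that come from: the post doesn't include the underlying figures, but a rough sketch with made-up values shows how a "percent worse per mile" falls out of miles-per-crash style reporting:

```python
# Hypothetical sketch: how a year-over-year "worse per mile" percentage
# could be derived from miles-per-crash figures. The numbers below are
# invented for illustration; they are NOT Tesla's published data.

def yoy_change(miles_per_crash_now: float, miles_per_crash_prior: float) -> float:
    """Return the year-over-year change in crash rate per mile, as a percent.

    Crash rate per mile is the reciprocal of miles per crash, so a drop in
    miles-per-crash shows up here as a positive (worse) percentage.
    """
    rate_now = 1.0 / miles_per_crash_now
    rate_prior = 1.0 / miles_per_crash_prior
    return (rate_now - rate_prior) / rate_prior * 100.0

# Example with invented figures: 7.30M miles per crash vs 7.49M a year earlier
print(f"{yoy_change(7.30e6, 7.49e6):+.1f}% change in crashes per mile")  # ~ +2.6%
```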

[–] FishFace@lemmy.world 10 points 3 days ago (2 children)

End-to-end ML can be much better than hybrid (or fully rules-based) systems. But there's no guarantee and you have to actually measure the difference to be sure.
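
To make "actually measure" concrete (all names and numbers here are invented), the comparison would look something like incident rates per mile for both stacks under the same driving conditions:

```python
# Minimal sketch of comparing an end-to-end stack against a hybrid/rules-based
# stack on the same routes. Everything below is hypothetical.

from dataclasses import dataclass

@dataclass
class FleetStats:
    miles: float      # total miles driven under this system
    incidents: int    # safety-relevant events (crashes, forced disengagements, ...)

    @property
    def rate_per_million_miles(self) -> float:
        return self.incidents / self.miles * 1e6

e2e = FleetStats(miles=4.2e6, incidents=31)
hybrid = FleetStats(miles=3.9e6, incidents=27)

print(f"e2e:    {e2e.rate_per_million_miles:.2f} incidents / 1M miles")
print(f"hybrid: {hybrid.rate_per_million_miles:.2f} incidents / 1M miles")
# With counts this small the difference may be noise; a real comparison would
# put confidence intervals on both rates before declaring a winner.
```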

For safety-critical systems, I would also not want to commit fully to an e2e system, because the weaker explainability makes it much harder to be confident there's no strange failure mode you haven't spotted but which may be, or may become, unacceptably common. In that case, you'd want to be able to revert to a rules-based fallback that may once have looked worse-performing but has turned out to be better. That means you can't just delete and stop maintaining the rules-based code if you have any kind of long-term thinking. Hmm.
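
Sketched out (interfaces invented, just to show the shape of it), that means keeping the rules-based planner behind the same interface, so reverting the fleet is a flag flip rather than a rewrite:

```python
# Rough sketch of keeping a rules-based fallback alive behind the e2e planner.
# All class names and the scene/command formats are made up for illustration.

from typing import Protocol

class Planner(Protocol):
    def plan(self, scene: dict) -> dict: ...

class EndToEndPlanner:
    def plan(self, scene: dict) -> dict:
        # ... run the learned policy ...
        return {"steer": 0.0, "throttle": 0.10, "source": "e2e"}

class RulesBasedPlanner:
    def plan(self, scene: dict) -> dict:
        # ... classic perception plus hand-written behaviour rules ...
        return {"steer": 0.0, "throttle": 0.05, "source": "rules"}

class VehicleController:
    def __init__(self, primary: Planner, fallback: Planner, use_e2e: bool):
        self.primary = primary
        self.fallback = fallback
        self.use_e2e = use_e2e  # togglable flag: flip it to revert to the rules stack

    def step(self, scene: dict) -> dict:
        planner = self.primary if self.use_e2e else self.fallback
        return planner.plan(scene)
```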

[–] xthexder@l.sw0.com 2 points 2 days ago

I've thought about it in the past... what if there's a bug in an update and, under some specific conditions, the car just veers to the side and crashes? There's a possibility that every self-driving Tesla travelling west into a sunset suddenly slams on the brakes, causing a pile-up. Who knows what kind of edge cases could exist?
Even worse, what if someone hacks the wireless update and does something like this intentionally?

[–] match@pawb.social 2 points 3 days ago

yeah i wanna see what the fuck metrics made them think this was a good idea. what is their mean average precision. did they recall@1 for humans on the road
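
For what it's worth, the sort of number being asked for might look like this (all figures invented):

```python
# Hypothetical sketch of a per-class detection metric: recall for
# "human on the road" at a fixed matching threshold. Data is made up.

def recall(true_positives: int, false_negatives: int) -> float:
    """Fraction of ground-truth pedestrians the system actually detected."""
    return true_positives / (true_positives + false_negatives)

# e.g. 9,870 annotated pedestrians detected out of 10,000 in a validation set
print(f"pedestrian recall: {recall(9_870, 130):.4f}")  # 0.9870
# For a safety case, the interesting number is the 130 missed, not the 0.987.
```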