this post was submitted on 18 Oct 2024
782 points (98.4% liked)

Technology


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] rsuri@lemmy.world 60 points 1 month ago* (last edited 1 month ago) (16 children)

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

This of course assumes 1) that cameras are just as good as eyes (they're not) and 2) that the processing of visual data that the human brain does can be replicated by a machine, which seems highly dubious given that we only partially understand how humans process visual data to make decisions.

Finally, it assumes that the current rate of human-caused crashes is acceptable. Which it isn't. We tolerate human-caused crashes because we can't meaningfully improve human drivers without unrealistic expense. In an automated system, if a bit of additional hardware can significantly reduce crashes, it's irrational not to add it.

[–] TheKMAP@lemmynsfw.com 3 points 1 month ago* (last edited 1 month ago) (1 children)

If the camera system + software results in being 1% safer than a human, and a given human can't afford the lidar version, society is still better off with the human using the camera-based FSD than driving manually. Elon being a piece of shit doesn't detract from this fact.
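To make the expected-value argument concrete, here's a quick sketch. Every number in it is made up purely for illustration; only the "1% safer" relative figure comes from the comment above:

```python
# Hypothetical illustration: even a small relative safety gain adds up at scale.
# All rates are invented for the example, not real crash statistics.
human_rate = 1000                    # crashes per billion miles (made up)
fsd_rate = human_rate * 99 // 100    # the "1% safer" assumption -> 990

fleet_miles = 10                     # billions of miles driven (made up)

expected_human = human_rate * fleet_miles   # crashes if everyone drives manually
expected_fsd = fsd_rate * fleet_miles       # crashes if everyone uses the camera system
crashes_avoided = expected_human - expected_fsd

print(expected_human, expected_fsd, crashes_avoided)  # 10000 9900 100
```

Even a 1% edge avoids 100 crashes over 10 billion miles in this toy model — the whole argument rests on that 1% figure actually holding up.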

But, yes, there are a lot of "ifs" in there, and obviously he did this to cut costs or simplify the supply chain or whatever.

Lidar or other tech will become more relevant once we've raised the floor (everyone gets the added safety over manual driving) and other FSD systems go mainstream (competition).

[–] skyspydude1@lemmy.world 1 point 1 month ago

The thing is, you don't need FSD to do that. Having a really good AEB (automatic emergency braking) system massively improves safety, far more than a convenience feature like FSD does. But they fucked that up by removing the radar, so the system now performs far worse at night, hence running over pedestrians and other vulnerable road users (VRUs) far more often.

But you can't grift billions out of investors with a really good safety feature, so you hack together a system from hardware originally only meant for adaptive cruise and lane keeping, so tech bros can show off on YouTube and hopefully not run over a cyclist, all to keep that grift rolling.
