this post was submitted on 18 Oct 2024
782 points (98.4% liked)

The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] rsuri@lemmy.world 60 points 1 month ago* (last edited 1 month ago) (16 children)

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

This of course assumes 1) that cameras are just as good as eyes (they're not), and 2) that a machine can replicate the processing the human brain does on visual data, which seems highly dubious given that we only partially understand how humans turn visual input into driving decisions.

Finally, it assumes that the current rate of human-caused crashes is acceptable. It isn't. We tolerate those crashes because we can't upgrade people without unrealistic expense. In an automated system, if a bit of additional hardware can significantly reduce crashes, it's irrational not to add it.
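A rough way to see why redundant sensors pay off: if each sensor's failure modes were independent (a simplification; real sensor errors correlate, and the miss rates below are purely hypothetical), the chance that *every* sensor misses an obstacle is the product of the individual miss rates:

```python
from math import prod

def combined_miss_rate(miss_rates):
    """Probability that all sensors miss, assuming independent failures."""
    return prod(miss_rates)

# Hypothetical numbers for illustration only -- not real sensor specs.
camera_only = combined_miss_rate([0.01])          # camera misses 1% of obstacles
camera_plus_lidar = combined_miss_rate([0.01, 0.02])  # lidar misses 2%

print(camera_only)        # 0.01  -> 1 miss in 100
print(camera_plus_lidar)  # ~0.0002 -> 1 miss in 5000
```

Even with a second sensor that's individually *worse* than the camera, the combined system misses far less often, because the failure conditions (sun glare vs. fog vs. dust) differ between sensor types.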

[–] Revan343@lemmy.ca 8 points 1 month ago

Regarding point number 2, I have no doubt we'll be able to develop systems that process visual/video data as well as or better than people. I just know we aren't there yet, and Tesla certainly isn't.

I like to come at the argument from the other direction, though: humans drive with eyesight because that's all we have. If I could be equipped with sonar, radar, or lidar, of fucking course I'd use it. Wouldn't you?
