this post was submitted on 14 Feb 2024
481 points (98.6% liked)

Last year, two Waymo robotaxis in Phoenix "made contact" with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles' software. A "recall" in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn't pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn't elaborate on what it meant by saying that its robotaxis "made contact" with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren't carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to "persistent orientation mismatch" between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
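
For a rough sense of what such a fix might involve, here is a minimal sketch (all names, thresholds, and structure are hypothetical, not Waymo's actual code) of how a prediction module could flag a persistent mismatch between a tracked vehicle's estimated heading and its direction of travel, as with a backwards-facing towed pickup, and fall back to a more conservative prediction:

```python
# Hypothetical sketch only: one way a prediction module could detect the
# "persistent orientation mismatch" described above, where a towed pickup's
# body heading points opposite to its direction of travel.
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    heading_rad: float   # orientation estimated from the object's shape
    motion_rad: float    # direction of travel estimated from its trajectory
    speed_mps: float

def orientation_mismatch(obj: TrackedObject, threshold_rad: float = math.pi / 2) -> bool:
    """True if the object's body orientation disagrees with its motion direction."""
    diff = abs((obj.heading_rad - obj.motion_rad + math.pi) % (2 * math.pi) - math.pi)
    return obj.speed_mps > 1.0 and diff > threshold_rad

def predict(obj: TrackedObject) -> str:
    # If heading and motion persistently disagree (e.g. a backwards-facing
    # towed vehicle), don't trust a heading-based motion model; fall back to
    # extrapolating the observed motion and keep extra clearance.
    if orientation_mismatch(obj):
        return "conservative: extrapolate observed motion, increase clearance"
    return "nominal: heading-based motion model"

# Example: a pickup facing backwards (heading pi) while being towed forward (motion 0).
towed_pickup = TrackedObject(heading_rad=math.pi, motion_rad=0.0, speed_mps=8.0)
print(predict(towed_pickup))  # -> conservative branch
```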

[–] bstix@feddit.dk 70 points 9 months ago (27 children)

The company says the truck was being towed improperly

Shit happens on the road. It's still not a great idea to drive into it.

The company developed and validated a fix for its software to prevent similar incidents

So their plan is to fix one accident at a time…

[–] Chozo@kbin.social 12 points 9 months ago (3 children)

So their plan is to fix one accident at a time…

Well how else would you do it?

[–] bstix@feddit.dk 28 points 9 months ago (3 children)

You drive a car and can't quite figure out what is happening in front of you.

Do you:

  • A: Turn up the music and plow right through.
  • B: Slow down (potentially to a full stop) and assess the situation.
  • C: Slow down, close your eyes, and continue driving slowly into the obstacle.
  • D: Sound the horn and flash the lights.

From the description offered in the article, the car chose C, which is wrong.
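
Option B is essentially a confidence-gated fallback. A minimal illustrative sketch (the function, names, and thresholds are made up for the example, not any vendor's API):

```python
# Illustrative only: a confidence-gated fallback, i.e. option B above.
def choose_action(scene_confidence: float, obstacle_ahead: bool) -> str:
    if obstacle_ahead and scene_confidence < 0.8:
        # Can't confidently explain what's in front: slow down, possibly to a
        # stop, and keep re-assessing instead of committing to a prediction.
        return "reduce_speed_and_reassess"
    if obstacle_ahead:
        return "plan_around_obstacle"
    return "proceed"
```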

[–] lengau@midwest.social 15 points 9 months ago (1 children)

Given the millions of global road deaths annually, I think B is probably the least popular answer.

[–] Tetsuo@jlai.lu 0 points 9 months ago

Honestly, slowing down too much can easily create an accident that didn't exist in the first place.

Not every situation can be handled by slowing down.

If that's the default behavior on a high-speed road, it could be deadly for the car behind you.

[–] HeyThisIsntTheYMCA@lemmy.world 2 points 9 months ago

I mean that's machine learning for ya

[–] Chozo@kbin.social 1 points 9 months ago (1 children)

I wasn't asking about the car's logic algorithm; we all know that the SDC made an error, since it [checks notes] hit another car. We already know it didn't do the correct thing. I was asking how else you think the developers should be working on the software other than one thing at a time. That seemed like a weird criticism.

[–] bstix@feddit.dk 6 points 9 months ago

Sorry, I didn't answer your question. Consider the following instead:

Your self-driving car has crashed into a goddamn tow truck with a backwards-facing truck.

Do you:

  • A: Program your car to deal differently with fucking backwards-facing trucks on tow trucks.
  • B: Go back to question one and make your self-driving car pass a simple theory test.

According to the article, the company has chosen A, which is wrong.

[–] Kecessa@sh.itjust.works 4 points 9 months ago

Radars > Don't hit stuff

[–] Turun@feddit.de 1 points 9 months ago* (last edited 9 months ago)

Ideally they wouldn't need actual accidents to find errors, but would discover such issues in QA and automated testing. Not hitting anything sounds like a manageable goal, to be honest.
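
For illustration, catching this kind of case in QA could look like a scenario regression test along these lines (the simulator interface here is invented for the sketch, not a real API):

```python
# Hypothetical scenario regression test; the simulate() call is a stand-in
# for replaying a logged or scripted scene, not an actual Waymo interface.
import math

def simulate(scenario: dict) -> dict:
    """Stand-in for a simulator run; returns closest approach and collision flag."""
    return {"min_clearance_m": 1.7, "collision": False}

def test_backwards_towed_pickup_is_avoided():
    scenario = {
        "description": "pickup towed backwards, angled across a turn lane and a traffic lane",
        "towed_vehicle_heading_rad": math.pi,  # facing opposite its motion
        "tow_speed_mps": 8.0,
    }
    result = simulate(scenario)
    assert not result["collision"]
    assert result["min_clearance_m"] > 1.0
```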
