This shows the danger of expecting driver override to prevent accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually: they have to be constantly anticipating not only what other hazards (drivers, pedestrians, …) might be doing, but also in what ways their own vehicle may be trying to kill them.
Elon took the wheel because that person made a mean tweet about him
“Kill me” it said in a robotic voice that got slower, glitchier, and deeper as it drove off the road.
EXTERMINAAAAAATE!!!
The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.
What I don't get is how years of this false advertising haven't bankrupted Tesla already.
Because the US is an insane country where you can straight up just break the law and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing they'd have been shut down.
What I don't get is why teslas aren't banned all over the world for being so fundamentally unsafe.
I've argued this point the past year, there are obvious safety problems with Tesla, even without considering FSD.
Like the blinker on the steering wheel, manual door handles that are hard to find in emergencies, and distractions from common operations being buried behind menus on the screen instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead! Which can also create dangerous situations.
Well, because 99% of the time, it's fairly decent. That 1%'ll getchya tho.
To put your number into perspective, if it only failed 1 time in every hundred miles, it would kill you multiple times a week with the average commute distance.
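The arithmetic behind that claim can be sketched quickly. The commute distance (~16 miles each way) and the five-day week are assumed figures for illustration, not numbers from the thread:

```python
# Back-of-envelope check: a failure rate of 1 per 100 miles against an
# assumed average commute. Commute length and days/week are assumptions.
failures_per_mile = 1 / 100        # claimed rate: 1 failure every 100 miles
daily_commute_miles = 32           # assumed ~16 miles each way, round trip
commute_days_per_week = 5          # assumed standard work week

failures_per_week = failures_per_mile * daily_commute_miles * commute_days_per_week
print(f"{failures_per_week:.1f} potential fatal failures per week")  # 1.6
```

Even with these modest assumptions the rate lands well above once a week, which is the point of the comment.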
Many Tesla owners are definitely dead many times, on the inside.
Someone who doesn't understand math downvoted you. This is the right framework to understand autonomy, the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated non-negative value in a credible way.
You are trying to judge the self-driving feature in a vacuum, and you can't do that. You need to compare it to the alternatives. And for automotive travel, the alternative to FSD is to continue to have everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn't need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about that, FSD was statistically that "bit better" than you.
FSD isn't perfect. No such system will ever be perfect. But, the goal isn't perfect, it just needs to be better than you.
Yeah, people keep bringing that up as a counter-argument, but I'm pretty certain humans don't swerve off a perfectly straight road into a tree all that often.
So unless you have numbers to suggest that humans are less safe than FSD then you're being equally obtuse.
What is the failure rate? Unless you know that you can’t make that claim.
It absolutely fails miserably fairly often, though, and would likely crash that frequently without human intervention. Not to the extent here, where there isn't even time for human intervention, but I frequently had to take over when I used to use it (post v13).
Those probably aren't the actual odds, but a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.
Let's say it's only a 0.01% risk; that's still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and wouldn't involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to the things they crashed into (lamp posts, shop windows, etc.), would be so high as to exceed any benefit of the technology.
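For a sense of how a 0.01% per-trip risk scales to "several thousand crashes per year": the fleet size and trips-per-day below are assumed round numbers for illustration, not actual Tesla figures:

```python
# Rough scaling of a small per-trip failure rate across a fleet.
# Fleet size and trip count are assumptions, not real-world data.
failure_rate_per_trip = 0.0001     # the 0.01% risk from the comment
fleet_size = 100_000               # assumed vehicles using the system
trips_per_day = 2                  # assumed: one trip there, one back

crashes_per_year = failure_rate_per_trip * fleet_size * trips_per_day * 365
print(f"~{crashes_per_year:,.0f} crashes per year")  # ~7,300
```

Even a risk that sounds tiny per trip multiplies out to thousands of incidents once a large fleet drives it every day.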
It wouldn't be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they're never going to add lidar scanners, so it's literally never going to get any better; it's always going to be this bad.
Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.
The biggest thing you're missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it's in FSD Supervised mode doesn't mean you should just sit back and watch it drive you off the road into a lake.
You're saying this on a video where it drove into a tree and flipped over. There isn't time for a human to react. That's like saying we don't need emergency stops on chainsaws, the operator just needs to not drop it.
To be fair, that grey tree trunk looked a lot like a road
It's fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don't stop for pedestrians or drive off a cliff. So freaking what, that's the price for progress my friend!
I'd like to think this is unnecessary but just in case here's a /s for y'all.
GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless
It got the most recent update, and thought a tunnel was a wall.
... and a tree was a painting.
I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s
Why would you inflict that guy on a poor innocent kitty?