They'll work perfectly as soon as AI space data center robots go to Mars. I'd say a Robovan will be able to tow a roadster from New York to Hong Kong by... probably July. July or November at the latest.
Optical recognition is inferior, and this is not surprising.
Yeah that's well known by now. However, safety through additional radar sensors costs money and they can't have that.
Nah, that one's on Elon just being a stubborn bitch and thinking he knows better than everybody else (as usual).

He's right in that if current AI models were genuinely intelligent in the way humans are, then cameras would be enough to achieve at least human-level driving skills. The problem, of course, is that AI models are not nearly at that level yet.
Even if they were, would it not be better to give the car better senses?
Humans don't have LIDAR because we can't just hook something into a human's brain and have it work. If you can do that with a self-driving car, why cut it down to human senses?
I agree it would be better. I'm just saying that, in theory, cameras are all that would be required to achieve human-level performance, so long as the AI was capable enough.
I don't think it's necessarily about cost. They were removing sensors both before costs rose and supply became more limited with things like the tariffs.
Too many sensors also cause issues; adding more is not an easy fix. Sensor fusion is a notoriously difficult part of robotics. It can help with edge cases and verification, but it can also exacerbate issues. Sensors will report different things at some point. Which one gets priority? Is a sensor failing or reporting inaccurate data? How do you determine what is inaccurate if the data is still within normal tolerances?
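To make the "which sensor is lying?" problem concrete, here's a toy sketch (nothing like what a real autonomy stack uses; all function names are made up): fuse independent range estimates with a variance-weighted average, then flag any sensor whose reading sits far from the fused value. Notice that a confident-but-wrong sensor drags the fused estimate toward itself, so *every* sensor can end up flagged and you still can't tell which one actually failed.

```python
import math

def fuse(readings):
    """Variance-weighted fusion of independent estimates of one quantity.

    readings: list of (value, variance) pairs, one per sensor.
    Returns (fused_value, fused_variance).
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(v * w for (v, _), w in zip(readings, weights)) / total
    return fused, 1.0 / total

def disagreement(readings, sigma=3.0):
    """Flag sensors more than `sigma` standard deviations from the
    fused estimate. Returns (fused_value, list_of_flagged_indices)."""
    fused, _ = fuse(readings)
    flagged = [i for i, (v, var) in enumerate(readings)
               if abs(v - fused) > sigma * math.sqrt(var)]
    return fused, flagged
```

With agreeing sensors, `fuse([(10.0, 1.0), (10.2, 1.0)])` lands at 10.1 and nothing gets flagged. Feed it one confident outlier, e.g. `disagreement([(10.0, 0.04), (10.0, 0.04), (15.0, 0.04)])`, and all three sensors exceed the threshold: within-tolerance-but-wrong data poisons the very baseline you'd use to detect it.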
More on topic though... My question is why is the robotaxi accident rate different from the regular FSD rate? Ostensibly they should be nearly identical.
I'm not too sure it's about cost, it seems to be about Elon not wanting to admit he was wrong, as he made a big point of lidar being useless
just one more AI model, please, that’ll do it, just one more, just you wait, have you seen how fast things are improving? Just one more. C'mon, just one more…
They're 4 times as capable ~~of crashing~~ as a human driver. How efficient!
Are people who are involved in a crash getting settlements? Sounds like a potential payday, with obviously risky odds.
Who insures these things?
Tesla
Darwin just getting ever more creative over time.
How often are they just bursting into flames for no reason?
4x as often as a human I'd expect
I didn't realise spontaneous human combustion was still so prevalent!
It happens all the time. Especially to drummers.
Spontaneous human combustion only occurs if the human is also carrying a Galaxy Note 7 LOL
My S7 battery pillowed on me. Didn't burst into flames thankfully, lol.


I mean, people are dying. Including the people who didn't pay for it. So, kind of a bigger deal than that.