this post was submitted on 18 Oct 2025
589 points (97.3% liked)

Technology

[–] Fedizen@lemmy.world 7 points 1 hour ago (1 children)

Intentionally breaking laws sounds like something that should be prosecuted.

[–] innermachine@lemmy.world 3 points 31 minutes ago (1 children)

It will always fall on the driver of the vehicle, as it should. I don't care how self-driving your car is: it has a steering wheel, an accelerator pedal, and a brake pedal, and in the driver's seat YOU are responsible for how you operate your vehicle. If you decide to trust a self-driving feature, that's YOUR mistake. I would love to blame all these crashes on Tesla, but the reality is that all these driver aids and self-driving cars having accidents is proof that you should be the one in control of your own vehicle. No crying about how the automotive nannies didn't stop you from crashing the vehicle you're driving; take responsibility. Don't like it? Don't trust the "self-driving" nonsense (read: glorified advanced cruise control). Now, one thing I don't agree with is advertising it as self-driving, and I strongly believe self-driving vehicles on public roadways should be ILLEGAL!

[–] Fedizen@lemmy.world 2 points 10 minutes ago

Except the problem here is that Tesla is lying about a product to encourage people to use it illegally and unsafely. At some point there are extra deaths to blame solely on Tesla's lies.

[–] cupcakezealot@piefed.blahaj.zone 21 points 17 hours ago

states just need to ban teslas from public roads

[–] TwinTitans@lemmy.world 21 points 21 hours ago (1 children)

Next will be teslas without airbags. Because why not.

[–] rumba@lemmy.zip 1 points 7 minutes ago

TBF, if they're going to lock you in a burning car, unable to exit, killing you with the dashboard or steering column would be a mercy compared to burning to death inside.

Replace the airbag with a lethal injection, perhaps?

/s

[–] PalmTreeIsBestTree@lemmy.world 46 points 1 day ago (1 children)

Adaptive cruise control is good enough for most people and has been a proven technology for 20+ years. FSD is just downright dangerous.

[–] humorlessrepost@lemmy.world 14 points 23 hours ago (4 children)

I like my car’s version, which is just adaptive cruise control combined with lidar maps of major roads for lane-centering. I can go on a road trip and not touch the gas, brake, or steering wheel for hours, but I have to drive it myself through residential neighborhoods.

[–] firewyre@lemmy.world 65 points 1 day ago (3 children)

Why do we continue to allow this company to exist and break the law?

[–] Jeremyward@lemmy.world 31 points 1 day ago
[–] cley_faye@lemmy.world 28 points 1 day ago (1 children)
[–] Salamanderwizard@lemmy.world 10 points 22 hours ago

By these powers combined...

[–] Mwa@thelemmy.club 11 points 21 hours ago

The incompetence of this guy, holy.

[–] webdox@lemmy.world 48 points 1 day ago (1 children)

He's been real quiet lately. No more talks of release the list or the America Party. Did his K plug go on vacation?

[–] Zron@lemmy.world 23 points 1 day ago (3 children)

Now Trump is black-bagging US citizens in broad daylight, and Musk is an immigrant who fully admitted that he originally entered the country illegally. The danger must have crept its way through his ketamine-addled brain.

[–] architect@thelemmy.club 6 points 2 hours ago

I can promise you he doesn’t even think about that.

He’s busy with his little Internet cult right now and I’m assuming some other malicious bullshit to fuck us with.

[–] DicJacobus@lemmy.world 10 points 1 day ago

Musk also has a private army of security contractors (and someone like him probably has mercenaries off killing people in other countries too)

He's far too much trouble to go after if you're DHS, regardless of whether it's a Trump DHS or a Democrat DHS. Someone like that is ungovernable.

[–] mhague@lemmy.world 26 points 1 day ago (6 children)

Need jammers to confuse and break Teslas. They're weapons designed to break laws and protect occupants at the expense of bystanders. Can't be mad if a bystander redirects your Tesla into a ditch.

[–] dogs0n@sh.itjust.works 18 points 1 day ago (1 children)

> protect occupants

It doesn't even do that. Crash a Tesla and start a fire, and it will gladly lock you in the car.

[–] modus@lemmy.world 4 points 21 hours ago

Can't have too many witnesses after all.

[–] kureta@lemmy.ml 62 points 1 day ago

They finally did it. They automated crime 👍

[–] MummysLittleBloodSlut@lemmy.blahaj.zone 100 points 1 day ago (11 children)

When a self driving car breaks the law, the CEO should get the demerit points on their own licence, and if they lose their licence, the cars can't drive anymore.

[–] thepompe@ttrpg.network 2 points 1 hour ago

Man, holding the company financially accountable for all traffic violations would be magnificent.

It's a shame we're too stupid/weak to pull it off.

[–] dan1101@lemmy.world 11 points 1 day ago (1 children)

Somebody needs to be responsible; otherwise, ban self-driving until someone figures it out. Impound the vehicle if need be.
