this post was submitted on 27 Apr 2026
924 points (99.0% liked)

[–] fum@lemmy.world 5 points 35 minutes ago

This is absolutely hilarious. "AI" users getting what they deserve chef's kiss

[–] Wispy2891@lemmy.world 5 points 1 hour ago

To me it seems more criminal that the cloud provider has a "nuclear button" feature via the API that destroys everything including the backups with a single call and no confirmation whatsoever. What if the key gets accidentally leaked and someone wants to have fun?

[–] ZILtoid1991@lemmy.world 9 points 1 hour ago (1 children)

Always keep offline backup copies of your important data regardless of using AI slop to look over it! No, I don't care that "optical media is obsolete and e-waste!", or that "tapes are a 100 year old obsolete technology compared to cheap SSDs from TEMU!".

[–] PolarKraken@lemmy.dbzer0.com 4 points 1 hour ago (2 children)

Optical media? Is that a viable part of backup strategies? I would expect tapes for sure, sounds like you know more than me.

[–] ZILtoid1991@lemmy.world 4 points 51 minutes ago
  1. Better than not having an offline copy.
  2. Write-once: ransomware cannot delete or encrypt it.
  3. Drives are still cheap.

Downside is having techbros talk to you about laser rot, how internal drives are obstructing the optimal airflow in GAMING PC cases, and how Gabe Newell is based and stuff.

[–] katze@lemmy.4d2.org 4 points 1 hour ago

A quality disc can last 10 years or more. At a company I used to work at, the backups were burned to gold-coated discs. They had 15-year-old discs that still worked.

[–] pimpampoom@lemmy.zip 10 points 2 hours ago (1 children)

Did they write that title with AI too? Looks terrible.

[–] flightyhobler@lemmy.world 1 points 28 minutes ago

Not to mention the image

[–] prodaccess@lemmy.world 6 points 2 hours ago (2 children)

Like all interesting outages, there are probably multiple key action items.

I'm also curious why deleting a "staging volume" would affect prod. I don't know Railway, but it seems like a bad architectural design.

[–] PolarKraken@lemmy.dbzer0.com 2 points 1 hour ago* (last edited 1 hour ago)

Sounds like a responsible strategy to draw back from a lot of this. It's all so...effervescently remade, the "ecosystem", every few months.

For me the takeaway comes from time I spent in some safety-critical parts of engineering and personal hobbies. Ultimately relying on people to make good decisions ~all of the time isn't enough to prevent disaster, if something like disaster is on the line.

Systems must be engineered to remove possibilities for accidentally bad, in-the-moment human decisions, where it counts. Thoughtfully. This is the weird same-shape but exactly-opposite doppelganger of that set of best practices.

When the systems are using ~opaque automations that behave like humans (w.r.t. some decision-making and unreliable expectations of behavior) - and then relying on people making the right calls on top of that ever-shifting set of capabilities - I mean c'mon lol.

This is gonna happen a lot, while the carrot of go-faster remains dangling so unignorably (because it's in front of everyone, everyone working anywhere near the stuff). Until we look around and take a broader view. Which will be learned the same way we learned to make safety regulations, but I largely doubt our ability to respond in a similar way.

The money will eventually respond, of course, but that's always a poor and late proxy for what ought to be done.


Sidenote, for aspiring engineers, take heart!

It will be you who ends up tasked with digging out from under all the technical debt being incurred, truly. A practice steeped in the ancient wizardly traditions of yore: spending a career on that and building something better.

It will be necessary. The work began, roughly, a while ago lol, but it starts in earnest once things settle somewhat. Many large, slow organizations are right now very engaged in simply digging themselves out of the technical debt of a previous hype cycle, AKA now making use of all the data they collected (badly, via go-fast charlatans) during the "Big Data! You'll be left behind if you don't collect extreme amounts of data, it's cheapish and everyone else is doing it!" era.

[–] HakunaHafada@lemmy.dbzer0.com 2 points 2 hours ago

Just because it's not a best practice doesn't mean it's not being done.

[–] GreenKnight23@lemmy.world 16 points 3 hours ago
[–] skisnow@lemmy.ca 3 points 2 hours ago* (last edited 1 hour ago)

Wow there's a lot of people in this thread defending the LLM. “They just didn’t set it up right” gtfo

[–] Jaysyn@lemmy.world 12 points 4 hours ago (1 children)
[–] sundray@lemmus.org 1 points 1 hour ago

"If your prod can be deleted by your AI, it should be."

[–] BlackLaZoR@lemmy.world 9 points 5 hours ago

Learning from mistakes of people dumber than you isn't a thing these days. Prepare for one AI disaster after another

[–] WhatsHerBucket@lemmy.world 56 points 7 hours ago (2 children)

"That's ok, it will be great in robots with lethal weapons. What could go wrong? It'll be the greatest killing machine, like you've never seen before". 🫲 🍊 🫱

[–] _g_be@lemmy.world 11 points 5 hours ago

Incredible emoji

[–] Napster153@lemmy.world 2 points 5 hours ago (1 children)

Can we make sure Ted Faro suffers worse this time?

Being reduced to a mutant blob for, say, a few extra thousand years and maybe put in a zoo or something?

[–] Pman@lemmy.org 2 points 4 hours ago

Nah, but that's what he wanted. He's the truest form of tech bro: destroy the world, refuse to accept the consequences of his actions, weasel his way out of the situation, and manage, in the wake of unimaginable human suffering, to gain more power over people, complete with a god complex. Tell me that isn't some or all of the characteristics of people like Peter Thiel, Elon Musk, Mark Zuckerberg, Sundar Pichai, Bill Gates, hell, even Tim Cook and Steve Jobs before him. Punishment doesn't stop this sort of behavior; removing the possibility of anyone having that level of control over others is the only way. But the richest and most powerful have always sought ways of amassing more power, not realizing that it leads to worse outcomes for everyone, including themselves. Horizon did a great job encapsulating that trait in Faro, but whether it's him, the people behind Skynet, the Matrix, or whatever other tech dystopia the tech bros seem pathologically unable to not try to make happen in the worst way possible, that's only the beginning. They forget that even with advanced tech serving their needs and wants (which won't help their mental health), the people lower down on the rungs of society have brains, wants, and needs, and more expertise in all sorts of things than the 1%, except for mass exploitation.

This inevitably goes wrong in one of a few ways. One: everyone dies from the tech, or so many that societal collapse is inevitable, and even if society survives it can't functionally reconstitute itself. Two: they win and kill off or suppress enough of society that it becomes less productive, and instead of fighting the powerful, people flee or stop generating wealth for the rich wherever they don't have to, maybe to rise up again later, or the regional economy just ignores them completely and the government protects itself from its people more than anything else. Three: revolution, with terror campaigns against any and all who can be credibly accused of being part of the former tyranny. In all three cases the rich end up poorer overall, because wealth flees or dies in autocracy.

[–] dbtng@eviltoast.org 11 points 6 hours ago (1 children)

3-2-1.
It's really common for companies to not have an offsite backup. My own employer only offsites the customer data, not our core biz stuff. And I set up the offsite replication; it did not exist until I built it. (Proxmox Backup Server is tha best!)

[–] ClownStatue@piefed.social 2 points 3 hours ago

Seems like, if nothing else, AI might finally force corporate accountants to acknowledge that the cost of a good backup strategy is far outweighed by the cost of losing all your data because some MBA thought he could write a product update himself with Claude Code.

[–] percent@infosec.pub 29 points 8 hours ago (2 children)

Seems like they were operating with a pile of bad practices, then threw AI into the mix.

Neural networks are approximation algorithms. There's a reason LLMs are generally more productive with statically typed languages, TDD, etc. They need those feedback loops and guard rails, or they'll just carry on as if they never make mistakes (and those mistakes tend to compound).

If you want to use AI safely, you should be more defensive about it. It will fuck up; plan accordingly.
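One concrete shape that "be defensive" can take is to never let the agent call tools directly: route every proposed action through an allowlist, and treat anything destructive as blocked unless a human has approved it. A minimal sketch; the action names are invented, not from any real agent framework:

```python
# Hypothetical gatekeeper between an LLM agent and its tools: read-only
# actions pass through, destructive ones are refused without explicit
# human approval, and unknown actions are refused outright.

SAFE_ACTIONS = {"read_file", "list_tables", "run_tests"}
DESTRUCTIVE_ACTIONS = {"delete_volume", "drop_table", "rm_rf"}

def gate(action: str, approved: bool = False) -> str:
    if action in SAFE_ACTIONS:
        return f"executed {action}"
    if action in DESTRUCTIVE_ACTIONS:
        if approved:
            return f"executed {action} (human-approved)"
        return f"blocked {action}: needs human approval"
    # Default-deny: anything the allowlist doesn't know is refused.
    return f"blocked {action}: not on the allowlist"
```

The design choice is default-deny: the model's confidence never factors into whether a destructive call runs, which is exactly the guard rail that was missing in the story above.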

[–] Kage520@lemmy.world 12 points 7 hours ago (5 children)

There really should be a certification course for using AI safely. I'm slop-coding a hobby app and I'm shocked at how much it FEELS like it can do, because it can do amazing things, yet fails in the strangest ways. When it feels like it can get away with it, it forgets earlier discussions and moves on without them. So you can spend time hammering out a whole section of code, then move on, and the AI will rip out everything that references that code and, in the moment, think of a different approach and code that in instead. It won't be the same. It probably won't work, or at least won't pass all the test cases. But if you aren't paying attention and keep coding, your original part of the project is no longer functioning and you won't understand why. And every step of the way it's confident in its answers, so you won't suspect that it fundamentally no longer understands the project.

[–] ExFed@programming.dev 6 points 6 hours ago* (last edited 6 hours ago) (1 children)

As someone who started writing software over 20 years ago (yikes I feel old), I feel like a lot of the best practices I've come to appreciate are really just strategies for mitigating future pain or boring/uninspiring work. When you eliminate most of the cost of rewriting everything from scratch by a machine that feels nothing, then "best practices" kinda lose their meaning.

Edit: confusing sentence order.

[–] sturmblast@lemmy.world 6 points 6 hours ago (1 children)

It's gonna take your job... uh huh..

[–] droopy4096@lemmy.ca 2 points 2 hours ago

it actually will take your job... it will crash industries and economies a touch later, when all the knowledge and expertise is lost and we're left with a choice between rolling back a century or keep living in an enshittified society. And not because AI is useless, but because suits and Wall Street are chasing exponential profits from the snake oil the AI outfits are selling.

[–] LordCrom@lemmy.world 36 points 9 hours ago (1 children)

This was the exact plot of Silicon Valley when Son of Anton deleted the entire codebase as the most efficient way to remove bugs.

[–] Fmstrat@lemmy.world 76 points 10 hours ago (2 children)

This guy.

The PocketOS boss puts greater blame on Railway’s architecture than on the deranged AI agent for the database’s irretrievable destruction. Briefly, the cloud provider's API allows for destructive action without confirmation, it stores backups on the same volume as the source data, and “wiping a volume deletes all backups.” Crane also points out that CLI tokens have blanket permissions across environments.

Oh look, they have project level tokens: https://docs.railway.com/integrations/api#project-token

They chose to give it full account access, including to production. But ohhhh nooooo it's not MYYYY fault!

[–] chronicledmonocle@lemmy.world 64 points 10 hours ago (12 children)

Also backups stored on the SAME VOLUME as the prod data? How fucking stupid do you have to be?

[–] Mister_Hangman@lemmy.world 1 points 4 hours ago

Hope he gets sued for defamation now.

[–] thedeadwalking4242@lemmy.world 14 points 8 hours ago (1 children)

Gonna be honest: it's not a good backup if this can possibly happen. Like, LLM agents are dangerous, but if you can just delete everything in 9 seconds then you need to rethink your security practices. No one employee should have that much power.

[–] SabinStargem@lemmy.today 61 points 11 hours ago (8 children)

This isn't an AI problem, this is a "don't allow anyone to access your backups without following protocol" problem.

[–] alpha1beta@piefed.social -2 points 2 hours ago

It'll never happen to me. I ain't stupid enough to use this shit. I'm not stupid enough to make myself unneeded. But damn, a lot of fucking programmers, who I'm sure make 100k+ a year, are stupid fucks working to put themselves on the street by using this shit.
