[–] rimjob_rainer@discuss.tchncs.de 100 points 2 days ago* (last edited 2 days ago) (9 children)

I don't get it. AI is a tool. My CEO never cared what tools I used, as long as I got the job done. Why do they suddenly think they have to force us to use a certain tool to get the job done? They are clueless, yet they think they know what we need.

[–] Ensign_Crab@lemmy.world 12 points 1 day ago

Because like AI, your CEO is a tool.

[–] buddascrayon@lemmy.world 24 points 1 day ago (2 children)

Because unlike the other tools you use, the CEO of your company is investing millions of dollars into AI, and they want a big return on that investment.

[–] Rivalarrival@lemmy.today 4 points 1 day ago

I don't think these CEOs have quite figured out that LLM developers are creating something that can more easily replace a CEO than a developer.

[–] DarkSurferZA@lemmy.world 11 points 1 day ago (1 children)

Return? No, there is no return on investment from AI. If there really were a return to be had from devs, you wouldn't have to force them to use it.

This is a face-saving, ass-covering exercise. Option one is "We spent the money, nobody's using it, the bubble's gonna burst"; the other is "If we can ramp up the usage numbers before the earnings call, we can get some of that sweet investor money to buy us out of being mauled by our shareholders."

It's shitty management making shitty decisions to cover up their previous shitty decisions.

[–] buddascrayon@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

That's the point though. These CEOs don't know that there is not going to be an "AI revolution". They all think they're getting in on the ground floor of the next Google or Facebook. They genuinely believe that these "AIs" are going to revolutionize the Internet.

That's exactly why Elmo went from "AI is too dangerous and development on it must be stopped" to "I'm gonna build the best AI ever and I'll call it Grok, 'cause I want everyone to think I'm a relatable sci-fi nerd."

[–] DarkSurferZA@lemmy.world 1 points 21 hours ago

At this stage, I can't believe they are stupid enough to believe the shit that comes out of their mouths. Hence I say it's not about the return from AI, but about riding the bubble before it pops and justifying their stupidity to others by force-feeding the crap they spent their money on down our throats, in the hope that we don't just throw it back up on them.

[–] bless@lemmy.ml 58 points 2 days ago (3 children)

GitHub is owned by Microsoft, and Microsoft is forcing AI on all its employees.

[–] TeddE@lemmy.world 18 points 2 days ago (1 children)

Honestly, I've been recommending setting up a personal git store and cloning any project you like (see the sketch below). I imagine the next phase of this is Microsoft claiming that if Copilot 'assisted' all these projects, Microsoft is part owner of them, in a gambit to swallow and own open source.
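For anyone who wants to try this, here's a minimal sketch (not a definitive setup) of mirroring a repo to a self-hosted remote. It assumes git is installed and that you have your own backup remote; both URLs below are placeholders, not real endpoints.

```python
# Minimal sketch: keep a full mirror of an upstream repo on a self-hosted remote.
# Both URLs are placeholders; swap in your own upstream and backup locations.
import subprocess

SOURCE = "https://github.com/example/project.git"        # upstream repo (placeholder)
BACKUP = "ssh://git@git.example.lan/mirrors/project.git"  # self-hosted remote (placeholder)
WORKDIR = "project.git"

def mirror(source: str, backup: str, workdir: str) -> None:
    # Bare mirror clone: copies all refs (branches, tags) with no working tree.
    subprocess.run(["git", "clone", "--mirror", source, workdir], check=True)
    # Register the self-hosted remote and push every ref to it.
    subprocess.run(["git", "-C", workdir, "remote", "add", "backup", backup], check=True)
    subprocess.run(["git", "-C", workdir, "push", "--mirror", "backup"], check=True)

if __name__ == "__main__":
    mirror(SOURCE, BACKUP, WORKDIR)
```

Re-running `git fetch origin` followed by `git push --mirror backup` inside the mirror directory keeps the copy in sync afterwards.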

[–] 3dcadmin@lemmy.relayeasy.com 2 points 22 hours ago

There are loads of self-hosted alternatives if you need one.

[–] ksh@aussie.zone 3 points 1 day ago (1 children)

They all need to be sued for unethical "Embrace, Extend, and Extinguish" practices again.

[–] end_stage_ligma@lemmy.world 3 points 1 day ago (1 children)

Best I can do is a rusty, dull guillotine.

[–] 3dcadmin@lemmy.relayeasy.com 1 points 22 hours ago

Nah, rusty and serrated... makes way more mess. As an aside, the guillotine was designed for theatre; the mechanism makes that loud noise on purpose! Pointless to kill someone without a bit of theatre, don't ya think?

[–] Corkyskog@sh.itjust.works 6 points 2 days ago* (last edited 2 days ago) (1 children)

I am surprised they aren't embracing it... I would. You immediately get some vague non-person to blame all your failures on.

Employers aren't loyal enough for the average person to care about their company's well-being.

[–] rozodru@lemmy.world 7 points 2 days ago

I agree. Let them generate massive tech debt, because right now the majority of my clients have hired me to clean up their AI slop.

Is it bad for their users? Oh hell yes it is. Is it great for me and other consultants/freelancers? Hell yes it is. The best thing that's happened to my wallet recently is vibe coders. I love those dumb prompt monkeys.

[–] sobchak@programming.dev 16 points 1 day ago

I think part of it is that they believe they can train models off developers, then replace the developers with those models. The other part is that the company is heavily invested in coding LLMs and the tooling for them, so they are trying to hype them up.

[–] Jhex@lemmy.world 18 points 1 day ago

Why do they suddenly think they have to force us to use a certain tool to get the job done?

Not just that... why do they have to threaten and push people to use a tool that is allegedly fantastic and makes everything better and faster? The answer is that it does not work, but they need to pump the numbers to keep the bubble going.

[–] MajorasMaskForever@lemmy.world 14 points 1 day ago (1 children)

It's not about individual contributors using the right tools to get the job done. It's about needing fewer individual contributors in the first place.

If AI actually accomplishes what it's being sold as, a company can maintain or even increase its productivity with a fraction of its current spending on labor. Labor is one of the largest chunks of spending a company has, if not the largest, so cutting it greatly reduces spending, which means that for the same or higher income, net profit goes up. And as always, the line must go up.

tl;dr Modern Capitalism is why they care

[–] Tamo240@programming.dev 4 points 1 day ago

Alternatively, following their logic, they could keep the same number of people and achieve massively higher productivity. But they don't want that; they want to reduce the number of people having opinions and diluting the share pool, because it's not about productivity, it's about exerting control.

[–] 0x0@lemmy.zip 14 points 2 days ago

They are clueless, yet they think they know what we need.

An accurate description of most managers I've encountered.

[–] SaharaMaleikuhm@feddit.org 7 points 1 day ago

Because they make money selling you the AI. It's that simple.

[–] CeeBee_Eh@lemmy.world 1 points 1 day ago (1 children)

They are clueless, yet they think they know what we need.

AI make money line go up. It's not cluelessness; they're trying to sell a kind of snake oil (OK, not quite "snake oil", I don't think AI is entirely bad).

[–] ragas@lemmy.ml 1 points 1 day ago (2 children)

Snake oil is also not entirely bad. The placebo effect actually works.

[–] 3dcadmin@lemmy.relayeasy.com 1 points 22 hours ago

Snake oil made me sssssssick (yes, the dad jokes are weak today).

[–] CeeBee_Eh@lemmy.world 0 points 1 day ago

No, snake oil is extremely bad. It's a highly exploitative practice that preys on the desperation of sick people.

That's what "snake oil" refers to: exploiting someone by playing on their emotions.

The placebo effect actually works.

The placebo effect sometimes works, but only in very specific circumstances. A placebo will not cure cancer or heart disease.

It can help with things related to pain, since mental and emotional state can directly affect the severity of pain, and a placebo can sometimes marginally improve symptoms by reducing stress levels. That's also why placebos are used in drug trials: if a drug produces the same results as a placebo, it doesn't work. And that says a lot about what the placebo effect actually is. It's just a mental state change that gets expressed as reduced physiological stress.