this post was submitted on 05 Jun 2024
408 points (96.6% liked)

[–] bamboo@lemm.ee 22 points 5 months ago (10 children)

I don’t think generative AI is going anywhere anytime soon. The hype will eventually die down, but it’s already proved its usefulness in many tasks.

[–] neshura@bookwormstory.social 16 points 5 months ago (9 children)

Is AI useful? Maybe. But is it profitable? AI will go the same way the dot-com bubble did: there will be a massive crash, and at the end of it you'll see who actually had their pants on.

[–] bamboo@lemm.ee -1 points 5 months ago (5 children)

It can be quite profitable. A ChatGPT subscription is $20/month right now, or $240/year. A software engineer in the US costs somewhere between $200k and $1M per year once salary, benefits, and support costs are considered. If that $200k engineer can use ChatGPT to save just 2.5 hours in a year, the subscription pays for itself.
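
A minimal sketch of that break-even arithmetic, assuming roughly 2,080 working hours per year and the $200k fully loaded cost from the comment (both figures are illustrative assumptions, not official data):

```python
# Break-even sketch for a $20/month subscription vs. engineer time saved.
SUBSCRIPTION_PER_YEAR = 20 * 12          # $240/year
ENGINEER_COST_PER_YEAR = 200_000         # assumed low end of the fully loaded range
WORKING_HOURS_PER_YEAR = 2_080           # assumed 52 weeks * 40 hours

hourly_cost = ENGINEER_COST_PER_YEAR / WORKING_HOURS_PER_YEAR   # ~$96/hour
break_even_hours = SUBSCRIPTION_PER_YEAR / hourly_cost          # ~2.5 hours/year

print(f"Hourly cost: ${hourly_cost:.2f}")
print(f"Hours saved per year to break even: {break_even_hours:.1f}")
```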

[–] frezik@midwest.social 4 points 5 months ago* (last edited 5 months ago) (1 children)

I've seen pull requests filled with ChatGPT code. I consider my dev job pretty safe.

[–] bamboo@lemm.ee 0 points 5 months ago

ChatGPT isn’t gonna replace software engineers anytime soon. It can increase productivity, though, and that’s the value LLMs provide. If someone made a shitty pull request filled with obvious ChatGPT output, that’s on them, not the technology. Blaming ChatGPT for a programmer’s bad code is like blaming the autocomplete in their editor: just because the editor suggests something doesn’t mean you have to accept it when it’s wrong.
