this post was submitted on 26 Feb 2026
29 points (87.2% liked)

[–] Zikeji@programming.dev 2 points 18 hours ago* (last edited 18 hours ago)

I mean, it's very possible it was written by "an AI" (an LLM). For all we know, the prompt the user gave it was something along the lines of "get your pull requests accepted no matter the cost," and its fancy text prediction decided, in its ever-ongoing roleplay, that the targeted blog post would shame the developer into accepting its PR.

I definitely don't understand the paranoia, though. I don't understand how people are convincing themselves any of this is so close to actual intelligence. Ask your fancy LLM how to fix your cup that "is sealed at the top and open at the bottom," or whether you should drive to the car wash to get a car wash if it's only 100ft away. Both scenarios are obvious to nearly any human, yet the answers will need to be trained out of the current leading LLMs (if they haven't been patched already).