this post was submitted on 12 Apr 2026
679 points (95.1% liked)

[–] wonderingwanderer@sopuli.xyz 1 points 10 hours ago (1 children)

That makes sense. I see the problem with that, and I don't have a good solution for it. It is a divergence of topic though, as we were discussing open-source programmers using LLMs which are potentially trained on closed-source code.

LLMs trained on open-source code are worth their own discussion, but I don't see how that fits in this thread. The post isn't about closed-source programmers using LLMs.

Besides, closed-source code developers could've been stealing open-source code all along. They don't really need AI to do that.

Still, training LLMs on open-source code is a questionable practice for that reason, particularly when it comes to training commercial models on GPL code. But it's probably hard to prove what code was used in their datasets, since it's closed-source.

[–] ricecake@sh.itjust.works 1 points 2 hours ago

I don't really see it as a divergence from the topic, since it's the other side of a developer not being responsible for the code the LLM produces, like you were saying.
In any case, it's not like conversations can't drift to adjacent topics.

> Besides, closed-source code developers could've been stealing open-source code all along. They don't really need AI to do that.

Yes, but that's the point of laundering something. Before, if you put FOSS code in your commercial product, a human could be deposed in the lawsuit and expose it, and then there would be consequences. Now you can openly do so and point at the LLM.

People don't launder money so they can spend it, they launder money so they can spend it openly.

Regardless, it wasn't even my comment; I just understood what they were saying, and I've already replied way out of proportion to how invested I am in the topic.