this post was submitted on 02 Mar 2025
183 points (89.6% liked)

Technology

[–] Bloomcole@lemmy.world 5 points 5 days ago

garbage in - garbage out

[–] lemmie689@lemmy.sdf.org 65 points 1 week ago (44 children)

Gotta quit anthropomorphising machines. It takes free will to be a psychopath, all else is just imitating.

[–] KeenFlame@feddit.nu -2 points 6 days ago (1 children)
[–] lemmie689@lemmy.sdf.org 3 points 6 days ago (1 children)
[–] KeenFlame@feddit.nu -2 points 5 days ago (1 children)

To imitate or fit the training data. It's useful.

[–] lemmie689@lemmy.sdf.org 5 points 5 days ago (1 children)

I don't think it's useful to anthropomorphise it.

[–] KeenFlame@feddit.nu -1 points 5 days ago (1 children)
[–] lemmie689@lemmy.sdf.org 2 points 4 days ago (1 children)

You would have to look up the meaning of anthropomorphism if it's not clear.

[–] KeenFlame@feddit.nu 0 points 4 days ago

I know what it means, I just don't understand what you are referring to? Who has anthropomorphised it?

[–] Australis13@fedia.io 38 points 1 week ago (1 children)

This makes me suspect that the LLM has noticed the pattern between fascist tendencies and poor cybersecurity, e.g. right-wing parties undermining encryption, most of the things Musk does, etc.

Here in Australia, the more conservative of the two larger parties has consistently undermined privacy and cybersecurity by implementing policies such as metadata collection and mandated government backdoors/the ability to break encryption, and it is slowly getting more authoritarian (or it's becoming more obvious).

Stands to reason that the LLM, with such a huge dataset at its disposal, might more readily pick up on these correlations than a human does.

[–] AffineConnection@lemmy.world 1 points 5 days ago* (last edited 5 days ago) (1 children)

No, there is no technical reason whatsoever why an LLM of all things would make that connection.

[–] Australis13@fedia.io 2 points 5 days ago

Why? LLMs are built by training machine learning models on vast amounts of text data; essentially, they look for patterns. We've seen this repeatedly with other behaviour from LLMs regarding race and gender, highlighting the underlying bias in the dataset. This would be no different, unless you're disputing that there is a possible correlation between bad code and fascist/racist/sexist tendencies?
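A minimal sketch of that claim, using a toy next-token model rather than a real LLM; the corpus, words, and correlation here are all invented for illustration:

```python
from collections import Counter, defaultdict

# Invented toy corpus: each sentence pairs a coding style with an attitude,
# so the data itself contains the correlation. Nothing here is real training data.
corpus = [
    "insecure code ignores input validation",
    "insecure code ignores user consent",
    "secure code respects input validation",
    "secure code respects user consent",
]

# Count how often each word follows a given pair of words (a trigram model).
model = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b, c in zip(words, words[1:], words[2:]):
        model[(a, b)][c] += 1

def predict(w1, w2):
    """Return the most frequent continuation seen after (w1, w2)."""
    return model[(w1, w2)].most_common(1)[0][0]

# The model reproduces whatever correlation is baked into its data:
print(predict("insecure", "code"))  # -> ignores
print(predict("secure", "code"))    # -> respects
```

A real LLM is vastly more complex, but the mechanism is the same in kind: if two things co-occur in the training text, the model will tend to reproduce that association.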
