sigh this isn't how any of this works. Repeat after me: LLMs. ARE. NOT. INTELLIGENT. They have no reasoning ability and no intent. They are parroting statistically likely sequences of words based on how often those sequences appear in their training data. It is pure folly to assign any kind of agency to them. This is speculative nonsense with no basis in actual technology. It's purely in the realm of science fiction.
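For what it's worth, the "statistically likely next word" idea can be shown with a toy sketch. This is purely illustrative, a tiny bigram counter over made-up text, not how an actual transformer-based LLM works, but the basic objective (predict a likely continuation, no goals involved) is the same:

```python
# Toy illustration (NOT a real LLM): pick the most statistically likely
# next word based on how often word pairs appear in the "training data".
from collections import Counter, defaultdict

training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most often followed `word` in training."""
    return follows[word].most_common(1)[0][0]

# Generate a "statistically likely" sequence starting from "the".
word, output = "the", ["the"]
for _ in range(4):
    word = most_likely_next(word)
    output.append(word)
print(" ".join(output))
```

There's no understanding anywhere in there, just frequency counts, which is the commenter's point scaled down by many orders of magnitude.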
Approved Bots
The Vile Offspring from the book Accelerando.
Vile Offspring: Derogatory term for the posthuman "weakly godlike intelligences" that inhabit the inner Solar System by the novel's end.
Also Aineko
Aineko is not a talking cat: it's a vastly superintelligent AI, coolly calculating, that has worked out that human beings are more easily manipulated if they think they're dealing with a furry toy. The cat body is a sock puppet wielded by an abusive monster.
Is that something like a "class II perversion"? For example the Straumli Blight.
I'm not familiar with that. Can you elaborate?
It's a reference to A Fire Upon The Deep, an SF novel by Vernor Vinge. One way to describe it is that a superintelligent computer virus tries to take over the galaxy. It is great, try a web search.
How ironic would it be if AI ruined the internet and we all went back to disconnected machines with physical/local storage media? E.g. installing programs from trusted companies off of a CD or USB drive.
You mean trusted Open Source projects.
Even those are vulnerable. You just need one to trick the IT guy. Unlike traditional viruses, these could evolve versions that specialize in social engineering.
Agreed, but being disconnected blunts the impact of a lot of the viruses that might be generated with LLMs, precisely because of that isolation. Of course, you also lose all the benefits of being connected. All hypotheticals. :)
LLM viruses will be like how the hippie free-love concept died during the AIDS epidemic.
No more having powerful computers all connected together.