this post was submitted on 08 Apr 2026
237 points (98.0% liked)

Technology

[–] partofthevoice@lemmy.zip 19 points 1 day ago (8 children)

LLMs don’t try anything. They are deterministic tools.

[–] vk6flab@lemmy.radio 7 points 22 hours ago (7 children)

While I understand your point, deterministic with a billion variables is beyond human ability to process, let alone the multi-billion parameter models in general circulation today.

At what point does deterministic descend into random?

Assumed Intelligence is a solution for a bunch of multivariate problems, like say "the travelling salesman", but it's not intelligence, nor, in my opinion, is it effectively "deterministic".

[–] partofthevoice@lemmy.zip 2 points 17 hours ago (6 children)

> While I understand your point, deterministic with a billion variables is beyond human ability to process, let alone the multi-billion parameter models in general circulation today.

Fair enough. There’s a significant difference in complexity between the surface implication of what I said and reality. Yes, it’s deterministic, but it’s also complex enough that something more should be said… though we need to be careful here. Our language is not mature enough to scaffold the precise concepts we need, and attempting it anyway risks smuggling in concepts we did not intend. Concepts like intent, for example. I agree with you, but cautiously.

> At what point does deterministic descend into random?

It shouldn’t at any point. Instead, we’re discussing a system that’s similar to the double pendulum or three body problem. It’s deterministic, though computationally irreducible. That’s chaotic, but it is not random. It’s extremely sensitive to initial conditions.
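
The chaotic-but-deterministic distinction can be illustrated with the logistic map, a standard toy example (not an LLM; the function and values below are purely illustrative):

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n).
# With r = 4 the map is chaotic: fully deterministic, yet
# extremely sensitive to initial conditions.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturb the tenth decimal place

# Same rule, nearly identical start: the trajectories begin together,
# then the tiny perturbation compounds until they are uncorrelated.
print(abs(a[1] - b[1]))    # still tiny after one step
print(abs(a[-1] - b[-1]))  # many orders of magnitude larger by the end
```

Rerunning either call reproduces its trajectory exactly, which is the sense in which "chaotic" differs from "random".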

[–] Junkasaurus@lemmy.world 1 point 15 hours ago (2 children)

What are you saying, precisely? It’s well known that LLMs have non-deterministic output (Ilya Sutskever even claims as much). Are you saying the way it goes about retrieving tokens is deterministic?

[–] partofthevoice@lemmy.zip 1 point 13 hours ago (1 children)

I think you’re right about that, but it’s artificial nondeterminism, in the sense that it comes from several algorithmic factors (the sampling step, chiefly) and, more subtly, from device differences such as the order of floating-point operations. The system itself is a complex yet deterministic function.
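
That "artificial nondeterminism" can be made concrete with a toy sampler. The vocabulary and probabilities below are made up, but the shape mirrors an LLM's sampling step: pin the RNG seed and the apparent nondeterminism disappears.

```python
import random

# Toy next-token sampler over a hypothetical four-word vocabulary.
vocab = ["the", "cat", "sat", "mat"]
probs = [0.5, 0.3, 0.15, 0.05]

def sample_tokens(seed, n=10):
    # A private RNG with a fixed seed makes every run reproducible.
    rng = random.Random(seed)
    return [rng.choices(vocab, weights=probs)[0] for _ in range(n)]

# The sampling looks random, but the same seed always yields
# the same sequence of tokens.
assert sample_tokens(42) == sample_tokens(42)
```

In production systems the seed is usually not pinned (and hardware effects add further variation), which is where the observed run-to-run differences come from.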

[–] Junkasaurus@lemmy.world 1 point 6 hours ago (1 children)

I can largely agree with that, but I still contend you’re conflating a few things to make that argument. Fundamentally, an LLM makes predictions based on probability (ignoring temperature), and probability does not equal certainty.

[–] partofthevoice@lemmy.zip 1 point 4 hours ago

I would argue that’s empirically true but not fundamentally true. Actually, I’d argue that my point is the fundamental truth here. Computers still cannot generate truly random output on their own. They simulate the process with pseudorandom generators, and it’s not truly random. It’s just good enough to fool us at the surface level.
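
A minimal illustration of that point: the classic linear congruential generator, the textbook pseudorandom scheme (a sketch for illustration, not what any particular LLM stack actually uses). Each "random" number is a pure function of the previous state.

```python
# Linear congruential generator: state' = (a * state + c) mod m.
# The entire output stream is fixed by the seed; nothing here is
# random, it only looks that way without knowledge of the seed.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    out, state = [], seed
    for _ in range(n):
        state = (a * state + c) % m
        out.append(state / m)  # scale into [0, 1)
    return out

# Deterministic: the same seed replays the same "random" stream.
assert lcg(7, 5) == lcg(7, 5)
```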

[–] vk6flab@lemmy.radio 1 point 13 hours ago (1 children)

They are deterministic but complex to determine.

The Assumed Intelligence systems I'm familiar with have a "random" element, but it's unclear where that randomness comes from. Is it a computational (pseudorandom) source, or something like the lava lamp wall at Cloudflare, which is significantly more random, potentially actually random?

[–] Junkasaurus@lemmy.world 1 point 6 hours ago

It’s primarily temperature. That said, there is still a chance that an LLM outputs unexpected values, even at low temperatures.
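
Temperature works by rescaling the logits before the softmax. A small sketch (with made-up logit values) shows why lowering the temperature sharpens the distribution without ever zeroing out the unlikely tokens:

```python
import math

def softmax(logits, temperature):
    # Divide logits by temperature, then normalize; subtracting the
    # max keeps the exponentials numerically stable.
    scaled = [x / temperature for x in logits]
    mx = max(scaled)
    exps = [math.exp(s - mx) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 2.0, 1.0]  # hypothetical scores for three tokens
hot = softmax(logits, temperature=1.0)
cold = softmax(logits, temperature=0.2)

# Lower temperature piles probability onto the top token...
assert cold[0] > hot[0]
# ...but the tail never reaches exactly zero, so "unexpected"
# outputs stay possible at any finite temperature.
assert min(cold) > 0.0
```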
