this post was submitted on 12 Apr 2026
685 points (94.8% liked)
Technology
you are viewing a single comment's thread
You can actually set it up to give the same outputs given the same inputs (temperature = 0). The variability is on purpose
You can, and that will produce the same output for the same input as long as there's no variation in floating-point rounding errors. (That holds if exactly the same code runs each time, but when optimizing it's easy to hit a different round up/down, and if two tokens' scores are very close the output will diverge.)
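A concrete toy version of the rounding point: float addition isn't associative, so reordering a reduction (which optimized kernels and compilers do) can change the last bit of a score, and with a near-tied rival token that's enough to flip the result. The 0.6 rival score here is a made-up example, not from any real model:

```python
# Float addition is not associative: the same three numbers summed in a
# different order give results that differ in the last bit.
a, b, c = 0.1, 0.2, 0.3
run1 = (a + b) + c   # one reduction order
run2 = a + (b + c)   # another order, e.g. after a kernel/compiler change
assert run1 != run2

# With a hypothetical rival token whose score is exactly 0.6, that last-bit
# difference changes which token "wins" the comparison:
rival = 0.6
winner_run1 = 0 if run1 > rival else 1
winner_run2 = 0 if run2 > rival else 1
assert winner_run1 != winner_run2  # the output diverges between runs
```

And once one token diverges, everything generated after it diverges too, since each token feeds back into the next step.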
The point that people (or LLMs arguing against LLMs) miss is that the world is not deterministic, and humans are not deterministic (at least in any practical way at the human scale). If a system is deterministic, you should indeed not use an LLM. An LLM's power is in how it provides answers from messy data. If you need repeatability, write a script / code etc.
(Note: I do think that if the output is for human use, it's important a human validate that it's useful. LLMs can help brainstorm, and with some tests can manage a surprising amount of code, but if you don't validate and test the code it will be slop, and may work for one test but not for a generic user.)
There are more aspects to the randomness, such as race conditions and, apparently, intentionally nondeterministic tie-breaking when tokens have the same probability.
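The tie-breaking part can be sketched too (a toy example, assuming nothing about any real implementation): when two tokens are exactly tied, a plain argmax deterministically takes the first one, while a policy that breaks ties randomly is nondeterministic even at temperature 0.

```python
import random

logits = [1.5, 1.5, 0.2]  # tokens 0 and 1 are exactly tied

# Plain argmax: the first maximum wins, same answer every run.
greedy = max(range(len(logits)), key=lambda i: logits[i])
assert greedy == 0

# A sampler that breaks ties randomly among the maxima is nondeterministic
# even with greedy (temperature 0) decoding:
def greedy_with_random_ties(logits, rng):
    top = max(logits)
    return rng.choice([i for i, l in enumerate(logits) if l == top])

picks = {greedy_with_random_ties(logits, random.Random(seed)) for seed in range(100)}
assert picks == {0, 1}  # different runs pick different tied tokens
```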
I actually think LLMs are ill-suited for the vast majority of things people are currently using them for, and there are obviously the ethical problems with data centers bringing new fossil-fuel power sources online, but the technology is interesting in and of itself.
This is not hard stuff to understand, if you understand computing.
This is not the definition of determinism. You are adding qualifications.
I did look it up, and I see now there are other factors that aren't under your control if you're using a remote system. So I'll amend my statement: you can have deterministic inference systems, but the big ones most people use cannot be configured by the user to be deterministic.
You don't have to understand a deterministic system for it to be deterministic. You are making that up.
I conceded that setting temperature to 0 for an arbitrary system (including all the remote ones most people are using) does not make it deterministic, after reading about other factors that influence inference in those systems. That does not mean there are no deterministic implementations of LLM inference, and repeating yourself with NO additional information and using CAPS does NOT make you more CORRECT lol.
In my initial response to you I said I was wrong in that my statement was overly broad and not applicable to the systems most people are using, then clarified that nondeterminism is not an intrinsic characteristic of the technology at large, but that the most-used implementations have it.
You apparently think conversations are a battle with winners and losers, so the fact that you were right that the biggest systems are nondeterministic for reasons beyond temperature configuration means, to you, that it doesn't matter why, doesn't matter that those factors don't have to apply to every inference system, and doesn't matter that you have no idea what determinism means.
In any case talking to you seems like a waste of time, so enjoy your sad victory lap while I block you so I don't make the mistake of engaging you assuming you're an earnest interlocutor in the future.