I know, but it's a ridiculous term. It's so bad it must have been invented, or at least chosen, to mislead people into thinking the model has a mind, and that seems to have worked, as the OP demonstrates
aberrate_junior_beatnik
ChatGPT does not "hallucinate" or "lie". It does not perceive, so it can't hallucinate. It has no intent, so it can't lie. It generates text without any regard to whether said text is true or false.
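A toy sketch of the point (the probability table here is invented for illustration, not a real model): generation is just sampling likely next tokens, and nothing in the loop ever checks whether the result is true.

```python
import random

# Hypothetical toy "model": next-token probabilities learned from text.
# Note "green" is false but still probable -- likelihood, not truth.
probs = {
    ("the", "sky"): {"is": 0.9, "was": 0.1},
    ("sky", "is"): {"blue": 0.6, "green": 0.4},
}

def next_token(context, table, rng):
    """Sample a continuation weighted by probability; truth never enters."""
    dist = table[context]
    tokens, weights = zip(*dist.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)  # fixed seed so the run is repeatable
out = ["the", "sky"]
for _ in range(2):
    out.append(next_token((out[-2], out[-1]), probs, rng))

print(" ".join(out))  # with this seed: "the sky is green" -- fluent, false
```

Obviously a real LLM samples from a neural network over tens of thousands of tokens, but the objective is the same shape: produce probable text, not true text.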
I still donate to ... uBlock
From https://ublockorigin.com/:
The uBlock Origin project still specifically refuses donations at this time
Who are you giving money to?
Ridiculous. This line is clearly gay.
Everything the Nazis did in the Third Reich was legal. People who resisted them were breaking the law. Maybe we should evaluate things by their impact (pollution/invasion of privacy) rather than their legality.
Technically true, since you could also just replace them with nothing
Executives believe nearly half of the skills that exist in today’s workforce won’t be relevant just two years from now, thanks to artificial intelligence.
-
Executives are such dumbasses
-
That is literally all this "study" did: ask people how many of their skills they think will become obsolete. The headline is ridiculous.
Want to exchange information in JSON? Plaintext? Binary data? Sockets can do it.
This is exactly why you need something like dbus. If you just have a socket, you know nothing about how the data is structured, what the communication protocol is, etc. dbus defines all this.
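A minimal sketch of that gap (Python, using `socketpair` as a stand-in for any local socket): the socket moves bytes and nothing else; both ends must agree out of band that the payload is UTF-8 JSON, which is exactly the kind of thing dbus standardizes for you.

```python
import json
import socket

# A raw socket carries opaque bytes -- no schema, no framing, no protocol.
parent, child = socket.socketpair()

msg = {"method": "Notify", "args": ["hello"]}
parent.sendall(json.dumps(msg).encode("utf-8"))
parent.close()

raw = child.recv(4096)       # just bytes; the socket tells you nothing
decoded = json.loads(raw)    # works only because both sides chose JSON
child.close()

print(decoded["method"])
```

If the sender had switched to, say, length-prefixed binary, the receiver above would break silently; with dbus the message format and method-call semantics are fixed by the bus, not by each pair of programs.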
Let's see... the upvote arrow is off, so I turn it on, and just walk away!
very old
Obviously it's subjective, but Debian doesn't use ancient software. For instance, Bookworm ships Python 3.11 while the current release is 3.12. Some software updates slowly enough that you end up with the latest version anyway; I seem to recall zsh being up to date. But yeah, make sure you're looking up docs for the version you actually have.
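One quick way to confirm which version you're actually on before trusting docs, from Python itself:

```python
import sys

# Report the interpreter that's actually running, e.g. "3.11" on Bookworm.
version = f"{sys.version_info.major}.{sys.version_info.minor}"
print(version)
```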
I'd be surprised to find out there was one filesystem that consistently did better than others in gaming performance. ext4 is a fine choice, though.
OP clearly expects LLMs to exhibit mind-like behaviors. Lying absolutely implies agency, but even if you don't agree, OP is clearly confused about what LLMs actually do.
The whole point of the post is that OP is upset that LLMs are generating falsehoods and parroting input back into their output. No one with a basic understanding of LLMs would be surprised by this. If someone said their phone's autocorrect was "lying", you'd be correct in assuming they didn't understand the basics of what autocorrect is, and would be completely justified in pointing out that that's nonsense.