Deconceptualist

joined 1 year ago
[–] Deconceptualist@lemm.ee 14 points 4 months ago* (last edited 4 months ago) (1 children)

I mean, if you're down to NetBSD as your pick, you've probably already made some big concessions, so plugging into Ethernet isn't a huge leap at that point.

[–] Deconceptualist@lemm.ee 29 points 4 months ago (3 children)

Most likely yes, as many others have said. Of course, you'll have to pick a very lightweight DE.

As a fallback there is always NetBSD.

[–] Deconceptualist@lemm.ee 2 points 4 months ago* (last edited 4 months ago)

I almost forgot about SimEarth. For some reason I was allowed to play it in grade school computer lab. I wish they would remake it so I can recreate my sentient cephalopod uprising, except with graphics that aren't complete ass.

I never played SimLife. But no, Spore is really not like SimEarth at all. As the other person said, Spore is disappointingly shallow on all levels.

[–] Deconceptualist@lemm.ee 21 points 4 months ago (6 children)

Does it have to be set in America? I'd think the genre could work almost anywhere with technological cities.

[–] Deconceptualist@lemm.ee 13 points 5 months ago* (last edited 5 months ago) (2 children)

That's not the whole story. "The dog swam across the ocean" is a grammatically valid sentence with correct word order. But you probably wouldn't write it, because you have a concept of what a dog actually is and know its physiological limitations make the sentence ridiculous.

The LLMs don't have that kind of smarts. They just blindly mirror what we do. Since humans generally don't put those specific words together, the LLMs avoid them too, based solely on probability. If lots of people started making bold claims about oceanfaring canids (e.g. as a joke), then the LLMs would absolutely jump on board with no critical thinking of their own.
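To make the "based solely on probability" point concrete, here's a toy bigram model: it only counts which word follows which, with zero concept of dogs or oceans. The corpus sentences are made up for illustration, and real LLMs use vastly more sophisticated models than bigram counts, but the statistical principle is the same: flood the training text with joke claims and the "most likely" continuation flips.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows each other word."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Return the statistically most common continuation -- no meaning involved."""
    return counts[word].most_common(1)[0][0]

# In ordinary text, dogs swim in ponds.
normal = "the dog swam in the pond " * 5
model = train_bigrams(normal)
print(most_likely_next(model, "swam"))  # -> in

# If enough people joke about oceanfaring canids, the statistics flip.
joked = normal + "the dog swam across the ocean " * 8
model = train_bigrams(joked)
print(most_likely_next(model, "swam"))  # -> across
```

Nothing in the model "knows" the second continuation is absurd; it just became the more frequent word pattern.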

[–] Deconceptualist@lemm.ee 4 points 5 months ago (1 children)

Ok, maybe there's a possibility someday with that approach. But that doesn't reflect my understanding or (limited) experience with the major LLMs (ChatGPT, Gemini) out in the wild today. Right now they confidently advise ingesting poison because it's grammatically sound and they found it on some BS Facebook post.

If ML engineers can design an internal concept of what constitutes valid information (a hard problem for humans, let alone machines) maybe there's hope.

[–] Deconceptualist@lemm.ee 2 points 5 months ago

Yeah I'm sure folks are working on it, but I'm not knowledgeable or qualified on the details.

[–] Deconceptualist@lemm.ee 46 points 5 months ago (16 children)

As others are saying, it's 100% not possible, because LLMs are (as Google optimistically describes them) "creative writing aids", or more accurately, predictive word engines. They run on mathematical probability models. They have zero concept of what the words actually mean, what humans are, or even what they themselves are. There's no "intelligence" present except for filters that have been hand-coded in (which of course is human intelligence, not AI).

"Hallucinations" is a total misnomer, because the text generation isn't tied to reality in the first place; it's just mathematically "what next word is most likely".

https://arstechnica.com/science/2023/07/a-jargon-free-explanation-of-how-ai-large-language-models-work/
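The "what next word is most likely" idea can be sketched as weighted sampling from a probability table. The prompt and the probabilities below are invented for illustration (this is nowhere near how a real transformer computes them), but the takeaway holds: nothing in the sampling step checks truth, only how often word patterns co-occurred in training text, so a fluent wrong answer comes out with real frequency.

```python
import random

# Hypothetical next-word distribution for the prompt "The capital of Australia is"
# -- probabilities are made up for illustration, not taken from a real model.
next_word_probs = {
    "Canberra": 0.55,   # correct, most common in careful writing
    "Sydney": 0.40,     # wrong, but very common in casual text
    "Melbourne": 0.05,
}

def sample_next(probs, rng):
    """Pick the next word in proportion to its probability.

    Nothing here evaluates whether the word is *true* -- only how
    frequently it followed the prompt in the (hypothetical) training data.
    """
    words = list(probs)
    weights = [probs[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random(0)
samples = [sample_next(next_word_probs, rng) for _ in range(1000)]
# Roughly 40% of completions are the fluent-but-false "Sydney".
```

That 40% isn't a bug the model can notice; from the model's side, "Sydney" is just another statistically plausible continuation.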

[–] Deconceptualist@lemm.ee 3 points 6 months ago

But I've only ever been able to do it from the 12th onward...

[–] Deconceptualist@lemm.ee 4 points 6 months ago* (last edited 6 months ago) (1 children)

This will probably end up being nothing more than an inconvenience to Spencer, since he can repair it with just a few clicks if he has enough materials in his inventory.

I don't play 76 so whatever, I just think this is a bit funny.

[–] Deconceptualist@lemm.ee 5 points 6 months ago (4 children)

The article seems to say otherwise.

[–] Deconceptualist@lemm.ee 10 points 6 months ago* (last edited 6 months ago) (1 children)

Not yet, but it's not a chance I'd be willing to take. They have at least one neighbor who's supposedly been arrested for theft. He used to watch their dogs for them, but when they found out, they stopped talking to him and changed the locks.
