shaman1093

joined 1 year ago
[–] shaman1093@lemmy.ml 2 points 5 months ago

^this - why is it so hard to implement sigh

[–] shaman1093@lemmy.ml 2 points 9 months ago (1 children)

The person that commented below kinda has a point. While I agree that there's nothing special about LLMs, an argument can be made that consciousness (or maybe more precisely, ego?) is itself an emergent mechanism that works to keep itself in predictable patterns to perpetuate survival.

Point being that being able to predict outcomes is a cornerstone of current intelligence (socially, emotionally, and scientifically speaking).

If you were to say that LLMs are unintelligent because they operate to provide the most likely, and therefore most predictable, outcome, then I'd agree completely.