this post was submitted on 23 Dec 2023
15 points (100.0% liked)

Technology

59534 readers
3195 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] huginn@feddit.it 10 points 11 months ago (3 children)

Friendly reminder that your predictive text, while very compelling, is not alive.

It's not a mind.

[–] JayDee@lemmy.ml 3 points 11 months ago

I don't think most people will care, so long as their NPC interaction ends up compelling. We've been reading stories about people who don't exist for centuries, and that's stopped no one from sympathizing with them - and now there's a chance you could have an open conversation with them.

Like, I think a lot of us assume we care about the authors who write the character dialogue, but I think most people actually choose not to know who is behind their favorite NPCs, to preserve some sense that the NPC's personality isn't manufactured.

Combine that with everyone becoming steadily more lonely over the years, and I think AI-generated NPC interactions are going to take escapism to another level.

[–] MxM111@kbin.social 1 points 11 months ago (2 children)

While it is not alive, whether it is a mind is not clear-cut. It could be called a kind of mind, just one different from a human's.

[–] match@pawb.social 1 points 11 months ago

What can't be a kind of mind to you?

[–] huginn@feddit.it 0 points 11 months ago (2 children)

Unless you want to call the predictive text on your keyboard a mind, you really can't call an LLM a mind. It is nothing more than a linear progression from that. Mathematically proven not to show any form of emergent behavior.

[–] kogasa@programming.dev 1 points 11 months ago (1 children)

No such thing has been "mathematically proven." The emergent behavior of ML models is their notable characteristic. The whole point is that their ability to do anything is emergent behavior.

[–] huginn@feddit.it 1 points 11 months ago* (last edited 11 months ago)

Here's a white paper explicitly proving:

  1. No emergent properties (illusory due to bad measures)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004
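The paper's core argument is easy to sketch: if per-token accuracy improves smoothly with scale, an all-or-nothing exact-match metric over a whole sequence will still show a sharp, "emergent"-looking jump. The toy curve and numbers below are purely illustrative, not data from the paper:

```python
# Toy sketch of the "emergence is a metric artifact" argument:
# per-token accuracy improves smoothly with model scale, but scoring a
# whole answer with all-or-nothing exact match (p ** L) produces an
# abrupt jump that looks like an emergent capability.

SEQ_LEN = 10  # tokens per answer; exact match needs all 10 correct

def per_token_accuracy(log_params: float) -> float:
    """Smooth, gradual improvement with scale (illustrative curve)."""
    return min(0.99, 0.5 + 0.05 * log_params)

def exact_match(log_params: float) -> float:
    """All-or-nothing sequence metric: rises abruptly despite smooth p."""
    return per_token_accuracy(log_params) ** SEQ_LEN

for log_n in range(1, 11):
    p = per_token_accuracy(log_n)
    em = exact_match(log_n)
    print(f"scale 1e{log_n}: per-token {p:.2f}  exact-match {em:.3f}")
```

Per-token accuracy climbs in even 0.05 steps, while exact match sits near zero for small models and then shoots up — the same underlying progress, two very different-looking curves.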

The field changes fast; I understand it is hard to keep up.

[–] MxM111@kbin.social 1 points 11 months ago (1 children)

I do not think that it is a “linear” progression; an ANN is by definition nonlinear. Nor do I think anything has been “mathematically proven”. If I am wrong, please provide a link.

[–] huginn@feddit.it 1 points 11 months ago

Sure thing: here's a white paper explicitly proving:

  1. No emergent properties (illusory due to bad measures)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004

[–] Poggervania@kbin.social 1 points 11 months ago

Cyberpunk 2077 sorta explores this a bit.

There’s a vending machine that has a personality and talks to people walking by it. The quest chain basically has you and the vending machine chatting a bit and even giving the vending machine some advice on a person he has a crush on. You eventually become friends with this vending machine.

Just when it seems increasingly apparent that it's an AI developing sentience, it turns out the vending machine simply has a really well-coded socializing program. He even admits as much when he's about to be deactivated.

So, to reiterate what you said: predictive text and LLMs are neither alive nor minds.