huginn

joined 1 year ago
[–] huginn@feddit.it 2 points 10 months ago

Model collapse is going to be a big deal, and it doesn't take much poisoned content to trigger it.

[–] huginn@feddit.it 1 points 10 months ago

Have found, not will find.

There are so many spam sites filled with LLM-generated content.

[–] huginn@feddit.it 29 points 10 months ago

I think 19 reuses of a single rocket is more impressive here TBH

[–] huginn@feddit.it 57 points 10 months ago (4 children)

Not even close to the second time. It's happening constantly but keeps getting missed.

Too many people think LLMs are accurate.

[–] huginn@feddit.it 2 points 10 months ago

11 years? Never mind, use the laptop for sure haha

[–] huginn@feddit.it 5 points 10 months ago (4 children)

You'll probably save money in the long run using a Pi.

[–] huginn@feddit.it 1 points 11 months ago* (last edited 11 months ago)

Here's a paper explicitly proving:

  1. No emergent properties (the apparent jumps are illusory, an artifact of the metrics used; rough sketch below)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004

The field changes fast; I understand it's hard to keep up.
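To make the metrics point concrete, here's a minimal sketch (my own illustration, not code from the paper): a per-token accuracy that improves smoothly with scale looks like a sudden "emergent" jump once you score it with an all-or-nothing exact-match metric.

```python
# Hypothetical per-token accuracies for a series of increasingly large models.
per_token_accuracy = [0.80, 0.85, 0.90, 0.95, 0.98, 0.99]

SEQ_LEN = 30  # exact match requires all 30 tokens of the answer to be correct

for p in per_token_accuracy:
    exact_match = p ** SEQ_LEN  # probability that every single token is right
    print(f"per-token {p:.2f} -> exact-match {exact_match:.3f}")

# per-token 0.80 -> exact-match 0.001
# per-token 0.85 -> exact-match 0.008
# per-token 0.90 -> exact-match 0.042
# per-token 0.95 -> exact-match 0.215
# per-token 0.98 -> exact-match 0.545
# per-token 0.99 -> exact-match 0.740
```

The underlying skill improves smoothly and predictably; the apparent "emergence" lives entirely in the choice of metric.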

[–] huginn@feddit.it 1 points 11 months ago

Sure thing: here's a paper explicitly proving:

  1. No emergent properties (illusory; an artifact of the metrics used)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004

[–] huginn@feddit.it 0 points 11 months ago (4 children)

Unless you want to call the predictive text on your keyboard a mind, you really can't call an LLM a mind. It's nothing more than a linear progression from that, mathematically proven not to show any form of emergent behavior.

[–] huginn@feddit.it 10 points 11 months ago (9 children)

Friendly reminder that your predictive text, while very compelling, is not alive.

It's not a mind.

[–] huginn@feddit.it 7 points 11 months ago (1 children)

In my experience, DDG and Google are pretty close to the same quality, i.e. neither is good.

[–] huginn@feddit.it 4 points 11 months ago

Doesn't matter if it's a prerequisite.
