msage

joined 1 year ago
[–] msage@programming.dev 6 points 1 month ago

They are not 'faulty'; they have been fed the wrong training data.

This is the most important aspect of any AI - it's only as good as its training dataset. If you don't know the dataset, you know nothing about the AI.

That's why every claim of 'super-efficient AI' needs to be investigated more deeply. But that goes against the line-goes-up principle, so don't expect it to happen often.

[–] msage@programming.dev 14 points 1 month ago (2 children)

Wasn't it shown that the AI was getting amazing results because it noticed the cancer screens had a doctor's signature at the bottom? Or did they do another run with the signatures hidden?
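The failure mode described above - a model latching onto an incidental artifact instead of the real signal - can be sketched with a toy simulation. This is a hypothetical illustration, not the actual study: the "model" here is a trivial rule that classifies by a spurious marker (standing in for the signature), which looks near-perfect on the biased dataset and collapses to chance once the marker no longer correlates with the label.

```python
# Toy sketch of shortcut learning: a spurious marker masquerades as skill.
# All names and numbers here are made up for illustration.
import random

random.seed(0)

def make_dataset(n, marker_correlated=True):
    """Each example: (signal, marker, label). 'signal' is a noisy real
    feature; 'marker' mimics an artifact like a signature on the scan."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)                      # 1 = positive screen
        signal = label if random.random() < 0.8 else 1 - label  # 80% informative
        # In the biased set the marker perfectly tracks the label;
        # in the clean set it's random noise.
        marker = label if marker_correlated else random.randint(0, 1)
        data.append((signal, marker, label))
    return data

def predict(example):
    signal, marker, _ = example
    return marker  # the shortcut: classify by the marker alone

def accuracy(dataset):
    return sum(predict(ex) == ex[2] for ex in dataset) / len(dataset)

biased = make_dataset(10_000, marker_correlated=True)
clean = make_dataset(10_000, marker_correlated=False)
print(accuracy(biased))  # near-perfect, because the marker leaks the label
print(accuracy(clean))   # roughly coin-flip once the marker is uninformative
```

Evaluating only on data with the same leakage (the first print) makes the shortcut invisible; hiding or decorrelating the artifact (the second print) exposes it.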

[–] msage@programming.dev 8 points 1 month ago (10 children)

What is OpenAI doing with cancer screening?

[–] msage@programming.dev 4 points 2 months ago

Old-man-yelling-at-clouds energy :D

[–] msage@programming.dev 14 points 2 months ago (2 children)

I just wish the whole 'cloud' thing would die in a ditch specifically for people like that.

No, most use-cases don't need to be in a cloud.

99.9% of the time you're paying more for that setup than you would for people who understand servers.

And if you do need the cloud, then hooray for you - but it shouldn't need to be subsidized by thousands of small customers who jumped on the wrong train.

[–] msage@programming.dev 2 points 2 months ago (1 children)

What about the pasta straws? I've seen it mentioned once, but nothing since.

[–] msage@programming.dev 17 points 2 months ago (1 children)

Just use Free and Open Source Software!

It can always fuck up with updates, but usually you just get more free stuff and it's awesome.

[–] msage@programming.dev 1 points 2 months ago (1 children)

LLMs just repeat their training sets - so every mistake is repeated forever.

Every bias is locked in and can't be fixed.

So you just deny people and expect them to appeal everything... sounds like you're offloading the costs onto the victims.

Shit like that is what makes it demonic.

[–] msage@programming.dev 1 points 2 months ago (3 children)

It will be used to take control over people's lives.

However simple the form it takes - denying jobs/insurance/care/etc. - it will be hailed as using 'reason', while it just repeats patterns from its training set.

It does not 'reason', because it can't. Selling it as such is very dangerous, since it will be used against people - and it's dishonest toward investors too, who will jump on it even though it isn't true and never will be for this kind of model.

https://softwarecrisis.dev/letters/llmentalist/

[–] msage@programming.dev 2 points 2 months ago

Not being able to do some things is the biggest blocker.

Sideloading, for instance. Photos disappearing inside the Photos app and not showing up in Files is also weird. It just felt like I was treated as a moron who couldn't handle my own files.
