TheChurn

joined 1 year ago
[–] TheChurn@kbin.social 5 points 6 months ago (5 children)

There are really only three search providers: Google, Bing, and Yandex.

All the others pay one of these three to use their indexes, since building and maintaining an index is incredibly expensive.

[–] TheChurn@kbin.social 3 points 6 months ago (1 children)

Most OLEDs today ship with logo detection and will dampen the brightness on static elements automatically.

While it isn't a silver bullet, it does help reduce burn-in, since burn-in is strongly linked to heat, and therefore to pixel brightness. New blue PHOLEDs are expected to cut burn-in risk further. Remember that LCDs also used to have burn-in issues, as did CRTs.

[–] TheChurn@kbin.social 2 points 7 months ago (1 children)

I've been using Nvidia under Linux for the last 3 years and it has been a massive PITA.

Getting CUDA to work consistently is a feat, and one that must be repeated for most driver updates.

Wayland support is still shoddy.

Hardware acceleration on the web (at least with Firefox) is very inconsistent.

It is very much a second-class experience compared to Windows, and it shouldn't be.

[–] TheChurn@kbin.social 97 points 7 months ago* (last edited 7 months ago) (47 children)

Linux and Nvidia really need to sort out their shit so I can fully dump Windows.

Luckily the AI hype is good for something in this regard, since running GPUs on Linux servers is suddenly much more important.

[–] TheChurn@kbin.social 3 points 7 months ago

No, that's not a real problem either. Model search techniques are very mature; the first automated tools for this were released in the '90s, and they've only gotten better.

AI can't 'train itself', there is no training required for an optimization problem. A system that queries the value of the objective function - "how good is this solution?" - then tweaks the parameters - here, the traffic light timings - according to the optimization algorithm and queries the objective function again isn't training itself, it isn't learning, it is centuries-old mathematics.
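That query-and-tweak loop fits in a few lines. Everything below is illustrative: the objective function is a made-up stand-in for a real traffic simulation, and hill climbing is just one of many classical optimizers that work this way.

```python
import random

# Toy objective: pretend total delay is minimized when each light's
# green phase is 30 s. Purely invented; a real model would simulate traffic.
def total_delay(timings):
    return sum((t - 30.0) ** 2 for t in timings)

def hill_climb(timings, steps=2000, step_size=1.0, seed=0):
    rng = random.Random(seed)
    best = list(timings)
    best_score = total_delay(best)
    for _ in range(steps):
        # Tweak the parameters (the timings)...
        candidate = [t + rng.uniform(-step_size, step_size) for t in best]
        # ...query the objective again: "how good is this solution?"
        score = total_delay(candidate)
        if score < best_score:  # keep the tweak only if it helps
            best, best_score = candidate, score
    return best, best_score

timings, delay = hill_climb([10.0, 55.0, 42.0])
```

No gradients, no dataset, no training: just repeated evaluation and acceptance, which is why calling this "AI learning" is a stretch.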

There's a lot of intentional and unintentional misinformation around what "AI" is, what it can do, and what it can do that is actually novel. Beyond Generative AI - the new craze - most of what is packaged as AI is mature algorithms applied to an old problem in a stagnant field and then repackaged as a corporate press release.

Take drug discovery. No, "AI" didn't just make 50 new antibiotics; they just hired a chemist who graduated in the last decade, who understands commercial retrosynthetic search tools, and who asked the biopharma guy what functional groups they thought would work.

[–] TheChurn@kbin.social 17 points 7 months ago (2 children)

"AI" isn't needed to solve optimization problems, that's what we have optimization algorithms for.

Define an objective and parameters and give the problem to any one of the dozens of general solvers and you'll get approximate answers. Large cities already use models like these for traffic flow, there's a whole field of literature on it.

The one closest to what you mentioned is a genetic algorithm, again a decades-old technique that has very little in common with Generative "AI".
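A toy genetic algorithm on hypothetical signal timings shows how old-fashioned the machinery is: selection, crossover, and mutation, with no "training" anywhere. The fitness function is invented for illustration (it rewards timings near 30 s).

```python
import random

rng = random.Random(42)

# Hypothetical fitness: higher is better, peaks when every phase is 30 s.
def fitness(timings):
    return -sum((t - 30.0) ** 2 for t in timings)

def make_individual():
    return [rng.uniform(5.0, 90.0) for _ in range(4)]  # 4 signal phases

def crossover(a, b):
    cut = rng.randrange(1, len(a))   # one-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    return [t + rng.gauss(0, 2.0) if rng.random() < rate else t for t in ind]

def evolve(generations=200, pop_size=30):
    pop = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]   # selection: keep the fittest half
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

General-purpose solvers (e.g. SciPy's `differential_evolution`) package the same idea with far better defaults; the point is that none of this resembles training a generative model.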

[–] TheChurn@kbin.social 18 points 7 months ago

It looks much better than Elden Ring in that all the models are much higher quality. Elden Ring was designed around relatively modest assets, and does wonders with what it has, but there is no comparison, DD2 wins hands-down.

As for art direction, that is subjective. Plenty of reasons to prefer looking at ER.

The Witcher 3 is almost a decade old at this point.

[–] TheChurn@kbin.social 6 points 8 months ago

Humans are intelligent animals, but humans are not only intelligent animals. We do not make decisions and choose which beliefs to hold based solely on sober analysis of facts.

That doesn't change the general point that a model given the vast corpus of human knowledge will prefer the most oft-repeated bits to the true bits, whereas we humans have muddled our way through to some modicum of understanding of the world around us by not doing that.

[–] TheChurn@kbin.social 9 points 8 months ago (2 children)

But the most current information is not necessarily the most correct information.

I could publish 100 papers on Arxiv claiming the Earth is, in fact, a cube - but that doesn't make it true even though it is more recent than the sphere claims.

Some mechanism must decide what is true and send that information to train the model - that act of deciding is where the actual intelligence in this process lives. Today that decision is made by humans, they curate the datasets used to train the model.

There's no intelligence in these current models.

[–] TheChurn@kbin.social 3 points 9 months ago (1 children)

Copilot is GPT under the hood, it just starts with a search step that finds (hopefully) relevant content and then passes that to GPT for summarization.
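That search-then-summarize flow can be sketched as below. `web_search` and the LLM call are hypothetical stubs for illustration, not Copilot's actual internals.

```python
# Sketch of the retrieve-then-summarize pattern: search first,
# then hand the retrieved snippets to the LLM for summarization.

def web_search(query):
    # Copilot would hit a real search index here; stubbed for illustration.
    return ["Snippet about " + query, "Another result for " + query]

def build_prompt(query, snippets):
    context = "\n".join(f"- {s}" for s in snippets)
    return (
        "Using only the sources below, answer the question.\n"
        f"Sources:\n{context}\n"
        f"Question: {query}\n"
    )

def answer(query, llm_complete):
    snippets = web_search(query)        # step 1: retrieval
    prompt = build_prompt(query, snippets)
    return llm_complete(prompt)         # step 2: GPT summarizes the snippets

# Usage with a dummy "LLM" that just echoes the start of its prompt:
print(answer("What is RAG?", lambda p: p[:40]))
```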

[–] TheChurn@kbin.social 15 points 9 months ago

The OG Crysis wanted hardware that still doesn't exist. They built the game and engine under the assumption that clock speeds would keep increasing, and instead we moved to high core counts.

Even today, at 4K and max settings, the original (2007) release can drop below 100 fps on the best possible hardware.

[–] TheChurn@kbin.social 49 points 9 months ago (22 children)

Every billion parameters needs about 2 GB of VRAM when using bfloat16 representation: 16 bits per parameter, 8 bits per byte, so 2 bytes per parameter.

1 billion parameters ~ 2 billion bytes ~ 2 GB.

From the name, this model has 72 billion parameters, so ~144 GB of VRAM.
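The back-of-the-envelope math above, as a function (weights only; activations and KV cache add more on top):

```python
def vram_gb(params_billion, bits_per_param=16):
    # 1e9 parameters at (bits / 8) bytes each, expressed in GB (1e9 bytes),
    # so the 1e9 factors cancel out.
    return params_billion * bits_per_param / 8

print(vram_gb(72))                     # 144.0 GB for a 72B model in bfloat16
print(vram_gb(72, bits_per_param=4))   # 36.0 GB with 4-bit quantization
```

Dropping to 8-bit or 4-bit quantization shrinks the footprint proportionally, which is why quantized variants are popular for local inference.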
