HeavyDogFeet

joined 1 year ago
[–] HeavyDogFeet@lemmy.world 3 points 8 months ago

Not really. A 4K movie means nothing to 99% of people. Is it 4GB? 40? 400? How many can my phone hold? Or my computer?

This only makes things more understandable if you use a point of reference that everyone you're talking to is familiar with. The fact that they had to then explain how big a 4K movie is in the article clearly shows that even they know that this doesn't help people. It's just a big flashy number.

Just for context, I'm a writer, so I understand the point of using these abstract measures to give a frame of reference. But in this case, just giving the capacity in GB/TB would have been easier to understand. It just wouldn't have made as sensational a headline.
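To put numbers on the point: a quick sketch of the arithmetic readers would have to do anyway. The movie sizes and the 30 TB drive here are purely illustrative assumptions, not figures from the article.

```python
# Illustrative only -- these sizes are assumptions, not numbers from the article.
MOVIE_SIZES_GB = {"streaming-quality 4K": 15, "4K Blu-ray rip": 60}

def movies_that_fit(capacity_tb: float, movie_gb: float) -> int:
    """How many movies of a given size fit on a drive, using decimal units (1 TB = 1000 GB)."""
    return int(capacity_tb * 1000 / movie_gb)

# The same hypothetical 30 TB drive is "2000 movies" or "500 movies"
# depending on which 4K file size you pick -- which is the problem.
for label, size_gb in MOVIE_SIZES_GB.items():
    print(f"Hypothetical 30 TB drive holds ~{movies_that_fit(30, size_gb)} {label} files")
```

The same drive yields a number that differs by 4x depending on which "4K movie" you assume, which is exactly why the plain GB/TB figure is the clearer unit.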

[–] HeavyDogFeet@lemmy.world 157 points 9 months ago* (last edited 9 months ago) (24 children)

What a useless headline. God forbid they just give the actual capacity rather than some abstract, bullshit, flexible measure that means nothing to anyone.

[–] HeavyDogFeet@lemmy.world 3 points 9 months ago* (last edited 9 months ago)

But you’re still limited to the opinions of people who post on Lemmy, which, as someone who occasionally posts on Lemmy, is not a shining beacon of quality.

Even if I just went by what I get on the first page of a Google search, I’d expect I’d find what I need much faster using Google than I would using Lemmy based purely on the volume of info Google has access to. And that’s not even taking into account things like Google’s ability to search within other sites.

Unless Lemmy has gotten like 100 billion times better in the last week, this isn’t even a fair comparison.

Edit: lol, just realised you’re the same guy from the Nvidia thread.

[–] HeavyDogFeet@lemmy.world 3 points 9 months ago* (last edited 9 months ago)

Nvidia makes some of the best hardware for training AI models. Increased investment in AI will inevitably increase demand for Nvidia hardware. It may boost other hardware makers too, but Nvidia is getting the biggest boost by far.

Maybe I’m being dumb or missing something but this feels incredibly obvious to me.

[–] HeavyDogFeet@lemmy.world 3 points 9 months ago (3 children)

LLM tools can already write basic code and will likely improve a lot, but there are more reasons to learn to code than to actually do coding yourself. Even just to be able to understand and verify the shit the AI tools spit out before pushing it live.

Nvidia knows that the more people who use AI tools, the more their hardware sells. They benefit directly from people not being able to code themselves or relying more on AI tools.

They want their magic line to keep going up. That’s all.

[–] HeavyDogFeet@lemmy.world 4 points 9 months ago (6 children)

It makes no sense. AI tools will obviously have an impact on the profession's development, but suggesting that no one should learn to code is like saying no one should learn to drive because one day cars will drive themselves. It’s utter self-serving nonsense.

[–] HeavyDogFeet@lemmy.world 15 points 9 months ago

This is objectively stupid. There are tonnes of things you learn in maths that are useful for everyday life even if you don’t do the actual calculations by hand.

[–] HeavyDogFeet@lemmy.world 3 points 9 months ago

I deleted all my posts before closing my accounts back when they were breaking third-party apps, although I'm sure they probably kept a private log of all posts specifically for this purpose.

To be honest, I expect AI companies are scraping Lemmy and other places for training data anyway, but I'd rather Reddit specifically not make any money off my posts.

[–] HeavyDogFeet@lemmy.world 27 points 9 months ago (1 children)

Realistically, a couple of 10TB drives would have me covered for like a decade at least. If these massive drives bring down the price of much smaller ones, I'm a happy boy.

[–] HeavyDogFeet@lemmy.world 1 points 9 months ago

I don't mean to be dismissive of your entire train of thought (I can't follow a lot of it, probably because I'm not a dev and not familiar with a lot of the concepts you're talking about). But all the things you've described that I can understand would require these tools to be a fuckload better, on an order we haven't even begun to get close to yet, in order to not be super predictable.

It's all wonderful in theory, but we're not even close to what would be needed to even half-ass this stuff.

[–] HeavyDogFeet@lemmy.world 4 points 9 months ago (3 children)

That remains to be seen. We have yet to see one of these things actually get good at anything, so we don’t know how hard that last part is to do. I don’t think we can assume there will be continuous linear progress. Maybe it’ll take one year, maybe it’ll take 10, maybe it’ll just never reach that point.
