this post was submitted on 06 May 2024
346 points (95.1% liked)
Technology
Sounds like some sensationalized bullshit. They don't give a single number or meaningful statement, and they're paywalled.
I don't disagree that they should back up their claim, but it does intuitively make sense. AI systems - GPT-style LLMs in particular - are typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you can throw at them.
Pair this with a huge AI boom and corporate hype cycle, and it wouldn't surprise me if it was consuming an incredible amount of power. It's reminiscent of Bitcoin, from a resource perspective.
No, it makes no sense. India has over a billion people. There's no way that amount of computing power could just magically have poofed into existence over the past few years, nor the power plants necessary to run all of that.
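A rough sanity check backs this up. The numbers below are loose assumptions for illustration (India's consumption is only order-of-magnitude, and the per-GPU figure is a guess including cooling overhead), not sourced figures:

```python
# Back-of-envelope: how many accelerators running 24/7 would it take
# to match India's annual electricity consumption?
# Both constants are rough assumptions, not sourced numbers.

INDIA_TWH_PER_YEAR = 1_600   # assumed: India's yearly electricity use, order of magnitude
GPU_KW = 1.0                 # assumed: one accelerator plus cooling/power-delivery overhead
HOURS_PER_YEAR = 24 * 365

india_kwh = INDIA_TWH_PER_YEAR * 1e9        # TWh -> kWh
kwh_per_gpu_year = GPU_KW * HOURS_PER_YEAR  # one GPU running flat out for a year
gpus_needed = india_kwh / kwh_per_gpu_year

print(f"~{gpus_needed:,.0f} GPUs running nonstop")
```

Under those assumptions you land somewhere in the hundreds of millions of GPUs, which is far beyond any plausible deployed fleet - hence the skepticism.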
If only there had been another widespread, wasteful prior use of expensive and power hungry compute equipment that suddenly became less valuable/effective and could quickly be repurposed to run LLMs...
Pretty sure the big AI corps aren't depending on obsolete second-hand half-burned-out Ethereum mining rigs for their AI training.