this post was submitted on 14 Jul 2024
681 points (95.6% liked)
Technology
I don't see what's surprising here. They provide services for users globally. Not that it's justified; it's just kind of weird that people apparently think global-scale computing is light on electricity.
It's not surprising per se, but it's something people should be more aware of. And a lot of this consumption isn't going toward global services (like Google Search or the Workspace suite) but toward the whole AI hype.
I didn't find numbers for Google or Microsoft specifically, but training GPT-4 consumed an estimated 50 GWh on its own. Daily consumption for queries is estimated at 1-5 GWh.
Even if that extrapolation is an overestimate, and calculating the actual consumption is pretty much impossible, it's still probably a lot of energy wasted on a product that people do not want (e.g. Google's AI "search", or Bing and Copilot being stuffed into everything).
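For a rough sense of scale, here's a back-of-envelope sketch in Python using only the figures cited above; the 50 GWh training cost and the 1-5 GWh/day query range are the rough estimates from this comment, not measured values.

```python
# Back-of-envelope scaling of the figures cited above.
# Both inputs are the comment's rough estimates, not measured data.
TRAINING_GWH = 50          # cited one-off cost of training GPT-4
DAILY_QUERY_GWH = (1, 5)   # cited range for daily query/inference load

for daily in DAILY_QUERY_GWH:
    yearly = daily * 365
    print(f"{daily} GWh/day -> {yearly} GWh/year "
          f"(~{yearly / TRAINING_GWH:.0f}x the cited training cost)")
```

Even at the low end of that range, a year of queries would come to several times the cited one-off training cost.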
They only do that because they project it to be profitable, i.e. they project demand for it.
It's also ridiculous to claim that people don't want it just because you don't.