this post was submitted on 21 Aug 2025
283 points (97.3% liked)

Technology

[–] Armok_the_bunny@lemmy.world 141 points 1 day ago (1 children)

Cool, now how much power was consumed before even a single prompt was run in training that model, and how much power is consumed on an ongoing basis adding new data to those AI models even without user prompts? Also, how much power was consumed per query before AI was shoved down our throats, and how many prompts does an average user make per day?

[–] Grimy@lemmy.world 31 points 1 day ago* (last edited 1 day ago) (4 children)

I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. Not a lot when you take into account the number of people who use it, compared to the flight.

Whatever you're imagining the impact to be, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
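
The comparison can be sketched in a few lines. The assumptions here are Meta's published ~1.7M A100 GPU-hours for training Llama 2 70B at a ~400 W per-GPU draw (real totals include cooling overhead), plus the 777 fuel figures a comment further down uses (45,000 kg of jet fuel at 47 MJ/kg):

```python
# Rough sketch: Llama 2 70B training energy vs. one transatlantic flight.
# Assumptions: ~1.7M GPU-hours (Meta's reported figure) at ~400 W per GPU;
# flight energy from ~45,000 kg of jet fuel at 47 MJ/kg.
GPU_HOURS = 1.7e6
GPU_WATTS = 400
training_mwh = GPU_HOURS * GPU_WATTS / 1e6   # watt-hours -> MWh

FUEL_KG = 45_000
MJ_PER_KG = 47
flight_mwh = FUEL_KG * MJ_PER_KG / 3600      # MJ -> MWh (3600 MJ per MWh)

print(f"training ≈ {training_mwh:.0f} MWh, flight ≈ {flight_mwh:.0f} MWh")
# → training ≈ 680 MWh, flight ≈ 588 MWh
```

Under these assumptions the two numbers really are the same order of magnitude, which is the comment's point.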

[–] fmstrat@lemmy.nowsci.com 25 points 19 hours ago

I'd like to understand what this math was before accepting this as fact.

[–] Damage@feddit.it 59 points 1 day ago (5 children)

If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?

[–] null@lemmy.nullspace.lol 15 points 1 day ago* (last edited 1 day ago) (2 children)

Because demand for data centers is rising, with AI as just one of many reasons.

But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.

Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?

[–] Dojan@pawb.social 8 points 10 hours ago

Sure we do. Do we want the big tech corporations to hold the reins of that though?

[–] anomnom@sh.itjust.works 9 points 16 hours ago (1 children)

Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?

[–] TomArrr@lemmy.world 5 points 10 hours ago

Yes, yes it did. And as far as I can tell, it's still belching it out, just so magats can keep getting owned by it. What a world

https://tennesseelookout.com/2025/07/07/a-billionaire-an-ai-supercomputer-toxic-emissions-and-a-memphis-community-that-did-nothing-wrong/

[–] Imacat@lemmy.dbzer0.com 9 points 1 day ago

To be fair, nuclear power is cool as fuck and would reduce the carbon footprint of all sorts of bullshit.

[–] finitebanjo@piefed.world 7 points 1 day ago* (last edited 1 day ago)

Because training has diminishing returns, meaning the small improvements between (for example's sake) GPT-3 and GPT-4 will require exponentially more power to reproduce in GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human accuracy could never be done, the latter concluding it was impossible even with infinite power.

So in order to get as close as possible, in the future they will need to get as much power as possible. Academic papers outline it as the one true bottleneck.

[–] Armok_the_bunny@lemmy.world 7 points 1 day ago (1 children)

Volume of requests, plus power consumption unrelated to the requests themselves, at least I have to assume. It certainly doesn't help that Google has forced me to make a request to their AI every time I run a standard search.

[–] Rentlar@lemmy.ca 19 points 1 day ago (1 children)

Seriously. I'd be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of digital products simply to justify its own existence.

The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.

[–] zlatko@programming.dev 5 points 16 hours ago

I've also never seen a calculation that took my VPS costs into account. The fckers scrape half the internet, warming up every server in the world that's connected to it. How much energy is that?

[–] douglasg14b@lemmy.world -1 points 17 hours ago* (last edited 17 hours ago) (1 children)

That's not small....

Hundreds of gigawatts is how much energy that is. Fuel is pretty damn energy dense.

A Boeing 777 might burn 45,000 kg of fuel, at a density of 47 MJ/kg. Which comes out to... 600 Megawatts.

Or about 60 houses' worth of energy usage for a year in the U.S.


It's an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any frame of reference for how much energy that actually is.

[–] xthexder@l.sw0.com 3 points 15 hours ago

That's not ~600 Megawatts, it's 587 Megawatt-hours.

Or in other terms that are maybe easier to understand: 5875 fully charged 100kWh Tesla batteries.

[–] taiyang@lemmy.world 8 points 1 day ago

I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn't sustainable... almonds in CA alone are a bigger problem than AI, for instance.

Not that I'm pro-AI cause it's a huge headache from so many other perspectives, but the environmental argument isn't enough. Corpo greed is probably the biggest argument against it, imo.

[–] douglasg14b@lemmy.world -4 points 17 hours ago* (last edited 17 hours ago) (1 children)

A "flight to Europe's worth of energy" is a pretty asinine way to measure this. Is it not?

It's also not that small a number, being ~600 Megawatts of energy.

However, training cost is considerably less than prompting cost, which makes your argument incredibly biased.

Similarly, the numbers released by Google seem artificially low; perhaps their TPUs are massively more efficient, given that they are ASICs. But they did not seem to disclose which model they used for this measurement. It could be their smallest, least capable, most energy-efficient model, which would be disingenuous.

[–] xthexder@l.sw0.com 9 points 15 hours ago

A Megawatt is a unit of power, not energy. It means nothing without a duration, like Megawatt-hours.

[–] sbv@sh.itjust.works 50 points 1 day ago (5 children)

In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
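The quoted equivalence is easy to sanity-check. A sketch assuming a typical ~900 W microwave (the article doesn't state the wattage it used, so that figure is an assumption):

```python
# Sanity check: 0.24 Wh per median prompt vs. seconds of microwave time.
prompt_wh = 0.24            # median Gemini text prompt, per the quote
microwave_watts = 900       # assumed typical rating; not from the article
seconds = prompt_wh * 3600 / microwave_watts   # Wh -> watt-seconds, / W
print(f"≈ {seconds:.2f} s of microwave time")  # → ≈ 0.96 s
```

So "about one second" holds for any microwave in the 800-1000 W range.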

[–] dream_weasel@sh.itjust.works 1 points 9 hours ago

Thank you! I skimmed for that and gave up.

[–] unmagical@lemmy.ml 36 points 1 day ago (1 children)

There are zero downsides when mentally associating an energy hog with "1 second of use time of the device that is routinely used for minutes at a time."

https://xkcd.com/1035/

[–] ArbitraryValue@sh.itjust.works 7 points 1 day ago* (last edited 1 day ago) (3 children)

With regard to sugar: when I started counting calories I discovered that the actual amounts of calories in certain foods were not what I intuitively assumed. Some foods turned out to be much less unhealthy than I thought. For example, I can eat almost three pints of ice cream a day and not gain weight (as long as I don't eat anything else). So sometimes instead of eating a normal dinner, I want to eat a whole pint of ice cream and I can do so guilt-free.

Likewise, I use both AI and a microwave. My energy use from AI in a day is apparently less than the energy I use to reheat a cup of tea, so the conclusion that I can use AI as much as I want without significantly affecting my environmental impact is the correct one.

[–] victorz@lemmy.world 7 points 1 day ago* (last edited 1 day ago) (1 children)

You should probably not eat things because of how many calories they have or don't have, but because of how many of their nutrients you need, and how much they lack other, dangerous shit. Also eat slowly, until you're full and no more. Also move a lot.

We shouldn't need calculators for this healthy lifestyle.

The reason for needing to know which foods are healthy is because... well, we forgot.

[–] ArbitraryValue@sh.itjust.works 7 points 1 day ago* (last edited 1 day ago) (1 children)

I'm not saying that ice cream is healthier than a normal dinner, just that if I really crave something sweet then the cost to my health of eating it periodically is actually quite low, whereas the cost of some other desserts (baked sweets are often the worst offenders) is relatively high. That means that a lot can be gained simply by replacing one dessert with a different, equally tasty dessert. Hence my ice cream advocacy.

[–] unmagical@lemmy.ml 4 points 1 day ago (2 children)

On a "respond to an individual query" level, yeah, it's not that much. But before the response, the data center had to be constructed, the entire web scraped, the models trained, and the servers kept running continually regardless of load. There are also way too many "hidden" queries across the web in general, from companies trying to summarize every email or product.

All of that adds to the energy costs. This equivocation is meant to make people feel less bad about the energy impact of using AI, when so much of the cost is in building AI.

Furthermore, that's the median value, the one that falls right in the middle of the distribution of queries. There's a limit to how much less energy a query to the left of the median can use; there's a significantly higher runway to the right of the median for excess energy use. This also only accounts for text queries; image and video generation are going to use a lot more.

Your points are valid, but I think that building AI has benefits beyond simply enabling people to use that AI. It advances the state of the art and makes even more powerful AI possible. Still, it would be good to know about the amortized cost per query of building the AI in addition to the cost of running it.

[–] dohpaz42@lemmy.world 3 points 1 day ago (1 children)

Individually you’re spot on. Your AI use doesn’t matter. But, and this is where companies tend to leave off, when you take into account how many millions (or billions) of times something is done in a day (like AI prompts), that’s when it genuinely matters.

To me, this is akin to companies trying to pass the blame to consumers when it’s the companies themselves who are the biggest climate offenders.

[–] ArbitraryValue@sh.itjust.works 5 points 1 day ago (1 children)

I don't see why this argument works better against AI than it does against microwaves. Those are used hundreds of millions of times a day too.

[–] Maaji@lemmynsfw.com 13 points 1 day ago (7 children)

This doesn't really track with companies commissioning power plants to support the power demands of AI training.

[–] FaceDeer@fedia.io 5 points 1 day ago

They want to handle lots of prompts.

[–] DarkCloud@lemmy.world 12 points 1 day ago* (last edited 1 day ago) (1 children)

The article also mentions that each query evaporates 0.26 milliliters of water... or "about five drops".

[–] null@lemmy.nullspace.lol 4 points 1 day ago (1 children)

I wonder how many people clutching their pearls over this also eat meat...

[–] Sxan@piefed.zip 3 points 1 day ago

I'll bet you're a stinking water drinker yourself. Probably a liter or two a day. And probably luxuriating in clean water when you could be using your body to recycle toilet water.

[–] ganksy@lemmy.world 5 points 1 day ago

In addition:

This report was also strictly limited to text prompts, so it doesn’t represent what’s needed to generate an image or a video.

[–] frezik@lemmy.blahaj.zone 19 points 1 day ago (1 children)

The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010.

None of those advanced nuclear projects are yet actually delivering power, AFAIK. They're mostly in planning stages.

The above isn't all to run AI, of course; nobody was thinking about data centers just for AI training in 2010. But to be clear, there are 94 nuclear power plants in the US, and a rule of thumb is that they produce about 1 GW each. So Google has contracted the equivalent of roughly one quarter of the entire US nuclear fleet, but done it with solar/wind/geothermal that could be used to drop our fossil fuel dependence elsewhere.

How much of that is used to run AI isn't clear here, but we know it has to be a lot.
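The "one quarter" figure follows directly from the comment's rule of thumb; a quick sketch (note the quoted 22 GW is cumulative contracted capacity since 2010, not simultaneous draw):

```python
# Google's contracted capacity vs. the US nuclear fleet, rule-of-thumb.
google_gw = 22          # contracted since 2010, per the quoted report
us_reactors = 94        # operating US nuclear plants (comment's figure)
gw_per_plant = 1        # rule of thumb from the comment above
fraction = google_gw / (us_reactors * gw_per_plant)
print(f"{fraction:.0%} of the fleet's rule-of-thumb capacity")  # → 23%
```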

[–] wewbull@feddit.uk 1 points 8 hours ago

None of those advanced nuclear projects are yet actually delivering power, AFAIK.

...and they won't be for at least 5-10 years. In the meantime they'll just use public infrastructure, and then when their generation plans fall through, they'll keep doing that.

[–] rowrowrowyourboat@sh.itjust.works 36 points 1 day ago (1 children)

This feels like PR bullshit to make people feel like AI isn't all that bad, assuming what they're releasing is even true. It's not like cigarette, oil, or sugar companies ever lied or put out false studies and misleading data.

However, there are still details that the company isn’t sharing in this report. One major question mark is the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand.

Why wouldn't they release this? Even if each query uses minimal energy, countless queries a day would still add up to a huge use of energy.

Which is probably what's happening, and why they're not releasing that number.

[–] the_q@lemmy.zip 15 points 1 day ago

That's because it is. This is to help fence-sitters feel better about using a product that, in fact, consumes insane amounts of resources.

[–] NotMyOldRedditName@lemmy.world 16 points 1 day ago (1 children)

There were people estimating 40w per query in earlier threads on Lemmy, which was ridiculous.

This seems more realistic.

[–] Ilovethebomb@sh.itjust.works 1 points 22 hours ago

I think that figure came from the article, and was based on some very flawed methodology.

[–] L0rdMathias@sh.itjust.works 16 points 1 day ago (1 children)

median prompt size

Someone didn't pass statistics, but did pass their marketing data presentation classes.

Wake me up when they release useful data.

[–] jim3692@discuss.online 12 points 1 day ago (1 children)

It is indeed very suspicious that they talk about "median" and not "average".

For those who don't understand what the difference is, think of the following numbers:

1, 2, 3, 34, 40

The median is 3, because it's in the middle.

The average is 16 (1+2+3+34+40=80, 80/5=16).
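Python's `statistics` module reproduces the example directly, which makes the skew easy to see:

```python
import statistics

# The example numbers from the comment above: a heavy right tail.
data = [1, 2, 3, 34, 40]
print(statistics.median(data))  # → 3   (middle value; ignores the outliers)
print(statistics.mean(data))    # → 16  (pulled up by the two large values)
```

For a right-skewed distribution like per-prompt energy, the median can sit far below the mean, which is exactly why reporting only the median flatters the numbers.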

[–] HubertManne@piefed.social 4 points 1 day ago

The big thing to me is that I want them to compare the same thing with web searches. If they want to use the median, fine, but compare the median AI query to the median Google search.

[–] Rhaedas@fedia.io 12 points 1 day ago

Now do the training centers, since it's obvious they are never going to settle on a final model as they pursue the Grail of AGI. I could do the exact same comparison with my local computer and claim that running a prompt only uses X watts, because the GPU heats up for a few seconds and is done. But if I were to do some fine-tuning or other training, that fan would stay on for hours. A lot different.

[–] StrangeMed@lemmy.world 13 points 1 day ago

Nice share! Mistral also shared data about one of its largest models (not the one that answers in LeChat, since that one is Medium, a smaller model, which I guess has smaller energy requirements).

https://mistral.ai/news/our-contribution-to-a-global-environmental-standard-for-ai

[–] tekato@lemmy.world 7 points 1 day ago

Let’s see OpenAI’s numbers

Microwaves are very energy-hungry. This isn't very reassuring at all.
