In a first, Google has released data on how much energy an AI prompt uses
(www.technologyreview.com)
Cool, now how much power was consumed before even a single prompt was run, in training that model? How much power is consumed on an ongoing basis adding new data to those AI models, even without user prompts? Also, how much power did each query consume before AI was shoved down our throats, and how many prompts does an average user make per day?
I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. That's not a lot when you consider how many people use the model compared to how many fit on the flight.
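The "quick math" above can be sketched roughly like this. The GPU-hours figure is in the ballpark Meta reported for the Llama 2 family; the per-GPU power draw and the flight's fuel burn are rough assumptions for illustration, not official figures:

```python
# Back-of-envelope comparison (illustrative numbers, not official figures).

GPU_HOURS = 3.3e6     # assumed total GPU-hours for training (Llama 2 ballpark)
GPU_POWER_KW = 0.4    # assumed ~400 W draw per A100 GPU
training_kwh = GPU_HOURS * GPU_POWER_KW   # roughly 1.3 GWh

FUEL_TONNES = 60      # rough jet-fuel burn for one transatlantic flight
MJ_PER_KG = 43        # approximate energy density of jet fuel
flight_kwh = FUEL_TONNES * 1000 * MJ_PER_KG / 3.6   # MJ -> kWh

print(f"training: {training_kwh / 1e6:.2f} GWh")
print(f"flight:   {flight_kwh / 1e6:.2f} GWh")
print(f"ratio:    {training_kwh / flight_kwh:.1f}x")
```

With these assumptions the training run comes out within a small single-digit multiple of one flight, which is the order-of-magnitude point the comment is making.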
Whatever you're imagining the impact to be, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?
Because training has diminishing returns: the small improvement between (for example's sake) GPT-3 and GPT-4 would need exponentially more power to have the same effect on GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human accuracy could never be done, the latter concluding it would hold even with infinite power.
So in order to get as close as possible, in the future they will need as much power as possible. Academic papers outline it as the one true bottleneck.
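The diminishing-returns point can be illustrated with the power-law relationship reported in the scaling-laws literature, where loss falls roughly as compute^(-alpha). The exponent here is an assumed value for illustration only, but it shows the shape of the argument: each equal step down in loss multiplies the required compute:

```python
# Illustrative scaling law: loss = l0 * (c0 / c) ** ALPHA.
# ALPHA = 0.05 is an assumption in the rough range scaling-law
# papers report, chosen only to show the qualitative behavior.

ALPHA = 0.05

def compute_for_loss(loss, c0=1.0, l0=1.0):
    """Compute needed to reach `loss`, inverting loss = l0 * (c0/c)**ALPHA."""
    return c0 * (l0 / loss) ** (1 / ALPHA)

# Equal-sized drops in loss demand rapidly growing compute:
for loss in (0.9, 0.8, 0.7):
    print(f"loss {loss}: compute x{compute_for_loss(loss):.0f}")
```

Under these assumptions, going from a loss of 1.0 down to 0.9 costs about 8x the baseline compute, 0.9 to 0.8 costs another ~10x on top of that, and 0.8 to 0.7 another ~14x, which is the "exponentially more power for the same improvement" pattern the comment describes.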