OK... warning: wall of text incoming.
TL;DR: We end up comparing LLM executions with Google searches (a single prompt to ChatGPT uses about 10x as much electricity as a single Google search). How many Google searches and link clicks does it take to get the same information as a single ChatGPT request? I also touch on different use cases beyond just LLMs.
The whole argument comes down to two questions: Is the increase in productivity worth the extra electricity? And is there a better tool for the job than an AI model?
For the first article:
The only somewhat useful number in here says that Microsoft's emissions were 30% higher than its 2020 goals... which doesn't break down how much of that extra energy comes from AI, despite how much the article wants to blame the training of AI models.
The second article was mostly worthless, again pointing at numbers for all data centers while putting 100% of the blame on AI throughout most of the article. But at the very end, it finally included something a bit more specific, along with an actual source:
Link to source: https://www.iea.org/reports/electricity-2024
A 170-page report by the International Energy Agency.
Much better.
Page 8:
Not a very useful number, since it lumps cryptocurrency in with all data centers and "AI".
Again, mixing AI numbers in with all data centers.
Page 35:
OK, I'm assuming this is where they got their 10x figure. But a 10x difference per request doesn't necessarily mean 10x more electricity overall, especially when comparing traditional energy use for a specific task against the energy needed to run an already-trained AI model.
Page 34:
Link to source of that number: https://www.sciencedirect.com/science/article/abs/pii/S2542435123003653?dgcid=author
It's behind a paywall, but if you're on a college campus or at certain libraries you might be able to access it for free.
Finally, we have some real numbers to work with. Let's break this down: a single Google search uses a little more than 1/10th the electricity of a single request to ChatGPT.
So here's the thing: how many Google searches do you have to run to get the right answer, and how many links do you need to click before you're satisfied? It depends on what you're looking for. If I'm doing research or solving a problem, I'll probably end up with 10-20 browser tabs open by the time I have all the information I need, and every one of those is a website I had to click on and load. At least when I'm finally done, I get the sweet satisfaction of closing all the tabs.
Compare that to using an LLM: I get a direct answer, do a little double-checking to verify it's legitimate (maybe 1-2 Google-equivalent searches), and I'm good to go. Not only have I spent less time on the problem overall, but in some cases I may have even used less electricity once everything is factored in.
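To put rough numbers on that, here's the napkin math as a quick Python sketch. The per-query energy figures are my own assumptions based on the ~10x ratio above, not measured values:

```python
# Napkin math: a search-heavy session vs. one LLM session.
# Per-query energy values are assumptions derived from the
# ~10x ratio above, not measured numbers.
SEARCH_WH = 0.3   # assumed Wh per Google search
LLM_WH = 3.0      # assumed Wh per LLM prompt (~10x a search)

def session_energy(prompts: int, verify_searches: int) -> float:
    """Total Wh for an LLM session plus follow-up verification searches."""
    return prompts * LLM_WH + verify_searches * SEARCH_WH

search_session = 15 * SEARCH_WH                             # 15 tabs -> 4.5 Wh
llm_session = session_energy(prompts=1, verify_searches=2)  # -> 3.6 Wh

print(f"search-only session: {search_session:.1f} Wh")
print(f"LLM session:         {llm_session:.1f} Wh")
# One prompt costs about as much as 10 searches, so the LLM comes
# out ahead whenever it replaces more than ~10 searches of digging.
```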
Let's try a different use case: images. I could spend hours working in Photoshop to create an image to use as my avatar on a website, or I could spend a few minutes generating a batch of images with Stable Diffusion and pick one I like. Not only have I saved time, I've also used less electricity.
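Same kind of napkin math for the image case, with assumed wattages and durations for a typical desktop workstation and a consumer GPU:

```python
# Avatar example. Wattages and durations are assumptions for a
# typical desktop workstation and a consumer GPU.
PC_WATTS = 200    # assumed desktop draw during a Photoshop session
GPU_WATTS = 350   # assumed GPU draw while generating images

photoshop_wh = PC_WATTS * 2.0     # 2 hours of manual editing -> 400 Wh
sd_wh = GPU_WATTS * (5 / 60)      # 5 minutes of generation   -> ~29 Wh
print(f"Photoshop: {photoshop_wh:.0f} Wh vs Stable Diffusion: {sd_wh:.0f} Wh")
```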
In another example, I could spend time (and electricity) replaying a video over and over while trying to translate what someone said from one language to another, or I could use Whisper to translate and transcribe it in a matter of seconds.
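For anyone curious, this is roughly what that looks like with the open-source openai-whisper package; the model size and file name here are just placeholders:

```python
# Minimal sketch using the open-source openai-whisper package
# (pip install openai-whisper). "base" and the file name are
# placeholders; use whatever model size and media file fit.
import whisper

model = whisper.load_model("base")
# task="translate" outputs English regardless of the source
# language; task="transcribe" keeps the original language.
result = model.transcribe("video.mp4", task="translate")
print(result["text"])
```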
On the other hand, there are absolutely use cases where an ML model is incredibly wasteful. Take, for example, a rain sensor on your car. You could set up an AI model with a camera and computer vision to detect when to turn on the windshield wipers. But why do that when you could use a little sensor that shoots a low-power laser at the window and activates the wipers whenever it detects a change in the energy that's normally reflected back? The dedicated sensor uses far less energy and is far more efficient for this job.
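A rough sense of the gap, using made-up but plausible power draws:

```python
# Illustrative power budget; both wattages are assumptions.
VISION_WATTS = 10.0    # camera + small board running a CV model
SENSOR_WATTS = 0.05    # dedicated optical rain sensor

HOURS_PER_YEAR = 24 * 365
print(f"CV approach:  {VISION_WATTS * HOURS_PER_YEAR / 1000:.1f} kWh/yr")
print(f"Laser sensor: {SENSOR_WATTS * HOURS_PER_YEAR / 1000:.2f} kWh/yr")
# Roughly a 200x difference for the same job.
```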
Of course, we still need to factor in the electricity required to train (and later fine-tune) a model. Small models take only seconds to minutes to train; others may need a month or more. But once training is complete, no further training electricity is required: the model can be packaged up and distributed over the internet like any other file. (Distribution uses electricity too, but at that point you might as well complain about people streaming 8K video to their homes for entertainment.)
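That one-time training cost gets amortized over every query the model ever serves. Here's a sketch; both figures are assumed for illustration only:

```python
# Training is a one-time cost amortized over all later queries.
# Both figures below are assumptions for illustration only.
TRAIN_KWH = 1_000_000   # assumed one-off training cost of a large model
QUERY_WH = 3.0          # assumed energy per inference request

def wh_per_query(total_queries: int) -> float:
    """Per-query energy once training is spread over total_queries."""
    return TRAIN_KWH * 1000 / total_queries + QUERY_WH

for n in (10**6, 10**9, 10**11):
    print(f"{n:>15,} queries -> {wh_per_query(n):8.2f} Wh each")
# The training share shrinks toward zero as usage grows, so heavy
# use pushes the per-query cost down to the bare inference cost.
```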
So, with all of that said, it really comes down to two questions:
Does the increase in productivity warrant the bump in electricity usage?
Is there a better tool out there that makes more sense than using an AI Model?
Thank you for your effort.
A couple of takeaways:
I think we can use the Bitcoin difficulty chart to approximate how much crypto weighs in the AI/crypto mix. BTC difficulty stopped increasing in 2024, which could be partially explained by the two competing for the same resources. The other big one, Ethereum, moved to proof of stake fairly recently, and given the above I think that's an attractive proposition for other cryptocurrencies too. With this in mind, it's fair to say crypto won't be a big factor compared to AI growth, and I'd expect researchers to come to a similar conclusion.
As to how good AI is at things:
The last one is key, I think. Since AI is the current buzzword, companies will try to shoehorn it in everywhere, regardless of whether it makes sense.
Bitcoin difficulty chart - Good point.
Effectiveness of AI-powered search - Agreed, it's a very subjective topic. I don't use LLMs for the majority of my searches (who needs hallucinated showtimes for the movies playing at a cinema near me?), and it sounds like Google is now trying to use its LLM on every search... In my opinion, there should be a button to invoke the LLM on a search rather than having it respond every time (but I don't really use Google search anyway).
Translation/transcription tech - It's incredibly useful for anyone who's deaf. Your average person doesn't need it, although I'm sure they benefit from auto-generated subtitles when watching a video in a noisy environment (or with the volume off).
In my own personal use, I've found it useful for cutting through the nonsense posted by both sides of the Ukraine/Russia and Israel/Gaza conflicts, particularly misinformation targeting people who don't speak the language.
Generative AI - Yeah, this one will play out in the courts. I see good points raised by both sides, although I'm personally leaning towards rulings that would let smaller startups and research groups compete with the larger corporations that can simply buy their way into training data. It'll be interesting to see how the cases proceed on the text vs. audio vs. image/art fronts.
Wasteful AI - Agreed... too many companies are jumping on the "AI" bandwagon without properly evaluating whether there's a better way to do something.
Anyway, thanks for taking the time to read through everything.