this post was submitted on 15 Sep 2024
649 points (97.4% liked)

Technology


cross-posted from: https://lemmy.ml/post/20289663

A report from Morgan Stanley suggests the datacenter industry is on track to emit 2.5 billion tons of CO2 by 2030, three times what was predicted for a scenario in which generative AI had not come into play.

The extra demand from GenAI will reportedly drive a rise in emissions from 200 million tons this year to 600 million tons by 2030, largely due to the construction of more data centers to keep up with demand for cloud services.

[–] Matriks404@lemmy.world -3 points 2 months ago (1 children)

Does that take into account that AI models will become more efficient with time?

[–] postmateDumbass@lemmy.world 2 points 2 months ago* (last edited 2 months ago)

We never actually used large numbers of monkeys paired with typewriters to produce new literature.

Why? Because it would have wasted all the bananas to produce a bunch of shit.

That is all this level of AI is really equivalent to.

Throwing pudding at a wall, deciding if that toss is closer to the goal than before, changing something, then repeating.
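The pudding-throwing loop being described is essentially random hill climbing: propose a random change, keep it only if it scores closer to the goal, repeat. A minimal sketch (all names and the toy string-matching goal are illustrative, not from the comment):

```python
import random

def hill_climb(score, candidate, mutate, steps=5000):
    """Keep a mutation only if it scores closer to the goal than before."""
    best = candidate
    best_score = score(best)
    for _ in range(steps):
        trial = mutate(best)          # change something at random
        trial_score = score(trial)    # decide if this toss is closer
        if trial_score > best_score:  # keep it only if it improved
            best, best_score = trial, trial_score
    return best

# Toy goal: match a target string via random single-character tweaks.
TARGET = "bananas"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def score(s):
    # Number of positions that already match the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

result = hill_climb(score, "aaaaaaa", mutate)
```

The loop never accepts a worse candidate, so the score is monotonically non-decreasing, but every improvement costs many wasted throws, which is the commenter's point about the resource cost of this style of search.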

Maybe don't waste the resources until the process is more efficient.