this post was submitted on 08 Apr 2024
70 points (85.0% liked)


The Hated One has been pretty solid in the past regarding privacy/security, imho. I found this video of his rather enlightening and concerning.

  • LLMs and their training consume a LOT of power, and generating that power consumes a lot of water.
  • Power generation and data centers also consume a lot of water directly (see the rough sketch after this list).
  • We don't have a lot of fresh water on this planet.
  • Big Tech and other megacorps are already trying to push for privatizing water as it becomes more scarce for humans and agriculture.
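
As a rough sanity check of that power-to-water chain, here's a minimal back-of-envelope sketch. Every constant (cluster power, run length, PUE, the liters-per-kWh figures) is an illustrative assumption, not a number from the video:

```python
# Back-of-envelope sketch of the power-to-water chain described above.
# Every constant here is an illustrative assumption, not a sourced figure.

TRAIN_POWER_MW = 25.0      # assumed average draw of a large training cluster (MW)
TRAIN_DAYS = 90            # assumed length of the training run
PUE = 1.2                  # assumed power usage effectiveness of the data center
WUE_L_PER_KWH = 1.8        # assumed liters evaporated per kWh for cooling
GEN_WATER_L_PER_KWH = 1.9  # assumed liters withdrawn per kWh at the power plant

energy_kwh = TRAIN_POWER_MW * 1000 * 24 * TRAIN_DAYS * PUE
cooling_water_l = energy_kwh * WUE_L_PER_KWH
generation_water_l = energy_kwh * GEN_WATER_L_PER_KWH

print(f"Energy consumed:   {energy_kwh / 1e6:,.1f} GWh")
print(f"Cooling water:     {cooling_water_l / 1e6:,.1f} million liters")
print(f"Power-plant water: {generation_water_l / 1e6:,.1f} million liters")
```

Even with these made-up but plausible inputs, a single months-long run lands in the hundreds of millions of liters once both cooling and generation are counted.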

---personal opinion---

This is why I personally think federated computing like Lemmy or PeerTube is the only logical way forward. Spreading the internet across infrastructure nodes that can be cooled by ordinary fans, in smaller data centers or even home server labs, is much more efficient than relying on monstrous, monolithic datacenters that are stealing all our H2O.

Of course, then the 'Net would be back to serving humanity instead of stock-serving megacultists...

[–] MonkeMischief@lemmy.today 11 points 7 months ago (7 children)

Also, I can't even imagine how many resources image-generating AIs take up, especially when it's all based around "refining prompts" over and over and over....

[–] Even_Adder@lemmy.dbzer0.com 0 points 7 months ago (5 children)

It's a lot less than playing a video game. The fans on my GPU spin up harder, and stay spun up longer, whenever I'm playing.
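
For a rough sense of that comparison, here's a minimal sketch under assumed wattage and durations (all numbers are illustrative, not measurements):

```python
# Rough comparison of the gaming-vs-image-generation claim above.
# Both workloads peg the same GPU; the difference is duration.
# All numbers are illustrative assumptions.

GPU_POWER_W = 300          # assumed GPU draw under full load

image_gen_seconds = 15     # assumed time to generate one image locally
gaming_seconds = 2 * 3600  # assumed length of one gaming session

image_wh = GPU_POWER_W * image_gen_seconds / 3600
gaming_wh = GPU_POWER_W * gaming_seconds / 3600

print(f"One image:      {image_wh:.2f} Wh")
print(f"Gaming session: {gaming_wh:.0f} Wh "
      f"(~{gaming_wh / image_wh:.0f} images' worth)")
```

Under these assumptions a two-hour session costs as much energy as a few hundred locally generated images, which is the shape of the claim being made here.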

[–] TheHobbyist@lemmy.zip 10 points 7 months ago (4 children)

I think the training side shouldn't be neglected and might be what's at play here. Facebook has a 350k GPU cluster being set up to train AI models. Typical state-of-the-art models have required training for months on end. Imagine the power consumption. It's not about one person running a small quantized model at home.
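
To put that cluster in perspective, here's a quick sketch. The GPU count comes from the comment above; the per-GPU draw, overhead multiplier, and run length are illustrative assumptions:

```python
# Rough power estimate for the training cluster mentioned above.
# GPU count is from the comment; per-GPU draw, overhead, and duration
# are illustrative assumptions.

NUM_GPUS = 350_000
GPU_POWER_KW = 0.7  # assumed ~700 W per accelerator under load
OVERHEAD = 1.5      # assumed multiplier for CPUs, networking, cooling
TRAIN_DAYS = 90     # assumed months-long training run

power_mw = NUM_GPUS * GPU_POWER_KW * OVERHEAD / 1000
energy_gwh = power_mw * 24 * TRAIN_DAYS / 1000

print(f"Sustained draw: ~{power_mw:,.0f} MW")
print(f"Energy over {TRAIN_DAYS} days: ~{energy_gwh:,.0f} GWh")
```

That's a sustained draw in the hundreds of megawatts, several orders of magnitude beyond any home setup, which is the point being made about training versus local inference.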

[–] FaceDeer@fedia.io -1 points 7 months ago

Such training can be done in places where there's plenty of water to spare. Like so many of these "we're running out of X!" fears, basic economics will start putting on the brakes long before we crash into a wall.
