this post was submitted on 25 Jun 2025
603 points (98.4% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 2 years ago
[–] LostXOR@fedia.io 65 points 1 week ago (8 children)

This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of roughly 63 MWh, or about 870x less than just training GPT-4.

So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.
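The arithmetic above can be sanity-checked in a few lines. The 55 GWh figure is the linked article's estimate; the 2000 kcal/day intake and 75-year lifespan are the commenter's assumptions, and 1 kcal = 1.163 Wh is the standard conversion:

```python
# Back-of-the-envelope check of the training-vs-lifetime energy comparison.
KWH_PER_KCAL = 0.001163                  # 1 kcal = 1.163 Wh = 0.001163 kWh

daily_kwh = 2000 * KWH_PER_KCAL          # ~2.33 kWh per day of food energy
lifetime_mwh = daily_kwh * 365 * 75 / 1000   # ~63.7 MWh over 75 years

training_gwh = 55                        # article's GPT-4 training estimate
ratio = training_gwh * 1000 / lifetime_mwh   # GWh -> MWh, then divide

print(f"lifetime: {lifetime_mwh:.1f} MWh, ratio: {ratio:.0f}x")
```

With these inputs the ratio comes out near 860-870x, i.e. training once consumes the lifetime food energy of several hundred people.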

[–] Zacryon@feddit.org 10 points 1 week ago (4 children)

It's usually a lot faster in producing outputs though.

[–] RushLana@lemmy.blahaj.zone 10 points 1 week ago (3 children)

how many R not in strawberry?
