Anon conserves power (sh.itjust.works)
[-] LostXOR@fedia.io 66 points 2 days ago

This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of 63 MWh (or 840x less than just training GPT-4).

So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.
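The back-of-envelope arithmetic in the comment above can be sanity-checked with a short script (a sketch assuming the comment's figures: a 55 GWh training estimate for GPT-4, 2000 kcal/day, and a 75-year lifespan; the exact ratio shifts slightly with rounding, landing in the same 800–900x ballpark as the quoted 840x):

```python
# Sanity-check the human-vs-GPT-4 lifetime energy comparison.
# All input figures come from the comment; they are rough estimates.

KCAL_TO_KWH = 4184 / 3_600_000  # 1 kcal = 4184 J; 1 kWh = 3.6e6 J

kcal_per_day = 2000
kwh_per_day = kcal_per_day * KCAL_TO_KWH        # ~2.32 kWh/day
lifetime_kwh = kwh_per_day * 365 * 75           # ~64 MWh over 75 years

training_kwh = 55e6                              # 55 GWh, the claimed estimate
ratio = training_kwh / lifetime_kwh

print(f"human lifetime: {lifetime_kwh / 1000:.1f} MWh")
print(f"training / lifetime ratio: {ratio:.0f}x")
```

With these inputs the script gives a human lifetime of roughly 64 MWh and a ratio in the mid-800s, consistent with the comment's "840x" order of magnitude.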

[-] Zacryon@feddit.org 9 points 2 days ago

It's usually a lot faster in producing outputs though.

[-] RushLana 10 points 2 days ago
[-] pewgar_seemsimandroid 13 points 2 days ago

how many R not in strawberry?

this post was submitted on 25 Jun 2025
582 points (100.0% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
