Anon conserves power
(sh.itjust.works)
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
If you find yourself getting angry at (or, god forbid, agreeing with) something Anon has said, you might be doing it wrong.
This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of roughly 63 MWh (or about 870x less than just training GPT-4).
So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.
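The comparison above is easy to check as a back-of-the-envelope calculation. This sketch just re-derives the numbers from the stated assumptions (55 GWh to train, 2000 kcal/day, 75-year lifespan); the figures themselves are the comment's, not verified data.

```python
# Back-of-the-envelope check of the training-vs-lifetime energy comparison.
# Assumptions taken from the comment: ~55 GWh to train GPT-4,
# 2000 kcal/day human intake, 75-year lifespan.

KCAL_TO_KWH = 4184 / 3.6e6   # 1 kcal = 4184 J; 1 kWh = 3.6e6 J

daily_kwh = 2000 * KCAL_TO_KWH            # ~2.32 kWh/day
lifetime_mwh = daily_kwh * 365 * 75 / 1000  # ~63.6 MWh over a lifetime

training_gwh = 55
ratio = training_gwh * 1e6 / (lifetime_mwh * 1000)  # training energy / one lifetime

print(f"lifetime: {lifetime_mwh:.1f} MWh, ratio: {ratio:.0f}x")
```

With these inputs the ratio comes out in the high 800s, so "hundreds of human lifetimes" holds up under the stated assumptions.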
It's usually a lot faster at producing outputs, though.
How many R in strawberry?
how many R not in strawberry?