LLM hallucinations (lemmy.world)

500 tons of CO2 is... surprisingly little? Like, rounding error little.

I mean, one human exhales ~400 kg of CO2 per year (according to this). Training GPT-3 produced as much CO2 as 1250 people breathing for a year.
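A quick back-of-the-envelope check of that comparison, using only the figures quoted in the thread (the ~400 kg/person/year exhalation number from the linked source and the 500-ton training estimate under discussion):

```python
# Sanity check of the GPT-3 training-emissions comparison.
# Both inputs are the figures quoted in the thread, not independently verified.

TRAINING_CO2_TONNES = 500          # reported CO2 from training GPT-3
CO2_PER_PERSON_KG_PER_YEAR = 400   # approx. CO2 one person exhales per year

training_co2_kg = TRAINING_CO2_TONNES * 1000
person_years = training_co2_kg / CO2_PER_PERSON_KG_PER_YEAR

print(f"{training_co2_kg:,} kg / {CO2_PER_PERSON_KG_PER_YEAR} kg per person-year "
      f"= {person_years:,.0f} person-years of breathing")
# -> 500,000 kg / 400 kg per person-year = 1,250 person-years of breathing
```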

[-] RushLana 2 points 3 days ago

It only seems so little because the figure doesn't account for data-center construction costs, hardware production costs, etc. One model costing as much as 1,250 people breathing for a year is enormous to me.

[-] OfCourseNot@fedia.io 2 points 3 days ago

I don't know why people downvoted you. It is surprisingly little! I checked the 500-ton number, thinking it could be a typo or a mistake, but I found the same figure.
