
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for seven full charges of an iPhone Pro Max.
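A quick sanity check of the headline's iPhone comparison (the battery figure below is an assumption, roughly an iPhone Pro Max battery including charging losses):

```python
# Checking the headline's "7 full charges" claim (battery size is assumed).
EMAIL_WH = 140            # the article's per-email energy claim
IPHONE_BATTERY_WH = 20    # rough iPhone Pro Max capacity incl. charging losses

charges = EMAIL_WH / IPHONE_BATTERY_WH
print(f"{charges:.0f} full charges")  # 7
```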

all 43 comments
[-] narr1@lemmy.autism.place 88 points 1 month ago

Hah! Haha! Hahahaahah! Ties well with this one news article that I glimpsed that claims that by 2030 the need for fresh water will be 140% of the world's freshwater reserves. Infinite growth forever!

[-] frunch@lemmy.world 24 points 1 month ago

Time to buy stock in water lol

[-] SlopppyEngineer@lemmy.world 19 points 1 month ago

So, Nestlé stocks?

[-] BrianTheeBiscuiteer@lemmy.world 7 points 1 month ago

New admin will do its part by discouraging pregnancy and encouraging people to die sooner.

[-] bandwidthcrisis@lemmy.world 49 points 1 month ago

140Wh seems off.

It's possible to run an LLM on a moderately-powered gaming PC (even a Steam Deck).

Those consume power in the range of a few hundred watts, and they can generate replies in seconds, or maybe a minute or so. Power use throttles down when not actually working.

That means a home PC could generate dozens of email-sized texts an hour using a few hundred watt-hours.

I think that the article is missing some factor, such as how many parallel users the racks they're discussing can support.
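The rough arithmetic in this comment can be sketched out; the power draw and timing figures are the commenter's own assumptions, not measured values:

```python
# Back-of-envelope check of the home-PC comparison (all figures assumed).
GPU_POWER_W = 300        # "moderately-powered gaming PC" under load
SECONDS_PER_REPLY = 30   # "replies in seconds, or maybe a minute or so"

wh_per_reply = GPU_POWER_W * SECONDS_PER_REPLY / 3600
replies_per_hour = 3600 / SECONDS_PER_REPLY

print(f"{wh_per_reply:.1f} Wh per reply")         # 2.5 Wh
print(f"{replies_per_hour:.0f} replies per hour")  # 120
print(f"{wh_per_reply * replies_per_hour:.0f} Wh for an hour of replies")  # 300
```

Under these assumptions a local reply costs about 2.5 Wh, nearly two orders of magnitude below the article's 140 Wh figure, which is why the commenter suspects a missing factor such as parallel users per rack.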

[-] ikidd@lemmy.world 8 points 1 month ago

An article that thinks cooling is "consuming" should probably be questioned in all its claims.

[-] Soleos@lemmy.world 5 points 1 month ago

I think there's probably something wrong with the math around per-response water consumption, but it is true that evaporative cooling consumes potable water, in that the water cannot be reused until it cycles through the atmosphere and is recaptured from precipitation, same way you consume water by drinking and pissing it out, or agriculture consumes it for growing things. Fresh water usage is a major concern and bottleneck, especially with climate change. With the average data centre using 300k gallons of water per day, and Google's entire portfolio using 5bn gallons per day, it's not nothing.
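The comment's data-centre figure and the article's per-response claim can be put side by side; the 0.5 L per query number is the article's claim, and the 300k gallons/day figure is the comment's:

```python
# How many 0.5 L "emails" would one data centre's daily water cover?
GALLONS_TO_LITERS = 3.785
center_gallons_per_day = 300_000   # "average data centre" figure from the comment
liters_per_query = 0.5             # the article's headline claim

center_liters = center_gallons_per_day * GALLONS_TO_LITERS
queries_supported = center_liters / liters_per_query

print(f"{center_liters:,.0f} L/day")                      # 1,135,500
print(f"{queries_supported:,.0f} queries/day at 0.5 L each")  # 2,271,000
```

About 2.3 million queries per day per average data centre if the 0.5 L claim held, which is one way to test whether the per-response math is plausible.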

[-] oldfart@lemm.ee 7 points 1 month ago

I would say a model like ChatGPT could use a bit more energy than a 7B LLaMA.

[-] dan@upvote.au 3 points 1 month ago

I like that the 140Wh is the part you decided to question, not the "consumes 1 x 500ml bottle of water"

[-] bandwidthcrisis@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

That was covered pretty well already!

Or maybe it's using Fluidic logic.

[-] frunch@lemmy.world 48 points 1 month ago

I'm sure I'm missing out, but i have no interest in using chatbots and other LLMs etc. It floors me to see how much attention they get though, how much resources are being dumped into their development and use. Nuclear plants being reopened for the sake of AI?!!

I also assume there's a lot of things they're capable of that could be huge for science, and there's likely lots of big things happening behind closed doors that we're yet to see in the coming years. I know it's not all just chatbots.

The way this article strikes me though, is that it's pretty much just wasting resources for parlor-game level output. I don't know if i like the idea of people giving up their ability to write a basic letter or essay, not that my opinion on the matter is gonna change anything obviously 😅

[-] just_another_person@lemmy.world 27 points 1 month ago

Think of it like this: rich people accumulate more wealth by paying fewer people to accomplish more work faster, so it's worth burning through the world's resources at breakneck speed to help the richies out, right?

[-] maplebar@lemmy.world 34 points 1 month ago

Mark my words: generative "AI" is the tech bubble of all tech bubbles.

It's an infinite supply of "content" in a world of finite demand. While fast, it is incredibly inefficient at creating anything, often including things with dubious quality at best. And finally, there seems to be very little consumer interest in paid-for, commercial generative AI services. A niche group of people are happy to use generative AI while it's available for free, but once companies start charging for access to services and datasets, the number of people who are interested in paying for it will obviously be significantly smaller.

Last I checked there was more than a TRILLION dollars of investment into generative AI across the US economy, with practically zero evidence of genuinely profitable business models that could ever lead to any return on investment. The entire thing is a giant money pit, and I don't see any way in which someone doesn't get left holding the $1,000,000,000,000 generative AI bag.

[-] AbsoluteChicagoDog@lemm.ee 11 points 1 month ago

Don't worry, we'll bail them out once the bubble bursts.

[-] vinnymac@lemmy.world 33 points 1 month ago

Why does the article make it sound like cooling a data center results in constant water loss? Is this not a closed loop system?

I’m imagining a giant reservoir heat sink that runs throughout a complex to pull heat out of the surrounding environment where some liquid evaporates and needs to be replenished. But first of all we have more efficient liquid coolants, and second that would be a very lazy solution.

I wonder if they’ve considered geothermal for new data centers. You can run a geothermal loop in reverse and use the earth as a giant heat sink. It’s not water in the loop, it’s refrigerant, and it only needs to be replaced when you find the efficiency dropping, which can take decades.

[-] Munkisquisher@lemmy.nz 11 points 1 month ago

Evaporative coolers save a ton of energy compared to refrigerator cycle closed loop systems. Like a swamp cooler, the hot liquid that comes from cooling the server is exposed to the atmosphere and enough evaporates off to cool the liquid by a decent percentage, then it's refrigerated before going back into the servers.

The data centre near me is using it, and the fire service is used to being called by people concerned that the huge clouds of water vapor are smoke.
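The reason evaporative cooling "consumes" water is latent heat: every kilogram that evaporates carries away a large, fixed amount of energy. The figure below is textbook physics, not from the article:

```python
# Heat removed per litre of water evaporated (latent heat of vaporization).
LATENT_HEAT_J_PER_KG = 2.26e6   # water near boiling; slightly higher when cooler
liters_evaporated = 1.0          # 1 L of water is ~1 kg

heat_removed_j = liters_evaporated * LATENT_HEAT_J_PER_KG
kwh = heat_removed_j / 3.6e6    # joules -> kWh
print(f"{kwh:.2f} kWh removed per litre evaporated")  # 0.63
```

Roughly 0.63 kWh of heat disposed of per litre lost to the atmosphere, which is why evaporative towers save so much electricity compared to running refrigeration for the whole load.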

[-] JPAKx4 7 points 1 month ago

It highly depends on the data center, but it is very likely that they use municipal water for cooling. Maintaining a reservoir is extremely expensive for the amount of thermal mass it requires; these things kick off HEAT.

[-] bobs_monkey@lemm.ee 2 points 1 month ago

I don't know why they aren't using reclaimed water from treatment plants. I don't see why potable water is necessary as long as the substitute isn't corrosive, but I might be missing something here.

[-] catloaf@lemm.ee 2 points 1 month ago

You'd have to get the gray water in, and it's more efficient to just continue treating it and using the municipal water system.

[-] someguy3@lemmy.world 6 points 1 month ago

You can run a geothermal loop in reverse and use the earth as a giant heat sink.

You need something to move the heat away, like water or air. Having something solid that just absorbs will reach its heat capacity pretty quick.
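This point can be illustrated with rough numbers (the data-centre power and temperature-rise figures below are assumptions for the sake of the sketch):

```python
# How much rock would a 10 MW data centre heat by 10 degrees C in one day
# if the ground simply absorbed the heat with nothing carrying it away?
POWER_W = 10e6             # assumed data-centre heat output
SECONDS_PER_DAY = 86_400
ROCK_SPECIFIC_HEAT = 800   # J/(kg*K), typical for granite
DELTA_T = 10               # allowed temperature rise, K

heat_joules = POWER_W * SECONDS_PER_DAY
rock_kg = heat_joules / (ROCK_SPECIFIC_HEAT * DELTA_T)
print(f"{rock_kg / 1e6:,.0f} kilotonnes of rock per day")  # 108
```

Over 100 kilotonnes of rock warmed per day under these assumptions, which is why a static heat sink saturates quickly and geothermal loops rely on slow conduction spreading the heat through a very large volume.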

[-] WrenFeathers@lemmy.world 23 points 1 month ago

Can we PLEASE shut that shit down? We were doing just fine without it.

[-] stinky@redlemmy.com 1 points 1 month ago

You mean you were doing just fine without it.

You don't speak for the entire human race friendo. You don't get to decide what happens to us, and thank God. You seem too emotional and selfish to be any good at leadership.

[-] WrenFeathers@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

Emotional and selfish? Right. Sooo…

• AI is ruining the environment and has yet to show any positive reason for it
• AI is taking jobs from people
• AI is destroying our art and our entertainment

But according to you…. I’m selfish for wanting to stop it.

And where do you get the idea that I’m being too emotional? Is it just that you thought it would help you by removing any validation from my statement?

How about this:

YOU don’t get to speak for me, friendo. You don’t get to decide if I’m emotional. And thank god. You seem too ignorant to be any good at psychological diagnoses.

[-] stinky@redlemmy.com 2 points 1 month ago
[-] AkatsukiLevi@lemmy.world 17 points 1 month ago
[-] zerozaku@lemmy.world 16 points 1 month ago

I have read the comments here, and all my small brain understands is that this huge, unnecessary power consumption happens because we are using big online models for simple tasks.

So, can the on-device NPUs we are getting on flagship mobile phones solve these problems, as we can do most of those simple tasks offline on-device?

[-] WolfLink@sh.itjust.works 8 points 1 month ago* (last edited 1 month ago)

I’ve run an LLM on my desktop GPU and gotten decent results, albeit not nearly as good as what ChatGPT will get you.

Probably used less than 0.1Wh per response.

[-] Monsieurmouche@lemmy.world 1 points 1 month ago

Is this for inferencing only? Do you include training?

[-] WolfLink@sh.itjust.works 2 points 1 month ago

Inference only. I’m looking into doing some fine tuning. Training from scratch is another story.

[-] avieshek@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

Yes, kind of… when the businesses making money from subscriptions are willing to ship models with the OS for free, which only Apple has the luxury to do, unlike OpenAI, which doesn’t ship hardware or an OS (like Windows), just an app that’s less than 100MB. Servers would still be needed, but not for general cases like “help me solve this math” or translation. Stable Diffusion or Flux is one example where you only need an internet connection when downloading a model, the same way you wouldn’t download every game in the world before the urge to play one arises.

[-] bruhduh@lemmy.world 14 points 1 month ago

🥵🥵🥵🔥🔥🔥💦💦💦

[-] DuckWrangler9000@lemmy.world 13 points 1 month ago

These article titles are so crazy. Who thinks of this stuff?

[-] JasonDJ@lemmy.zip 11 points 1 month ago
[-] gzerod200@lemmy.world 10 points 1 month ago

Am I going insane? As far as I know cooling with water doesn’t consume the water, it just cycles through the system again. If anyone knows otherwise PLEASE tell me.

[-] Uncut_Lemon@lemmy.world 6 points 1 month ago

Industrial HVAC systems use cooling towers to cool the hot side of the system. The method relies on the physics of evaporative cooling to reduce the temperature of the water. The process requires water to be absorbed by the atmosphere, which drives the cooling effect. (The lower the humidity, the higher the cooling efficiency, as the air has greater potential to absorb and hold moisture.)

The method is somewhat similar to power-station cooling towers, or even swamp coolers. (An odd example would be experimental PC water-cooling builds with 'bong coolers', which are evaporative coolers built from drainage pipes.)

[-] nutsack@lemmy.world 3 points 1 month ago

yea i really don't know when or why they started measuring electricity in water

[-] Zementid@feddit.nl 1 points 1 month ago

Maybe it's a valid measure in the future, albeit 500ml would be enough to power New York (the state) for a day by means of fusion.

[-] nutsack@lemmy.world 5 points 1 month ago

perplexity.ai says that one chat GPT query consumes half a liter of water O_O

im imagining a rack of servers just shooting out a fire hose of water directly into the garbage 24 hours a day

[-] a4ng3l@lemmy.world 8 points 1 month ago

The real surprise for me is how little the battery of my iphone holds. Especially compared to my ev6 or what my heat pump guzzles daily. Crazy.

[-] ohwhatfollyisman@lemmy.world 6 points 1 month ago

yeah, but it can do really cool things like "suggest a name for my project that does X".

surely that game's worth the candle, yes?

[-] OmegaLemmy@discuss.online 1 points 1 month ago

Oh you don't mean... Oh yeah totally that's awfulllll like thirsty... Yeah...

this post was submitted on 23 Nov 2024
558 points (100.0% liked)

Technology
