
It almost feels like spam at this point 😅

[-] univers3man@lemmy.world 13 points 1 day ago

Good. Hopefully it will keep AI out of here. The only way I would be cool with AI is if it took 1000% less resources, and wasn't powered by companies building datacenters that produce power in the most polluting way.

[-] pruwybn@discuss.tchncs.de 5 points 1 day ago

1000% less resources

Trump math is spreading

[-] starlinguk@lemmy.world 10 points 1 day ago

AI lies, spreads misinformation and steals. So no, I would not be cool with it even if it took up fewer natural resources.

[-] univers3man@lemmy.world 1 points 1 day ago

Ideally, I was thinking nuclear or renewables. But the way things are now, I don't want it at all. Just wishful thinking on my part.

[-] Lumisal@lemmy.world 1 points 1 day ago

So Deepseek run locally for LLMs or Flux.1 being run locally at 1080 for image gen?

[-] univers3man@lemmy.world 10 points 1 day ago

The problem I see with that is even though it's local (which is a huge step towards FOSS ownership instead of private hidden control), it still takes tons of energy to train the model itself. Not to mention the IP theft which is a whole 'nother issue.

[-] Lumisal@lemmy.world 1 points 1 day ago

If a single model uses 1Gw of energy (it doesn't) being trained but 300 million people use it, then it used 3.33 watts of energy per person, and that's assuming it was only used a single time.

Using a 900w toaster 8 times to toast 2 slices of bread uses a bit more, at 3.6w (0.45w per 3 minutes of toasting).

So, I'm not sure if that's true.

Also, you can use a model that didn't involve IP theft, like Mistral for LLMs, or Photoshop for image gen. That is, if you consider the way they train to be IP theft. But then you'd be supporting corporations like Adobe.

[-] princessnorah 5 points 23 hours ago
[-] Lumisal@lemmy.world 1 points 22 hours ago

I know they had been paying Getty or some stock image company initially for the AI outpainting, but I'm not surprised they're now pulling these shenanigans.

They're not stealing, since it's part of their terms. But it's also definitely not ethical. If anything, it's a valuable lesson to read the ToS.

And all the more reason not to use Photoshop.

[-] princessnorah 3 points 22 hours ago

I'm perfectly happy to call it theft. At that point it's not like piracy; you're stealing the essence of the work and shoving it into your uncanny valley machine.

[-] Lumisal@lemmy.world 1 points 22 hours ago

Wait, so you're okay with piracy?

Although, I definitely understand the distinction between piracy for an individual or household vs piracy for profit.

[-] princessnorah 3 points 22 hours ago

I mean yeah, definitely? Corporations losing profit is much different to individual artists losing the ability to monetise their work.

[-] Nollij@sopuli.xyz 4 points 1 day ago

Watts (and gigawatts) are not units of energy. They are units of power; you can think of power as a rate.

900 watts for an hour is 900 watt-hours, or 0.9 kWh. For 24 minutes (3 minutes × 8), that's 360 Wh, or 0.36 kWh.
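
The same arithmetic as a quick Python sketch, using the toaster numbers from above:

```python
# Energy = power x time. Watts alone are a rate; watt-hours measure
# the energy actually consumed.
power_w = 900        # the toaster's power draw in watts
minutes = 3 * 8      # eight 3-minute toasting runs = 24 minutes

energy_wh = power_w * (minutes / 60)   # 900 W x 0.4 h = 360 Wh
print(f"{energy_wh:.0f} Wh = {energy_wh / 1000:.2f} kWh")  # 360 Wh = 0.36 kWh
```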

All of the major public LLM and diffusion models (ChatGPT, Copilot, Grok, etc.) are absolutely using more than a gigawatt. And I mean constantly. They are trying to build nuclear power plants exclusively to power AI datacenters. You could math out how much that is per query (not per person), but it's absolutely insane.

[-] Lumisal@lemmy.world 1 points 1 day ago

We were just talking about the energy used to train a model, not the usage itself.

I mentioned in a comment further down that usage would be significantly higher than training, because of how much it's used, the hardware involved, and the frequency.

[-] kbal@fedia.io 3 points 1 day ago

Advice that humans and bots could both heed more often: When somebody points out that your line of bullshit has become completely detached from reality it's best to act like a human being and admit it.

[-] univers3man@lemmy.world 3 points 1 day ago

I admit I didn't know that Mistral or Photoshop claim to use no copyrighted material, but I am loath to support Adobe, as you seem to correctly imply.

On the topic of power usage, we can assume 1Gw. But 300 million people using the same model as was trained via that initial 1Gw input seems like a stretch, given how often OpenAI releases and tweaks models. And the root of my problem with the power draw is that it's not coming from clean or renewable sources, so it's not just the 1Gw of usage, but all the pollution that comes with it. Not to mention the datacenters using water for evap cooling and taking it from towns.

[-] Lumisal@lemmy.world 1 points 1 day ago

Yes, 1Gw was a hyperbolic exaggeration.

I looked it up now out of curiosity, and it's estimated that training a state-of-the-art LLM (image gen uses significantly less) at trillions of parameters uses about 10-20,000kw, which is 1-2% of 1Gw. So apparently, if that model is used by 300 million people (which is less than the population of the USA, and it'd be accurate to say a popular model would have about that much usage, if not more), it would actually be about 0.036-0.067w per person, or toasting bread for less than 10 seconds.

So training a model does use a lot of electricity, but considering how much and how often it's used, I'd say usage definitely consumes more than training.

I was also implying running the model locally on your own hardware rather than in a data center. Local uses less energy because the hardware doesn't draw as much power. It's also much slower, but it's also not destructive like an AI data center.

And yup, Adobe paid for the training data used. But, you know, Adobe. Ultimately, though, something large and centralized would be the only way to run the tech if we're expecting it to be usable as is.

My personal ethics are that if it's used for personal use and not by a corporation, it's fine and ethical. After all, Linux is based on code very, very few people get paid to make, if paid at all. If all those separate people had to be paid for each bit of code, Linux couldn't exist either. That said, I think compiling it all is its own heavy work too. After all, just like the separate code won't spontaneously become a Linux OS, separate pieces of art/books won't spontaneously combine to make something new.

I donate and pay when I can, probably more so than most on Lemmy, for music, software, art, etc., even though it's hard for me to afford to. But if it's in public, it's strange to be surprised when someone uses it. After all, there's no reason to post anything you make online - that's a choice that was made.

What I strongly disagree with is a corporation (in particular large ones with plenty of money) doing it for profit, such as Meta did with pirating books.

[-] BussyCat@lemmy.world 6 points 1 day ago

Your units don't make sense. Watts shouldn't be used for a fixed energy usage; it's like saying a car drove across the U.S. and did it at 4 gallons per hour.

The more useful metric is GWh: GPT-3 used 1.3 GWh in training, which isn't bad, but GPT-4 used 62.3 GWh, plus an extra 1 GWh per day.
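
For what it's worth, redoing the per-person arithmetic from upthread with that training figure (a rough sketch; the 300 million users is the assumption made earlier in the thread):

```python
# Spread GPT-4's estimated training energy across the 300 million
# users assumed upthread (back-of-the-envelope only).
training_wh = 62.3e9   # 62.3 GWh expressed in watt-hours
users = 300_000_000

wh_per_user = training_wh / users
print(f"{wh_per_user:.0f} Wh per user")  # ~208 Wh, about 14 minutes of a 900 W toaster
```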

[-] Lumisal@lemmy.world 1 points 22 hours ago

I get what you're saying now.

That said, I still get the analogy of saying a car used 4 gallons per hour - it still indicates how fuel efficient something is, especially if compared to something else.

[-] BussyCat@lemmy.world 2 points 14 hours ago

But the problem is you don't have anything to tie that to. If you have a car that gets 4 gph but goes 100 mph, then it's more efficient than a car that gets 3 gph but only goes 50 mph. But even with those numbers you miss out on the actual efficiency, which for a car is usually transporting people.

So if car A gets 4 gph at 100 mph and transports 2 people, it gets 50 passenger-miles per gallon of gas, which is finally an actually useful metric.
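
A tiny sketch of that metric (car B carrying a single occupant is my assumption):

```python
# Passenger-miles per gallon folds burn rate, speed, and occupancy
# into a single efficiency number.
def passenger_mpg(gph: float, mph: float, passengers: int) -> float:
    return (mph / gph) * passengers

print(passenger_mpg(4, 100, 2))  # car A: 50.0 passenger-miles per gallon
print(passenger_mpg(3, 50, 1))   # car B: ~16.7, assuming one occupant
```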

For LLMs that becomes much harder to quantify, but a useful metric might be Wh per minute of time saved, or mL of water per minute of time saved. Unfortunately, to quantify those you would need to do a much more in-depth analysis, and probably also factor in false outputs and the time lost from them.
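
Sketching what that could look like, with every number below made up purely for illustration:

```python
# Hypothetical "Wh per net minute saved" metric for an LLM query.
# All figures are invented; only the shape of the calculation matters.
wh_per_query = 0.3            # hypothetical energy cost of one query
minutes_saved = 2.0           # hypothetical time saved by a correct answer
error_rate = 0.2              # hypothetical fraction of wrong answers
minutes_lost_per_error = 5.0  # hypothetical time wasted on a wrong answer

net_minutes = (1 - error_rate) * minutes_saved - error_rate * minutes_lost_per_error
print(f"{wh_per_query / net_minutes:.2f} Wh per net minute saved")  # 0.50
```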

[-] princessnorah 2 points 23 hours ago

Did you ask ChatGPT for these figures? Would explain why you're using nonsensical units.
