107
submitted 4 days ago by ryujin470@fedia.io to c/asklemmy@lemmy.world

They use HBM (High Bandwidth Memory). PCs, laptops and phones don't use this type of RAM.

top 28 comments
[-] CaptainBasculin@lemmy.dbzer0.com 130 points 4 days ago

Manufacturers have a certain amount of chips they can manufacture, let's say they can manufacture 10 million chips per year. Normally they adjust for demand, like manufacture 7 million chips for consumers and 3 million for enterprise customers. Company A contacts them and says "We need 6 million chips for this year, here's the money." But the factories can still manufacture only 10 million chips in total, so they adjust their factories to manufacture more enterprise-focused chips, decreasing the number of chips available to consumers.
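The arithmetic in the comment above can be sketched as a toy model (the numbers are the commenter's hypothetical ones, not real figures):

```python
# Toy model of fixed fab capacity being reallocated, using the
# hypothetical numbers from the comment above.
TOTAL_CAPACITY = 10_000_000  # chips/year the fabs can make (fixed)

# Normal split, tuned to demand
consumer, enterprise = 7_000_000, 3_000_000

# Company A pre-books 6M enterprise chips; total capacity doesn't
# grow, so consumer output absorbs the difference.
enterprise_order = 6_000_000
enterprise = max(enterprise, enterprise_order)
consumer = TOTAL_CAPACITY - enterprise

print(consumer)  # 4000000 -- consumers get 4M chips, down from 7M
```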

[-] Hawke@lemmy.world 45 points 4 days ago

I think it’s more like “we need 6 million chips for this year, and we’ll pay you eventually, maybe”.

But yeah the outcome is the same.

[-] null@piefed.nullspace.lol 3 points 4 days ago

How does that work? Are they taking out a loan?

[-] Hawke@lemmy.world 19 points 4 days ago

Last time I went round with someone on this, they insisted that this is just normal business procedure to order product on credit.

To me, while trade credit is definitely a thing, for orders this large I expect there’s some more substantial backing for it.

Ultimately it probably fits in somewhere on a chart of financial shenanigans like this one:

[-] null@piefed.nullspace.lol 2 points 4 days ago

That's more what I was wondering. I'm sure most big customers can get a contract that lets them pay over time, but this particular bubble feels significantly out of the ordinary.

If I'm a RAM manufacturer, there's got to be some kind of guarantee that makes me confident enough that I'll see that money in the end if I'm putting that many eggs in one basket.

[-] village604@adultswim.fan 2 points 4 days ago

I mean, contracts are how you get that guarantee.

[-] null@piefed.nullspace.lol 1 points 4 days ago

But what is motivating RAM manufacturers to sign that contract? Why can the commenter above figure it out, but they can't?

[-] village604@adultswim.fan 2 points 4 days ago

Because the contract would be an agreement for the purchasing company to pay for the products they ordered to be manufactured.

[-] null@piefed.nullspace.lol 1 points 4 days ago

Are you being deliberately obtuse?

[-] Appoxo@lemmy.dbzer0.com 18 points 3 days ago

Production capacity.
Somewhere I read that with the capacity it takes to produce one HBM module, you could produce three regular DDR modules.
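If that 3:1 ratio holds (it's the commenter's recollection, not a confirmed figure), the hit to consumer supply compounds quickly; a minimal sketch:

```python
# Sketch of the capacity tradeoff described above: each HBM module
# is assumed to displace three regular DDR modules of fab capacity.
# The 3:1 ratio is the comment's recollection, not a verified number.
DDR_PER_HBM = 3

def ddr_displaced(hbm_modules: int) -> int:
    """DDR modules that the same capacity could have produced instead."""
    return hbm_modules * DDR_PER_HBM

print(ddr_displaced(1_000_000))  # 3000000
```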

[-] Treczoks@lemmy.world 42 points 4 days ago

Because the manufacturers use the facilities that once produced DDR5 RAM chips to produce HBM chips instead.

[-] coolie4@lemmy.world 47 points 4 days ago

Without even getting into the electronics similarities, they use common raw materials and manufacturing facilities. Diverting resources to one lowers supply of the other, affecting costs.

[-] brucethemoose@lemmy.world 17 points 4 days ago* (last edited 4 days ago)

To add to what others said:

LPDDRX is used in some inference hardware. The same stuff you find in laptops and smartphones.

Also, the servers need a whole lot of regular CPU DIMMs, since they're still mostly EPYC/Xeon servers with 8 GPUs each. And why are they "wasting" so much money on CPU RAM that isn't really needed, you ask? Same reason as a lot of AI: it's immediately accessible, already targeted by devs, and AI dev is way more conservative and wasteful than you'd think.

Same for SSDs. Regular old servers (including AI servers) need them too. In a perfect world they'd use centralized storage for images/weights with near-"diskless" inference/training servers. Some AI servers do this, but most don't.


Basically, the waste is tremendous, for the same reason they use cheap gas generators on-site: it gets them to market faster.

[-] stringere@sh.itjust.works 4 points 3 days ago

they use cheap gas generators

It only just now occurred to me how much the war in Iran is also fucking over AI companies.

[-] brucethemoose@lemmy.world 2 points 3 days ago

Hardly. Power costs are trivial to them at the moment, and a server hardware bottleneck would just consolidate power to the big few that can afford it (which is what they want).

[-] BeardededSquidward 2 points 3 days ago

Happy giggles.

[-] gdog05@lemmy.world 26 points 4 days ago

The better question is, why are they doing all of this without an actual purchase contract?

[-] kbal@fedia.io 26 points 4 days ago

Because it's AI, haven't you heard? Does it make sense for the business? Who cares, it's AI. Is it financially sustainable? Dude, it's AI though. Will there be any customers for any of it? The AI says there will be. You've got to understand, this is AI we're talking about. It's the AI revolution that will transform the world. We've got to bet everything on the AI, or we'll be left out of the AI future. I asked the AI and it was very clear about that.

[-] Asafum@lemmy.world 20 points 4 days ago

Hey @grok is this true?

Grok: yes, I am very great. I am good at everything because I am AI. Also, fuck the Jews.

[-] amio@lemmy.world 6 points 4 days ago

Because people are stupid and market hype is dumber still.

[-] Robin@lemmy.world 15 points 4 days ago

Besides what others have already said, HBM is only used for the GPUs. These AI servers also use regular DDR5 chips, just with an extra ECC chip.

[-] kbal@fedia.io 11 points 4 days ago

The type of RAM that they use is different, but it consumes even more of the fab capacity and materials that would otherwise go into producing the RAM that you use.

[-] ianhclark510 9 points 4 days ago

Same Fabs dog

[-] lordnikon@lemmy.world 9 points 4 days ago

Wafers are used for both DC and consumer RAM modules, and there's a wafer shortage projected to last until 2030.

[-] kmirl@lemmy.world 6 points 4 days ago

If RAM and GPUs were cheap people like us would be more likely to set up local LLMs to prevent our data from being productized by power-grabbing corporations.

[-] AmidFuror@fedia.io 4 points 4 days ago

The actual explanation is much simpler.

[-] kmirl@lemmy.world 2 points 4 days ago

Not claiming it's the reason since it clearly isn't, only that it will help drive traffic to commercial AI products.

[-] village604@adultswim.fan 4 points 4 days ago

I think it's more likely that they're setting up to push VDI.

The vast majority of consumers would not be able to set up a local LLM, and they know the people who are able to do so aren't going to use their services in the first place.

this post was submitted on 08 Apr 2026
107 points (100.0% liked)
