this post was submitted on 25 Aug 2025
110 points (100.0% liked)
Linux
For gaming, yes.
But the Framework Desktop seems to be made for a different use case:
Unified means both CPU and GPU have access to it. Why would you need so much RAM and why would you care if the GPU has direct access?
Neural networks. You can fit a pretty serious LLM into 128GB of RAM, and if the GPU has direct access, still run inference at reasonable performance.
I love my 9070XT, but you can't run anything approaching what Claude or ChatGPT gives you in just 16GB of VRAM.
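The memory argument above can be sketched with back-of-the-envelope arithmetic: weight storage is roughly parameter count times bytes per weight. The model sizes and quantization levels below are illustrative assumptions, not benchmarks, and the estimate ignores KV cache and activation memory.

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB needed just to hold the model weights.

    Ignores KV cache, activations, and runtime overhead, so real
    usage is somewhat higher.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# Illustrative model sizes (7B, 70B, 123B) at common quantizations.
for params in (7, 70, 123):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB")
```

Even at 4-bit quantization, a 70B model needs about 35 GB for weights alone, which is more than double a 16 GB card but fits comfortably in 128 GB of unified memory.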
They specifically advertise it as a gaming device. The author of this article is using it as a gaming device.
LLMs are fucking stupid and no one should use them.
You are absolutely right. Baffles me why they'd put 128GB of RAM in there and use an SoC architecture where RAM is shared with the GPU, to the detriment of upgradability, if not for AI.
Any gamer would prefer upgradable RAM and upgradable GPU, especially from Framework.
How else would you explain this decision to compromise their brand values and overspend on RAM, if not AI?
... I mean... 128 GB of shared RAM could be used that way...
But... you really think this is a common enough use case to design a pc around?
Like... how many people want to game, vs... run / train a local LLM?
Is this what Framework is doing, making LLM development boxes?
Not... trying to make an affordable, general use oriented, 'buy once and then upgrade with parts/modules as you prefer'?