this post was submitted on 15 Nov 2024 to Futurology (109 points, 100.0% liked)
Ok, so if you want to run your local LLM on your desktop, use your GPU. If you’re doing that on a laptop in a cafe, get a laptop with an NPU. If you don’t care about either, you don’t need to think about these AI PCs.
Or just use a laptop with a GPU? An NPU seems to be little more than slightly upgraded onboard graphics.
It’s a power-efficiency thing. According to the article, a GPU gets the job done but uses more energy to get there. Probably not a big deal unless charging opportunities are scarce.
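To put rough numbers on that tradeoff: battery runtime is just capacity divided by power draw, so a lower-power NPU stretches a charge even if it's no faster. All figures below are illustrative assumptions, not measurements from the article:

```python
# Back-of-envelope battery impact of GPU vs NPU inference.
# Assumes both chips deliver comparable tokens/sec on a small local LLM,
# so only the power draw differs. All numbers are made up for illustration.

def battery_hours(capacity_wh: float, draw_watts: float) -> float:
    """Runtime in hours for a given battery capacity and sustained power draw."""
    return capacity_wh / draw_watts

BATTERY_WH = 70    # typical laptop battery capacity (assumption)
GPU_DRAW_W = 45    # discrete laptop GPU under inference load (assumption)
NPU_DRAW_W = 10    # NPU under the same load (assumption)

print(f"GPU: {battery_hours(BATTERY_WH, GPU_DRAW_W):.1f} h")  # ~1.6 h
print(f"NPU: {battery_hours(BATTERY_WH, NPU_DRAW_W):.1f} h")  # ~7.0 h
```

With those (hypothetical) draws, the NPU laptop runs several times longer in the cafe scenario, which is the whole pitch.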