Can I run local LLMs on Intel ARC/AMD with 8GB of RAM?
(lemmy.world)
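For rough context on the question: inference memory is dominated by the model weights, and quantization shrinks them considerably. A back-of-the-envelope sketch in Python, where the per-weight byte counts and the ~20% overhead factor for KV cache and runtime are illustrative assumptions, not measurements:

```python
# Rough estimate of inference memory for quantized LLMs.
# All figures are illustrative assumptions, not measurements.

BYTES_PER_WEIGHT = {"fp16": 2.0, "8-bit": 1.0, "4-bit": 0.5}
OVERHEAD = 1.2  # assumed ~20% extra for KV cache, activations, runtime

def fits_in_vram(params_billions: float, quant: str, vram_gb: float = 8.0) -> bool:
    """Return True if the weights (plus assumed overhead) fit in vram_gb."""
    needed_gb = params_billions * BYTES_PER_WEIGHT[quant] * OVERHEAD
    print(f"{params_billions}B @ {quant}: ~{needed_gb:.1f} GB needed")
    return needed_gb <= vram_gb

fits_in_vram(7, "4-bit")    # ~4.2 GB  -> fits in 8 GB
fits_in_vram(13, "4-bit")   # ~7.8 GB  -> tight
fits_in_vram(13, "8-bit")   # ~15.6 GB -> does not fit
```

By this estimate, a 7B model at 4-bit fits comfortably in 8 GB while anything much larger gets tight; training is a different story, as the comments below note.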
I see. I don't want to spend $1000+ on a GPU, so I suppose this will have to wait a few years.
Yup; hopefully there are some advances in the training space, but I’d guess that having large quantities of VRAM is always going to be necessary in some capacity for training specifically.
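For a rough sense of why, here's a back-of-the-envelope estimate of training memory under mixed-precision Adam; the per-parameter byte counts are the commonly cited rule of thumb, treated here as assumptions, and activation memory is excluded entirely:

```python
# Rough per-parameter VRAM estimate for training with Adam in mixed
# precision. Byte counts are rule-of-thumb assumptions, not exact
# figures for any particular framework; activations are excluded.

def training_vram_gb(params_billions: float) -> float:
    weights_fp16 = 2   # model weights in fp16
    grads_fp16 = 2     # gradients in fp16
    master_fp32 = 4    # fp32 master copy of the weights
    adam_moments = 8   # two fp32 Adam moment estimates
    bytes_per_param = weights_fp16 + grads_fp16 + master_fp32 + adam_moments
    return params_billions * bytes_per_param  # ~GB, since 1e9 params * bytes

print(f"7B model:  ~{training_vram_gb(7):.0f} GB")   # ~112 GB
print(f"13B model: ~{training_vram_gb(13):.0f} GB")  # ~208 GB
```

So even a modest 7B model needs on the order of 100+ GB for weights and optimizer state alone, which is why parameter-efficient fine-tuning methods like LoRA exist.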
I hope some GPU manufacturer starts offering removable VRAM. 4 × 8 GB DDR5 modules might not be too bad, given that PCIe speeds aren't the bottleneck. If I could upgrade the RAM to 64 GB later, I'd be ready to pay $10k for 3080-level perf. Intel ARC people, I hope you're already working on this!
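For ballpark numbers on that tradeoff (all peak-bandwidth figures below are approximate spec-sheet assumptions):

```python
# Ballpark peak-bandwidth comparison: socketed DDR5 vs soldered GDDR6X.
# All figures are approximate spec-sheet assumptions.

ddr5_6400_dual_channel_gbps = 2 * 6400 * 8 / 1000   # ~102 GB/s
rtx_3080_gddr6x_gbps = 760                          # ~760 GB/s
pcie4_x16_gbps = 32                                 # ~32 GB/s

print(f"DDR5-6400 dual channel: ~{ddr5_6400_dual_channel_gbps:.0f} GB/s")
print(f"RTX 3080 GDDR6X:        ~{rtx_3080_gddr6x_gbps} GB/s")
print(f"PCIe 4.0 x16:           ~{pcie4_x16_gbps} GB/s")
```

Token generation is largely memory-bandwidth-bound, so the roughly 7x bandwidth gap is the usual engineering argument against socketed VRAM; on the other hand, even ~100 GB/s on the card beats streaming weights over PCIe.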
I don’t know anything about GPU design, but expandable VRAM is a really interesting idea. It feels too consumer-friendly for Nvidia, and maybe even AMD, though.