Proton's very biased article on Deepseek
(lemmy.ml)
There are plenty of ways, and they are all safe. Don't think of DeepSeek as anything more than an (extremely large, bigger than AAA) videogame. It does take resources, e.g. disk space, RAM, and GPU VRAM (if you have some), but you can use "just" the weights, and thus the executable can come from another project, an open-source one that will not "phone home" (assuming that's your worry).
I detail this kind of thing and more in https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence but to be more pragmatic I'd recommend ollama, which supports https://ollama.com/library/deepseek-r1

So, assuming you have a relatively entry-level computer, you can install ollama, then ollama run deepseek-r1:1.5b and try.

FWIW I did just try deepseek-r1:1.5b (the smallest model available via ollama today) and ... not bad at all for 1.1 GB!

It's still AI BS generating slop without "thinking" at all ... but from the few tests I ran, it might be one of the "least worst" smaller models I've tried.