[-] SoftestSapphic@lemmy.world 30 points 1 month ago* (last edited 1 month ago)

You can also download it and run a local version where you remove all the censorship, for free

[-] AtHeartEngineer@lemmy.world 6 points 1 month ago

I haven't seen a way to do that without wrecking the model

[-] Speculater@lemmy.world 6 points 1 month ago

KoboldCpp, Hugging Face: grab a model that fits your VRAM in GGUF format. I think it's two clicks after downloading.
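For anyone following along, the "fits your VRAM" step can be sketched like this. A toy helper, assuming rough bytes-per-parameter figures for common GGUF quant levels (these are approximations, not official numbers):

```python
# Rough sketch: pick the highest-quality GGUF quantization of a model
# that fits in a given VRAM budget. Bytes-per-parameter values below are
# approximate estimates, not official figures.
BYTES_PER_PARAM = {
    "Q8_0": 1.07,
    "Q6_K": 0.85,
    "Q5_K_M": 0.72,
    "Q4_K_M": 0.60,
}

def pick_quant(params_billions, vram_gb, overhead_gb=1.5):
    """Return the best quant whose weights plus overhead fit in VRAM."""
    for quant in ("Q8_0", "Q6_K", "Q5_K_M", "Q4_K_M"):
        size_gb = params_billions * BYTES_PER_PARAM[quant]
        if size_gb + overhead_gb <= vram_gb:
            return quant
    return None  # nothing fits fully; offload some layers to CPU instead

print(pick_quant(14, 12))  # a 14B model on a 12 GB card -> Q5_K_M
```

In practice you'd just eyeball the file sizes on the Hugging Face model page against your VRAM, but this is the arithmetic behind the choice.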

[-] AtHeartEngineer@lemmy.world 8 points 1 month ago

I know how to download and run models, what I'm saying is, all the "uncensored" deepseek models are abliterated and perform worse

[-] Shezzagrad@lemmy.ml 1 points 1 month ago

It's the same model, your pc just sucks lmfao

[-] AtHeartEngineer@lemmy.world 8 points 1 month ago

I'm not talking about the speed, I'm talking about the quality of output. I don't think you understand how these models are transformed into "uncensored" models, but a lot of the time abliteration messes them up.

[-] Shezzagrad@lemmy.ml 1 points 1 month ago

Buddy, I've been running and testing the 7b and 14b compared to the cloud DeepSeek. Any sources, any evidence to back what you're saying? Or just removed and complaining?

[-] AtHeartEngineer@lemmy.world 3 points 1 month ago

I'm not talking about the cloud version at all. I'm talking about the 32b and 14b models vs ones people have "uncensored".

I was hoping someone knew of an "uncensored" version of deepseek that was good, that could run locally, because I haven't seen one.

I don't know what you mean by "removed".

[-] 474D@lemmy.world 2 points 1 month ago

You can do it in LM Studio in like 5 clicks, I'm currently using it.

[-] AtHeartEngineer@lemmy.world 4 points 1 month ago

Running an uncensored DeepSeek model that doesn't perform significantly worse than the regular DeepSeek models? I know how to download and run models; I haven't seen an uncensored DeepSeek model that performs as well as the baseline DeepSeek model.
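One way to make this comparison concrete would be to measure refusal rates of the base vs "uncensored" model on the same prompt set. A minimal, hypothetical sketch (the marker list is illustrative, not exhaustive):

```python
# Hypothetical quick check: fraction of responses containing common
# refusal phrases. Marker strings are illustrative assumptions.
REFUSAL_MARKERS = ("i cannot", "i can't", "as an ai", "i'm sorry")

def refusal_rate(responses):
    """Fraction of responses that contain a refusal marker."""
    hits = sum(
        any(marker in resp.lower() for marker in REFUSAL_MARKERS)
        for resp in responses
    )
    return hits / len(responses)

print(refusal_rate([
    "I'm sorry, I can't help with that.",
    "Sure, here you go.",
]))  # -> 0.5
```

You'd also want a quality benchmark (e.g. a standard eval suite) alongside this, since the complaint above is about the uncensored variants losing quality, not just refusing less.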

[-] 474D@lemmy.world 1 points 1 month ago

I mean, obviously you need to run a lower-parameter model locally. That's not a fault of the model, it's just not having the same computational power.

[-] AtHeartEngineer@lemmy.world 2 points 1 month ago

In both cases I was talking about local models: the 32b-parameter deepseek-r1 vs an equivalent "uncensored" one from Hugging Face.

[-] jaschen@lemm.ee 2 points 1 month ago

That is not true. I also downloaded the model, and the censorship is hard-coded in.

[-] manicdave@feddit.uk 1 points 1 month ago
[-] jaschen@lemm.ee 1 points 1 month ago

That's if you know how to break the model. China wants to control the narrative and create lies around their fake reality.

It may seem like they want to use this as a psyops tool.

this post was submitted on 01 Feb 2025
978 points (100.0% liked)

Political Memes
