[-] d0nkey@lemmy.zip 75 points 1 week ago

Seriously though, the thought of an LLM running on a server and interacting with my files is one of the most frightening things I can imagine happening to anyone.

[-] d0nkey@lemmy.zip 4 points 2 weeks ago

Talk to the hand

[-] d0nkey@lemmy.zip 9 points 1 month ago

Are you sure everything is in a single binary and the images aren't hidden in a folder somewhere on the drive?
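
A quick way to check, as a rough sketch: scan the binary for common image magic bytes and also look for loose image files on disk. The paths and binary name here are hypothetical placeholders, not anything from the actual project.

```python
# Sketch: is image data embedded in the binary itself, or sitting as files in a folder?
# BINARY and ASSET_DIR are hypothetical paths; substitute your own.
from pathlib import Path

BINARY = Path("./app")        # hypothetical binary to inspect
ASSET_DIR = Path("./assets")  # hypothetical folder that might hold the images

# Common image magic bytes (PNG, JPEG, GIF)
SIGNATURES = {b"\x89PNG\r\n\x1a\n": "PNG", b"\xff\xd8\xff": "JPEG", b"GIF8": "GIF"}

def embedded_images(binary: Path) -> list[str]:
    """Return the image formats whose signatures appear inside the binary."""
    data = binary.read_bytes()
    return [name for sig, name in SIGNATURES.items() if sig in data]

def images_on_disk(folder: Path) -> list[Path]:
    """List loose image files under the folder, if it exists."""
    if not folder.is_dir():
        return []
    return [p for p in folder.rglob("*") if p.suffix.lower() in {".png", ".jpg", ".jpeg", ".gif"}]

if __name__ == "__main__":
    print("Embedded formats:", embedded_images(BINARY))
    print("Files on disk:", images_on_disk(ASSET_DIR))
```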

[-] d0nkey@lemmy.zip 4 points 1 month ago

But still, no one is willing to move to another platform. I have talked to many of my friends about switching to a more secure platform for messaging or social media, but nobody will move because everyone they know is on a platform developed by Meta. And I think they are right: it is impossible to make everyone switch individually. We need new rules and regulations in place to protect at least some parts of our lives and the lives of everyone around us.

[-] d0nkey@lemmy.zip 2 points 1 month ago

What GPU are you running models on?

[-] d0nkey@lemmy.zip 2 points 1 month ago

I have used the micro variant primarily with Perplexica, and I must say it is really good at summarization and at answering follow-up questions. In my testing on these tasks it has outclassed instruct models 2-3 times its size.
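
For anyone curious, here is a rough sketch of how that kind of summarization call looks against a local OpenAI-compatible endpoint (the sort of backend Perplexica typically talks to). The endpoint URL and model tag below are assumptions/placeholders, not something from Perplexica itself; swap in whatever micro model you actually run.

```python
# Sketch: summarization with a small local model behind an OpenAI-compatible
# endpoint (e.g. Ollama). URL and model tag are assumed placeholders.
import requests

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"  # assumed local Ollama endpoint
MODEL = "granite4:micro"  # placeholder tag; substitute the micro variant you run

def summarize(text: str) -> str:
    """Ask the local model for a short summary of the given text."""
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "Summarize the user's text in a few sentences."},
            {"role": "user", "content": text},
        ],
        "temperature": 0.2,  # keep summaries fairly deterministic
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(summarize("Paste an article or a batch of search results here."))
```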

[-] d0nkey@lemmy.zip 6 points 1 month ago* (last edited 1 month ago)

Hey, at least you remember most of it. Maybe not the parts you should remember, but still most of it.

[-] d0nkey@lemmy.zip 6 points 1 month ago

I will find you. When I find you I am going to torture you until you cannot handle it anymore. And when you cannot handle it anymore you will die. But when you die you will not relax, because I will be there with you.
