Can we stop using the Steven Crowder meme already. The guy is a total chode.
Lol. He gives chodes a bad rep. Call him what he is. A christofascist misogynist grifter.
I don't really disagree, but I think that was the original intent of the meme: to show Crowder as a complete chode by having him assert really stupid, deeply unpopular ideas.
The meme's use has become too soft on Crowder lately, though, I think.
I've noticed lately that many memes' origins are worse than I'd thought from the context they're used in. Racist, homophobic, and dishonest people aren't something I'd usually accept as entertainment, but they sneak their way unnoticed into my (non-news) feed through memes. I guess most people don't know a meme's origins and use it according to the meaning they formed on their own. Other memes, like the distracted boyfriend meme, are meaningless stock photos, so I understand why many people use memes without thinking about where they came from.
Anyway, thanks for pointing out who the person in the picture actually is.
I must admit when I learned this was Crowder I had a sad
Just change and reupload :D
Oh please. There are better templates than this stupid Nazi cunt. I really don't want to see this fuckface.
Yes! This is a nice alternative template for example.
For the longest time I just thought he was that one guy from Modern Family.
I thought NixOS was the new Arch btw
From: https://knowyourmeme.com/memes/btw-i-use-arch
BTW I Use Arch is a catchphrase used to make fun of the type of person who feels superior because they use a more difficult Linux distribution.
I think that's fair to apply to some NixOS users. Source: BTW I use NixOS.
I mean the barrier of entry is kind of high if you're used to more traditional package managers.
Source: I tried using nix on my Debian machine
Damn you're kinda right
Steven Crowder is a despicable human and does not deserve a meme template.
I thought we were using the Calvin and Hobbes image now.
He is a despicable human. Point and laugh at the moron. Make an example out of him instead of trying to sanitize the internet.
Not a criticism of you, but a little fun fact about him for others: he has a bunch of friends who "aren't" Nazis but who call themselves, or have friends who call themselves, stuff like "race realists".
At least the Arch people are not shilling for some corp.
I'm tired of people taking sides like companies give a shit about us. I wouldn't be surprised to see five comments saying something like "you shouldn't buy Nvidia AMD is open source" or "you should sell your card and get an amd card."
I'd say whatever you have is fine; it's better for the environment if you keep it longer anyway. There are so many people who parrot things without giving much thought to an individual's situation or the complexity of a company's behavior. Every company's job is to maximize profit while minimizing loss.
Basically, if everyone blindly chose AMD over Nvidia, the roles would flip: AMD would start doing the things Nvidia does to maintain dominance, increase profit, and reduce cost, while Nvidia would start trying to win market share back from AMD by opening up and becoming more consumer-friendly and competitively priced.
For individuals, selling your old card and buying a new AMD card for the same price will generally net you a slower card, or, if you go used, there's a good chance it doesn't work properly and the seller ghosts you. I should know: I tried a used AMD card and it died every time I ran a GPU-intensive game.
I also went the other way, replacing my mother's Nvidia card with a new AMD card that cost three times what her Nvidia card ($50) would fetch on eBay, and it runs a bit slower than her Nvidia card did. She was happy about the upgrade anyway, because I moved that Nvidia card into her movie server, where it does better live video transcoding than a cheap AMD card would.
Who is saying to sell your card so you can buy AMD?
You'd be surprised; it happens quite a bit, especially when you're trying to help an Nvidia user with technical issues.
The problems I see people have are kinda trivial and are usually fixed by installing a package or changing a kernel parameter. Stuff where you spend a few minutes researching for a half-second fix. It's like saying, "Apple Pages unintuitive to use? Throw out your MacBook and get a PC!"
I don't like MacBooks but I'm not going to tell them to replace it.
A collection of folks ranging from moralizing open source fans to Wayland aficionados to AMD fanboys. They also like to blame any Linux problem the user might be having on Nvidia use, even when the user hasn't actually mentioned Nvidia. Daughter got chlamydia? Shouldn't have gone with Nvidia.
I run Stable Diffusion with ROCm. Who needs CUDA?
What distro are you using? Been looking for an excuse to strain my 6900XT.
I started looking at getting it running on Void and it seemed like (at the time) there were a lot of specific version dependencies that made it awkward.
I suspect the right answer is to spin up a container, but I resent Docker's licensing BS too much for that. Surely by now there'd be a purpose built live image- write it to a flash drive, reboot, and boom, anime ~~vampire princes~~ hot girls
If you don’t like docker, take a look at containerd and podman. I haven’t done any CUDA with podman, but it’s supposed to work.
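I haven't tried it either, but GPU passthrough with podman is supposed to go through NVIDIA's CDI support rather than docker's `--gpus` flag. A rough sketch, with the caveat that the image tag and paths are my own assumptions and it needs nvidia-container-toolkit installed:

```shell
#!/bin/sh
# Sketch only (untested by me): CUDA under podman via a CDI spec.
# One-time setup, as root, to describe your GPUs to the container runtime:
#   nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml
# Afterwards podman exposes GPUs with --device instead of docker's --gpus:
CMD="podman run --rm --device nvidia.com/gpu=all docker.io/nvidia/cuda:12.3.2-base-ubuntu22.04 nvidia-smi"
# Printed here rather than executed; run it by hand once the CDI spec exists.
echo "$CMD"
```

If `nvidia-smi` prints your card from inside the container, the same `--device nvidia.com/gpu=all` flag should work for a Stable Diffusion image too.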
I use stable diffusion on rocm in an ubuntu distrobox container. Super easy to set up and there's a good guide in the opensuse forum for it.
I can confirm that it works just fine for me. In my case I'm on Arch Linux btw and a 7900XTX, but it needed a few tweaks:
- Having xformers installed at all would sometimes break startup of stable-diffusion depending on the fork
- I have an internal and an external GPU, so I had to set HIP_VISIBLE_DEVICES so that it only sees the correct one
- I had to update torch/torchvision and set HSA_OVERRIDE_GFX_VERSION
I threw what I did into https://github.com/icedream/sd-multiverse/blob/main/scripts/setup-venv.sh#L381-L386 to test several forks.
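The GPU-selection and override tweaks above boil down to two environment variables. A minimal launch-wrapper sketch; the device index and GFX version are what an RDNA3 card like the 7900 XTX would want, so adjust both for your hardware:

```shell
#!/bin/sh
# Minimal sketch for launching Stable Diffusion on ROCm with multiple GPUs.
# Values below assume an RDNA3 card (7900 XTX reports gfx1100); check yours.

# Only expose the discrete GPU to ROCm (index 0 here; `rocminfo` lists devices).
export HIP_VISIBLE_DEVICES=0

# Override the GFX version ROCm targets so it picks kernels for gfx1100.
export HSA_OVERRIDE_GFX_VERSION=11.0.0

echo "HIP_VISIBLE_DEVICES=$HIP_VISIBLE_DEVICES HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
# then launch your fork, e.g. ./webui.sh
```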
CUDA?! I barely even know'a!
Then show us your anime titty pics!
Earlier in my career, I compiled tensorflow with CUDA/cuDNN (NVIDIA) in one container and then in another machine and container compiled with ROCm (AMD) for cancerous tissue detection in computer vision tasks. GPU acceleration in training the model was significantly more performant with NVIDIA libraries.
It's not like you can't train deep neural networks without NVIDIA, but their deep learning libraries combined with tensor cores in Turing-era GPUs and later make things much faster.
AMD is catching up now. There are still performance differences, but they are probably not as big in the latest generation.
Brother of the "I need nVidia for raytracing" guy who only plays last-decade games.
I completely unironically know people who bought a 4090 exclusively to play League
Not gonna lie, raytracing is cooler on older games than on newer ones. Newer games use a lot of smoke and mirrors to simulate raytracing, which means real raytracing isn't as obvious an upgrade, or can even be a downgrade depending on the scene. Older games, however, don't have as much smoke and mirrors, so raytracing can offer more of an improvement.
Also, stylized games with raytracing are 10/10. Idk why, but applying rtx to highly stylized games always looks way cooler than on games with realistic graphics.
Quake 2 does look pretty rad in RTX mode
Rocm is the AMD version
Last I heard, AMD was working on getting CUDA working on their GPUs, and I saw a post saying it was pretty complete by now (although I don't really keep up with that sort of stuff).
Well, right after that, Nvidia amended their license agreement to state that you cannot use CUDA with any translation layers.
The project you're thinking of is ZLUDA.
NVIDIA finally being the whole bitch it seems, not unexpected when it comes to tech monopolies.
In the words of our lord and savior Linus Torvalds "NVIDIA, fuck you! 🖕", amen.
In all reality, a lot of individuals aren't gonna care about EULA BS unless they absolutely depend on it, and this whole move has me wanting an AMD GPU even more.
If anything, AMD (for ML) is the hardware "I use [x] btw" (as in, I go through unnecessary pain for purism or to feed my own superiority complex).
I need NVDA for the gainz
Edit: btw Raspberry PI is doing an IPO later this year, bullish on AMD
My only regret for picking team red is that DaVinci Resolve doesn’t support hardware encoding.
Man I just built a new rig last November and went with nvidia specifically to run some niche scientific computing software that only targets CUDA. It took a bit of effort to get it to play nice, but it at least runs pretty well. Unfortunately, now I'm trying to update to KDE6 and play games and boy howdy are there graphics glitches. I really wish HPC academics would ditch CUDA for GPU acceleration, and maybe ifort + mkl while they're at it.
linuxmemes
I use Arch btw