Why does Nvidia hate linux?
(lemmy.ml)
For what it's worth, NVIDIA's failings on Linux tend to be mostly in the desktop experience. As a compute device driven by CUDA and not responsible for the display buffer, their cards work just fine. Enterprise users won't be running a GUI or desktop environment on the machines doing the AI work, if anything at all.
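To show what I mean by "compute device driven by CUDA" (just my own toy sketch, not anything official): the vector add below runs fine over SSH on a headless box with nothing but the NVIDIA driver and CUDA toolkit installed. No X, no Wayland, no display buffer ever touches the card.

```c
// Minimal headless CUDA sketch: if this compiles and runs over SSH,
// the GPU is working as a pure compute device with no display server.
// Build with: nvcc add.cu -o add
#include <cstdio>
#include <cuda_runtime.h>

__global__ void add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory keeps the example short; supported on Pascal
    // (GTX 10xx, like the cards mentioned in this thread) and newer.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    add<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f (expect 3.0)\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

None of this goes anywhere near the driver's desktop/display paths, which is exactly why the compute side has a much better reputation than the desktop side.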
Even the old GTX 1060 in my TrueNAS SCALE server has worked absolutely flawlessly with my Jellyfin server.
They don't give a fuck about consumers these days, and with Linux being just a tiny fraction of the userbase, they give even less of a fuck.
I had a bunch of issues with my GTX 1080 before I switched to an AMD RX 5700 XT. I love it, but I recently put the 1080 back into use as a headless game-streaming server for my brother. It's been working really well, handling both rendering and encoding at 1080p without issue, so I guess I've arrived at the same conclusion: they don't really care about desktop usage, but once you're not directly interacting with a display server on an NVIDIA GPU, it's fine.