Two years ago I could barely launch games on Wayland. Now I've been gaming on Wayland with no problems. So if anything, it has massively improved.
I’m on Arch, with Hyprland as my Window Manager. I use an RTX 3070.
For Wayland specifically, the driver was next to unusable for a while; I jumped ship from Windows in Sept. 2023. Beginning with driver 560 iirc, it got a lot better, and their engineers pushed a lot of changes across the Wayland ecosystem to implement explicit sync support (a net positive; before this, Nvidia was too stubborn to implement implicit sync, so bad screen tearing was unavoidable). There's also been a slow migration to using the GSP processor on newer cards. They claim it can improve performance, which may be true, but I also recently learned it helps them keep more parts of their code closed-source, which is likely why the open-source kernel modules require it.
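If you're curious what your own install is doing, here's a rough check I use. It's just a sketch: it assumes `nvidia-smi` is on your PATH and that your driver exposes the usual `/proc/driver/nvidia/version` file, and the exact strings printed vary by driver release.

```python
# Rough check: am I on the open kernel modules, and is GSP firmware active?
# Assumes /proc/driver/nvidia/version exists and nvidia-smi is installed;
# the strings matched below are what recent drivers print, not a stable API.
import subprocess

with open("/proc/driver/nvidia/version") as f:
    # This line contains "Open Kernel Module" on the open driver
    print(f.readline().strip())

info = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout
for line in info.splitlines():
    if "GSP Firmware Version" in line:
        # "N/A" here means the GSP is not being used for driver offload
        print(line.strip())
```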
At this point, though, it does feel very smooth and I can play games like The Finals at competitive framerates!
But relative to my performance under Windows, it's still worse, mainly in average framerate. Like others have said, DX12 games seem to be hit hardest. I sometimes have to run lower settings to compensate. Also, if my VRAM gets filled, Xwayland apps all break, so I have to be especially careful with high texture-quality settings (quick watcher sketch below).
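This is the kind of thing I mean by keeping an eye on it; a minimal sketch that just polls `nvidia-smi` (assuming it's on your PATH and you have a single GPU), and the 90% threshold is my own rule of thumb, nothing official:

```python
# Minimal VRAM watcher: warn before the "Xwayland apps break" point.
# Assumes nvidia-smi is installed and a single GPU (one output line);
# the 90% threshold is an arbitrary personal rule of thumb.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout
    used, total = (int(x) for x in out.strip().split(", "))
    if used / total > 0.9:
        print(f"VRAM nearly full: {used}/{total} MiB")
    time.sleep(5)
```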
Anyways, to answer your question: I think an average gamer doesn't notice the degraded performance without benchmarking or comparing framerates back to back. It still runs pretty smooth and framerates are still pretty high. If they aren't happy with it, they'll drop quality settings or resolution, just like they'd do under Windows.
The drivers run OK, but because they weren't built for my distribution with the right flag, whenever I suspend and resume my system I need to log out and back in to the desktop or else it bugs out.
Is this the driver's fault for not having that be a default flag? The maintainers' fault for not using the correct flags? Wayland's fault for not interfacing with the driver right on resume? My fault for having the audacity to want to use the sleep function?
I have no clue, but it doesn't happen with the open-source nouveau drivers, so I'm inclined to place a fair bit of the blame on Nvidia.
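For what it's worth, I believe the "right flag" here is the driver's `NVreg_PreserveVideoMemoryAllocations` module parameter (together with NVIDIA's `nvidia-suspend`/`nvidia-resume` systemd services); when it's 0, VRAM contents get dropped across suspend and the session comes back broken. A quick way to see what your install is actually running with, assuming your driver exposes the parameter dump at `/proc/driver/nvidia/params` like mine does:

```python
# Check whether the nvidia module is set to keep video memory across suspend.
# Assumes the driver exposes /proc/driver/nvidia/params (mine does);
# this is the module's parameter dump, not an official stable API.
with open("/proc/driver/nvidia/params") as f:
    for line in f:
        if "PreserveVideoMemoryAllocations" in line:
            # Expect ": 1" if suspend/resume is set up to keep the desktop alive
            print(line.strip())
```

If it prints 0, setting `options nvidia NVreg_PreserveVideoMemoryAllocations=1` in a modprobe.d file and enabling those services is the fix I've seen recommended.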
I have a 3080 and I've seen significant performance issues too (e.g. in Cyberpunk 2077 and KCD2). I think it depends a lot on the games you play. Apparently DX12 (via vkd3d) doesn't perform well on Nvidia cards.
My next GPU will probably not be an Nvidia card.
Well, seeing as it's all vibe-coded now, yep, that tracks.
Source please
Just take a look at their workflow and you tell me. NVIDIA uses an AI-powered code editor called Cursor. It is a fork of VS Code that integrates Large Language Models (LLMs) directly into the programming environment.
Here's a quote from this article:
A high-profile endorsement from NVIDIA’s CEO Jensen Huang in late 2025 underscored Cursor’s rapid ascent – calling Cursor his “favorite enterprise AI service” and noting that 100% of NVIDIA’s engineers now use AI assistance with a remarkable boost in productivity.
In late 2024 and early 2025, CEO Jensen Huang explicitly stated that "every single software engineer at NVIDIA uses Cursor."
And then there is this article:
https://fortune.com/2025/11/25/nvidia-jensen-huang-insane-to-not-use-ai-for-every-task-possible/
So it's kinda obvious they use a lot of AI in their code. Now, there is no direct proof they use it in their drivers, but they obviously do, given the CEO's stance.
No doubt NVIDIA is peddling AI, as they are financially dependent on it now.
Now, there's a distance between claiming something is powerful and widely used, and actually shipping code with it on something low-level and benchmarkable like (GPU) drivers; there I have doubts. I imagine they can say they use AI there to rephrase comments and it would be "technically correct", but beyond that I'm still skeptical.
Regarding chip design, AI has been used for decades ... if you consider routing to be AI. It's not generative in the modern sense, it's not using LLMs, but it has automated a process.
To me it's the typical Harvard Business School playbook: the C-suite repeats keywords they read in their peers' most popular magazines, aggregates them into a document they call a "strategy", and pushes it down the chain of command, where people "execute" it because they must, thanks to KPIs.
I'd love to hear it from an actual engineer working on drivers, but I imagine it'd be hard to get an honest opinion with NDAs and all.
Thanks for providing all the sources!
I think the biggest issue they had was with frame-rate consistency. Maybe the graphics drivers have matured more for the RTX 30 series than for later generations. Are you using the proprietary drivers, or the open-source ones?
I have always used what PopOS bundles. It used to be the proprietary driver, but since a certain driver version they have switched to the Nvidia-made FOSS driver, because Nvidia stopped developing any proprietary components in the driver and just made the FOSS driver instead, which became the "official" Nvidia driver in May 2025, I think.
Edit: Correction: only the kernel modules are FOSS, while userspace components such as CUDA are still proprietary.
The only thing I know for sure about why Nvidia kinda blows on Linux is that system RAM sharing doesn't really work the way it does on Windows or with AMD cards on Linux. This means that particularly VRAM-intense workloads will suffer. Unreal 5 games in particular suffer because of this.
I have the 16 GB version of the 5060 Ti and don't experience any issues with frame pacing and stuff like that on NixOS, while when I tried Bazzite I was experiencing some issues with the GPU.
It's most likely a Bazzite-specific issue, especially if you use something like the SteamOS interface.
For example, my monitor is a 160 Hz monitor that supports G-Sync, but if I enable global G-Sync in any Wayland compositor, the bottom part of my screen is cut off and I experience weird visual bugs. If I instead have the compositor enable G-Sync only for my Steam games when they are fullscreen, I don't experience that issue. The SteamOS session inside of Bazzite doesn't have that feature, so when it enables G-Sync I experience that same issue of a cut-off part of the screen with weird visual issues.
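In case it helps anyone with the same symptom: on Hyprland, at least, the fullscreen-only behaviour is a single knob, `misc:vrr` (0 = off, 1 = always, 2 = fullscreen only); other compositors have their own equivalents, so treat this as one example rather than a universal fix:

```python
# Example: restrict VRR/G-Sync to fullscreen windows on Hyprland.
# misc:vrr -> 0 = off, 1 = always on, 2 = fullscreen-only.
# Assumes hyprctl is on PATH and a Hyprland session is running.
import subprocess

subprocess.run(["hyprctl", "keyword", "misc:vrr", "2"], check=True)
```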
The problem for me is that it's so damn back and forth. One driver will be perfectly fine, and then the next will introduce some random new issue. Right now 580 has caused Elite Dangerous to crash with some out-of-memory error, and VR is completely broken. I'm really considering switching from Nobara because I want a distro with a built-in tool to easily downgrade Nvidia drivers.
I think the performance most likely isn't as similar on your system as you think it is.
No idea; I've been playing "normal" games and VR games with my NVIDIA card for years now on Debian and... it just works. It just keeps on working. Maybe some people are hardcore tinkerers who mess with specific options, but me, I just play and I'm happy with it.
I feel that "works" is a very broad term. It can work, but that doesn't mean it will perform as well as or better than Nvidia on Windows or AMD on Linux or Windows in a good number of games. Issues depend on the specific games one plays; one may be playing games that aren't prone to the issues reported.
Sure, but FWIW I play everything from AAA to indies and it "works" as in no bugs, no noticeable visual glitches.
I don't benchmark my driver version against the previous one, on Windows or Linux, or against price-equivalent AMD hardware; I just play. I don't think anybody gains much from checking performance benchmarks before playing a game; at least I can say for sure that for me that's not part of the fun.
I would notice if something was blatantly wrong, e.g. a 50% performance hit, but I wouldn't if it's a 5% hit. I don't really care, since it doesn't affect my gameplay. Like I said, this is from a casual player, not a pro player nor a game tinkerer.
Working "better" on Windows means nothing to me. Either I can play and I'm happy, or I can't (which has never happened), in which case I'd be disappointed and would potentially check why.
PS: I'm also a developer of XR content so I'm relatively confident I'd spot any significant problem.
It's a Bazzite issue. Many people report that the drivers are pretty reliable on other distros.
From experience, they've gotten pretty good over the last 2-3 years.
Does Bazzite make their own Linux drivers?
No, Bazzite does some weird stuff that does not interact well with the Linux Nvidia drivers.
In the past (2021-2022) I remember that Nvidia drivers were a real pain in the ass, and installing them always ended for me with a black screen. Right now everything works for me on Debian-based distributions and Arch Linux. Yesterday I was looking at new Linux distros; I wanted to try Fedora and openSUSE, but the drivers didn't work there at all. On Fedora the Nouveau driver was okay, but on openSUSE even the open-source driver wasn't working.
All this is ironic: Bazzite is often chosen for gaming, but like all these niche distros it has stability problems, which in this case hit exactly its core feature: gaming performance.
It stands yet again as a demonstration that gamers don't actually care about performance, just about claiming they have a top-spec PC; if that thing doesn't perform like it should but just gives a workable experience, it's apparently fine.
I just went to repurpose some old hardware for my nephew (4790K + 32 GB DDR3 + RTX 3050), which I thought would make a very passable Bazzite box. I put two drives in the test rig: one with Bazzite (Nvidia image) + KDE and one with Win11 running with the Rufus TPM-bypass hacks.
CS2 ran at ~40 fps in Bazzite, with no sound once you got in-game; Win11 ran at ~100 fps.
Helldivers 2 ran at ~50 fps in Bazzite with constant frame drops, even after letting it precompile shaders. On Windows it was a very playable 70 fps.
I mainline Linux myself and I wanted Bazzite to be the set-and-forget answer, but it really wasn't. I can't in good faith hand that build over to a 12-year-old with Bazzite on it, and that was super disappointing.
The problem is not Bazzite or Linux; the problem is Nvidia. If we're being honest, we should avoid Nvidia.
Like the other guy said, I think this is a Bazzite-induced problem. I have other Linux systems at home: my daily driver and my wife's daily driver are both highly custom Ubuntu Server derivatives, we both have Nvidia GPUs (3050, 5070), and neither of us has similar issues.
The reason I wanted to try bazzite was that I didn't want to remotely support something super custom.
Do you think you would have those issues that you are reporting if you were using AMD or Intel GPUs?
I suspect the difference in experiences is more due to X11/PulseAudio (my custom systems) vs Wayland/PipeWire (Bazzite) than to any particular GPU vendor or driver branch. Which I guess is a roundabout way of saying
Maybe? Probably?
Judging by the ProtonDB entry for CS2, I strongly suspect I would have at least the audio issue regardless of GPU.
I haven't noticed.
No, but then I have quite an old graphics card. Even for my card, the open source drivers are recommended.