
I was trying out FSR4 on my RX 6800 XT on Fedora 42. It works really well and easily beats FSR3 in visuals, even on Performance. It does have a significant performance hit versus FSR3, but it still works out a bit faster than native rendering on Quality.

[-] kattfisk@lemmy.dbzer0.com 2 points 1 day ago

Nice! A big improvement indeed.

I wish you had shown them with similar sharpness settings though. The FSR 3 image is very oversharpened, while the FSR 4 one has the opposite problem, so you can't really compare any details.

[-] WereCat@lemmy.world 2 points 1 day ago

I only noticed that the sharpness was not the same after I'd uploaded the video. Sorry.

[-] Hond@piefed.social 5 points 3 days ago

Neat!

I didn't know RDNA2 would work with FSR4 too.

[-] WereCat@lemmy.world 3 points 3 days ago

With the INT8 model this should work on older cards as well as on NVIDIA and Intel.

[-] ElectroLisa 2 points 3 days ago

How were you able to get FSR4 on RDNA2? Is it a mod for Cyberpunk or a custom Proton version?

[-] WereCat@lemmy.world 10 points 3 days ago* (last edited 3 days ago)

there is a modified .dll you can use to replace the one in the game folder… AMD leaked it accidentally when they were releasing some open-source stuff

I can send you a link tomorrow or upload it; I'm not at my PC right now

edit:

here is the link: https://gofile.io/d/fiyGuj

you need to rename it to amd_fidelityfx_dx12.dll and replace the one in the game folder, and it should work (in Cyberpunk). I had to use OptiScaler for Hogwarts Legacy, as just replacing the .dll made the game crash on launch and it was necessary to spoof it as DLSS

[-] juipeltje@lemmy.world 3 points 3 days ago

That's awesome. Not a fan of using upscaling tech generally, but since they keep trying to improve it, I might give this a try on my 6950 XT out of curiosity.

[-] ElectroLisa 1 points 3 days ago

I'll try this out tomorrow, thanks for the DLL.

Have you tried any games with no official FSR4 support, like Grounded?

[-] WereCat@lemmy.world 3 points 3 days ago* (last edited 3 days ago)

Baldur's Gate 3 AFAIK does not officially support FSR4, and this works with it via OptiScaler (I've tried on the Steam Deck). I wanted to try on PC as well, but the game has updated to the officially Linux-supported version and this does not work with it because it's Vulkan-only now. My internet is slow, so I can't be bothered to redownload almost 100GB just to downgrade the game version. I'll probably have to check what's in my library.

[-] SitD@lemy.lol 1 points 3 days ago* (last edited 3 days ago)

You used the INT8-quantized leaked version, right? I thought the FP8 version doesn't run on RDNA2.

Also, I wondered whether FSR4 feels like it has more input lag. Can you tell?

[-] DarkAri 1 points 3 days ago

If it drops the real frame rate more than FSR2, which it does, then yes, you will have more input lag.

[-] WereCat@lemmy.world 2 points 3 days ago* (last edited 3 days ago)

Yes, it's the INT8, not FP8 version.

Why would FSR have anything to do with input lag? The only reason input lag would increase is that FSR4 is more demanding to run on RDNA2, which means lower FPS, and FPS is directly tied to input lag.

But we are talking about 120 FPS vs 150 FPS here when comparing Quality presets, so I doubt you could even tell. And even if you can, just lower the preset; it will still look better and get you to the same performance.

From the games I've tested so far, my conclusion is that I am almost always CPU limited, even with a 5800X3D (in CP2077, Hogwarts Legacy, Kingdom Come Deliverance 2). Most areas are CPU heavy due to a lot of NPCs, and the FPS drops enough in those areas that my GPU is bored. The only benefit of FSR there is that FSR4 looks better, but it won't yield any performance gains.

[-] victorz@lemmy.world 3 points 3 days ago

Input lag is caused by frame interpolation, right? Or nah?

[-] DarkAri 4 points 3 days ago* (last edited 3 days ago)

It's because game logic is calculated on real frames, and these things lower the real frame rate even though they give you more rendered frames. If you were getting 40 real FPS and then drop to 30 real FPS, you will feel a significant amount of lag even if you are getting 60 FPS in fake frames. Basically the game loop is running slower, and stuff like input polling is happening slower, even if you have a higher displayed frame rate.

[-] kattfisk@lemmy.dbzer0.com 2 points 2 days ago

Framegen is worse the lower your base frame rate is.

The penalty to the speed at which the game runs is much more significant: if you normally run at 40 fps and framegen gives you 60 (30 real), then you have introduced about 8 ms of latency just from that, while the same 25% performance cost going from 180 fps to 270 (135 real) adds just 2 ms.

The lower your real frame rate is, the harder it is to interpolate between frames, because the changes between frames are much larger, so it will look worse. And the lower your frame rate, the longer any mishaps remain on screen, making them more apparent.
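
A minimal sketch of that frame-time arithmetic, assuming the added latency comes purely from the drop in real frame rate:

```python
# Rough frame-time arithmetic behind the numbers above (illustrative only).
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

def added_latency_ms(native_fps: float, real_fps_with_fg: float) -> float:
    """Extra frame-to-frame latency from the drop in *real* frame rate."""
    return frame_time_ms(real_fps_with_fg) - frame_time_ms(native_fps)

print(round(added_latency_ms(40, 30), 1))    # 8.3  -> 40 fps native vs 30 real with FG
print(round(added_latency_ms(180, 135), 1))  # 1.9  -> 180 fps native vs 135 real with FG
```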

[-] DarkAri 1 points 1 day ago

True, I guess. I was trying to explain why FSR often causes input lag. If you are getting 120 real frames then yeah, it probably won't matter much, but if you are getting less than 60 real frames it's going to be a worse experience, unless you are playing menu games or something.

[-] kattfisk@lemmy.dbzer0.com 2 points 1 day ago

Yeah I just wanted to illustrate that with some numbers :)

It's a bit counter-intuitive that frame generation is worse the lower your base frame rate is. And Nvidia in particular has no interest in making it clear that this tool is only really good for making a well-running game run even better, and is not going to give your 5070 "4090 performance" in any meaningful way.

[-] victorz@lemmy.world 1 points 2 days ago

Frame generation shouldn't be a bottleneck on the CPU though, should it? That stuff is happening on the GPU. I know I saw a video about this stuff but I can't remember the real reason input lag increases with frame generation/interpolation.

[-] kattfisk@lemmy.dbzer0.com 1 points 2 days ago

Most games aren't bottlenecked by your CPU at all. It spends a lot of time waiting for the GPU to be done drawing you a picture.

"Why isn't the game doing other stuff meanwhile?" you might ask, and part of the answer is surely, "Why do stuff faster than the player can see?", while another part is likely a need to syncronize the simulation and the rendering so it doesn't show you some half-finished state, and a third part might be that it would be very confusing for the player to decouple the game state from what they see on screen, like you see yourself aiming at the monster, but actually it moved in between frames so your shot will miss even if the crosshair is dead on.

[-] victorz@lemmy.world 1 points 1 day ago

I didn't mean the game loop being bottlenecked, I said frame generation. But yeah, all that is also true, I know. 👍

[-] kattfisk@lemmy.dbzer0.com 2 points 1 day ago

I was trying to explain why the game loop would be held back by the rendering speed, even though they run on different hardware.

If you are bottlenecked by the GPU that means the game loop spends some of its time waiting for the GPU. If you then turn on frame generation, you devote parts of the GPU to doing that, which makes regular rendering slower, making the game loop spend even more time waiting. This will increase input latency.

Frame generation also needs to delay output of any real frame while it creates and inserts a generated frame. This will add some output latency as well.

In the opposite scenario, where you are bottlenecked by the CPU, enabling frame generation should in theory not impact the game loop at all. In that case it's the GPU that's waiting for the CPU, and it can use some of those extra resources it has to do frame generation with no impact on input latency.

[-] DarkAri 1 points 2 days ago

Maybe it's not the CPU, but with FSR frame generation the real frame rate drops either way, which is why you get input lag. The game logic/game loop is only calculated per real frame, which means if you take a 20% drop in real frame rate, you are going to get a corresponding increase in input lag.

[-] victorz@lemmy.world 2 points 1 day ago

Isn't FSR supposed to give better performance/real fps?

[-] DarkAri 1 points 1 day ago* (last edited 1 day ago)

If you get higher real FPS via upscaling and a lower render resolution, it can improve input latency and sim speed.

[-] WereCat@lemmy.world 1 points 1 day ago

It is (if we're talking about FSR as upscaler tech). But it won't help in CPU-bound scenarios where the GPU already has to wait for the CPU.

[-] victorz@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

And in that case, frame generation will cause even more input lag, I guess. I'm thinking frame generation is best, input-lag wise, for systems where the GPU is the bottleneck? Where it gets served frames ASAP and can interpolate as soon as the latest and second-latest frames are available. Which it does anyway, but then it has to wait less.

[-] WereCat@lemmy.world 1 points 2 days ago

It's not. The whole point of FG was to take advantage of high-refresh-rate monitors, as most games can't render 500 FPS even on the fastest CPU… alas, here we are with games requiring FG to get you to 60 FPS on most systems (*looks at Borderlands 4 and Monster Hunter Wilds*).

[-] victorz@lemmy.world 1 points 2 days ago

Right, but FG shouldn't be touching the CPU in any way, should it? It should be a local thing on the GPU transparent to the CPU, unless I'm misunderstanding how it works.

[-] WereCat@lemmy.world 1 points 2 days ago
[-] victorz@lemmy.world 1 points 2 days ago

Then I don't understand how it would affect the game loop negatively. I'll look into it though, will do some research.

[-] WereCat@lemmy.world 2 points 2 days ago

Because the "delayed" or real input does not correspond to the image you see on the screen. That's why FG is most useful when you already have high base framerate as the input gets significantly lower and the discrepancy between the felt input and perceived image narrows.

Example:

30 FPS is 33.3 ms of frame-to-frame latency (+ something extra from mouse to displayed image for input).

With 2x FG you get at most 60 FPS, assuming there's no performance penalty for FG. So you see 16.6 ms frame to frame (+ mouse to display), but input remains at 33.3 ms (+ mouse to display).

Same going from a base 60 FPS (16.6 ms) to FG 120 FPS: 8.3 ms perceived, but input stays at 16.6 ms+.

Same from a 120 FPS (8.3 ms) base to FG 240 FPS: 4.15 ms perceived...

As you can see, the gap between the input latency (base FPS) and the perceived frame time (FG FPS) gets smaller and smaller as you increase the base framerate.

This is, however, a perfect scenario that does not represent real-world cases. Usually your base FPS fluctuates due to CPU- and GPU-intensive scenes, and during those fluctuations you will get big input-delay spikes that can be felt a lot, as they suddenly widen the gap between the perceived image and the real input... Couple that with the fact that FG almost always has a performance penalty, since it puts more strain on the GPU, so your base framerate will be a bit lower and therefore your input latency automatically higher.
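
For anyone who wants to plug in their own numbers, a quick sketch of the same idealized calculation (2x FG with no performance penalty assumed):

```python
# Perceived vs. felt frame time with 2x frame generation (idealized: FG is free).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    perceived = frame_time_ms(base_fps * 2)  # what you see with 2x FG
    felt = frame_time_ms(base_fps)           # what your input actually runs at
    print(f"base {base_fps} FPS: see {perceived:.1f} ms, feel {felt:.1f} ms, "
          f"gap {felt - perceived:.1f} ms")

# base 30 FPS: see 16.7 ms, feel 33.3 ms, gap 16.7 ms
# base 60 FPS: see 8.3 ms, feel 16.7 ms, gap 8.3 ms
# base 120 FPS: see 4.2 ms, feel 8.3 ms, gap 4.2 ms
```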

[-] WereCat@lemmy.world 1 points 3 days ago

It's kinda the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than the framerate without interpolation, which will cause an increase in input lag while providing a smoother image.

[-] victorz@lemmy.world 2 points 3 days ago

It seems that the input lag is more perceived, rather than actually experienced, from what I understand. Like if you go from 30 to 120 fps, you expect the input lag to decrease, but since it stays the same (or slightly worse), you perceive it to be much more severe.

[-] DarkAri 2 points 3 days ago

The frame rate isn't going from 30 to 120 FPS. It's actually going from 30 to something like 20. The rendered frames are different from the CPU frames, which handle the game loop (physics, input, simulation, etc.).

[-] victorz@lemmy.world 1 points 2 days ago

Not sure we have the same definition of frames here.

[-] DarkAri 2 points 2 days ago

Generated frames are created using a neural network; they have nothing to do with the actual game scripts, game loop, input polling, and so on. FSR does generate frames to interpolate between real frames, but things like physics and input are not being generated as well. It's only visual. I guess you have to have some basic knowledge of how a computer program and game engine work to understand this.

Basically, the CPU steps through the simulation in discrete steps. When you use frame gen, if it lowers the actual frame rate, then the CPU is making fewer loops per second over everything: the physics updates, input polling (capturing key presses and mouse events), and other stuff like that.
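
A toy sketch of that idea (not any real engine, just to show input sampling tied to real frames):

```python
# Toy game loop: input is only sampled once per *real* frame, so fewer real
# frames per second means fewer chances per second to react to the player.
def run_game_loop(real_fps: float, seconds: float = 1.0) -> int:
    frame_time = 1.0 / real_fps
    input_samples = 0
    elapsed = 0.0
    while elapsed < seconds:
        # poll_input(); update_physics(frame_time); submit_frame_to_gpu()
        input_samples += 1     # one input sample per real frame
        elapsed += frame_time  # generated frames never run this loop
    return input_samples

print(run_game_loop(40))  # ~40 input samples per second
print(run_game_loop(30))  # ~30 -- FG may display 60 fps, but input still runs at 30
```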

[-] victorz@lemmy.world 2 points 1 day ago

Oh yeah, now I remember why there's more input lag with frame interpolation turned on. Taking a shot right now and now it pops into my head.

Anyway, it's because while frame interpolation adds more frames per second, the "I-frames" (or real frames) you're seeing lag behind by one I-frame. This is because it can't start showing you interpolated frames until it has two frames to interpolate between.

So you won't start seeing I-frame N-1 until I-frame N (the latest I-frame) has been generated, thus creating extra input lag.

Someone correct me if I'm wrong, I'm supposed to be asleep...

[-] DarkAri 1 points 1 day ago

It's more that the actual FPS is lower when using FSR in many cases. The GPU frame rate doesn't matter in terms of input lag and such; it's all about how many times the CPU can loop through the game logic per second.

So basically, when you move 10 steps forward in a game, the CPU is running tons of code that takes the time elapsed since the previous frame and works out where the player should be this frame. This is delta time (the change in time between this frame and the last); it's multiplied into everything that moves to give fluid movement with a variable frame rate. This is why older games would slow down if the frame rate dropped, while new games still calculate the passage of time correctly even if you only have 15 FPS.
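
To make the delta-time point concrete, a minimal sketch (illustrative only, not from any particular engine):

```python
# Delta-time movement: position advances by speed * dt each real frame, so the
# distance covered stays correct at any frame rate; only the update granularity
# (and input sampling) gets coarser when the real FPS drops.
def simulate_walk(real_fps: float, speed: float = 10.0, seconds: float = 1.0) -> float:
    dt = 1.0 / real_fps  # delta time: seconds between real frames
    position = 0.0
    for _ in range(int(round(seconds * real_fps))):
        position += speed * dt  # same total distance regardless of FPS
    return position

print(simulate_walk(120))  # ~10.0 units covered in one second
print(simulate_walk(15))   # ~10.0 units too, just updated in coarser steps
```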

The fake frames have nothing to do with the game engine or logic; they are generated frames created with a neural network to fill in between real frames. This does give you something very close to extra frames on the GPU, but there is often a performance hit on the real frames, since it's a heavy process. The CPU has to stay synced to the GPU's real frames because some logic is CPU-bound: physics, creating certain buffers, all kinds of stuff. If the real frame rate of the GPU is lower, it holds back the CPU, since the CPU is also involved, to a smaller degree, in rendering real frames (preparing data, sending it to the GPU, and certain rendering-related operations that are faster on the CPU, maybe using MMX or other CPU extensions).

So basically, the fewer real frames you have, the longer the wait between the moments when your game engine can detect mouse and keyboard events and update the game world, even if you are getting 2-3 times the frame rate with generated frames.

[-] WereCat@lemmy.world 1 points 3 days ago

yes, that’s why FPS in this case is not a good measure of performance

[-] victorz@lemmy.world 2 points 3 days ago

Very much so. The very reason we want more fps is to have less input lag; that's my personal take anyway. That's the only reason I have a beefy computer: so the game can respond quicker (and give me feedback quicker as well).
