submitted 5 months ago by Mubelotix@jlai.lu to c/memes@lemmy.ml
[-] micka190@lemmy.world 13 points 5 months ago* (last edited 5 months ago)

Edit: Here's another comment I made with links and more information on why this is going to be more common going forward. There's a very real, technical reason for using these new rendering strategies, and it's why we'll start seeing more and more games require at least an RTX-series card.


You're misunderstanding the issue. As much as "RTX OFF, RTX ON" is a meme, the RTX series of cards genuinely introduced improvements to rendering techniques that were previously impossible to pull off with acceptable performance, and more and more games are making use of them.

Alan Wake 2 is a great example of this. The game runs like ass on a 1080 Ti even on low settings, because the 1080 Ti is physically incapable of performing the kind of rendering instructions they're using without a massive performance hit. Meanwhile, the RTX 2000-series cards are perfectly capable of doing it. Digital Foundry's Alan Wake 2 review goes a bit more in depth on it; it's worth a watch.

If you aren't going to play anything that came out after 2023, you're probably going to be fine with a 1080 Ti, because it was a great card, but we're definitely hitting the point where technology is moving to different rendering standards that it doesn't handle as well.

[-] frezik@midwest.social 5 points 5 months ago

So here's two links about Alan Wake 2.

First, on a 1080ti: https://youtu.be/IShSQQxjoNk?si=E2NRiIxz54VAHStn

And then on a ROG Ally (which I picked because it's a little more powerful than the current Steam Deck and runs native Windows): https://youtu.be/hMV4b605c2o?si=1ijy_RDUMKwXKQQH

The ROG Ally seems to be doing a little better, but not by much. They're both hitting sub-30 fps at 720p.

My point is that if that kind of handheld hardware becomes typical, combined with the economic problems of continuing to make highly detailed games, then Alan Wake 2 is going to be an aberration. The industry could easily pull back on that, and I welcome it. The push for higher and higher detail has not resulted in good games.

[-] SailorMoss@sh.itjust.works 3 points 5 months ago* (last edited 5 months ago)

I own a 1080 Ti, and there was recently a massive update to Alan Wake 2 that made it more playable on Pascal GPUs. Digital Foundry did a video on it: http://youtu.be/t-3PkRbeO8A

I don’t know of any current game that can’t run at least 1080p/30 fps on a 1080 Ti. But of course, my knowledge is not exhaustive.

I wouldn’t expect every “next-gen” game to get the same treatment as Alan Wake 2 going forward. But we’re 4 years into the generation, and there have probably been fewer than 10 games built to take full advantage of modern console hardware. My 1080 Ti has a few more good years in it.

[-] uis@lemm.ee 2 points 5 months ago

Can you reference those instructions more specifically?

[-] micka190@lemmy.world 7 points 5 months ago

"Instructions" is probably the wrong word here (I was mostly trying to dumb it down for people who aren't familiar with graphics rendering terminology).

Here's a link to the Digital Foundry video I was talking about (didn't realize they made like 5 videos for Alan Wake 2; it took a bit to find it).

The big thing, in Alan Wake 2's case, is that it uses Mesh Shaders. The video I linked above goes into it at around the 3:38 mark.

AMD has a pretty detailed article on how they work here.

This /r/GameDev post here has some devs explaining why it's useful in a more accessible manner.

The idea is that it allows offloading more work to the GPU in ways that are much better performance-wise. It just requires that the hardware actually support it, which is why you basically need an RTX card for Alan Wake 2 (or whichever AMD GPUs support mesh shaders; I'm not as familiar with their cards).
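
To make the "hardware has to support it" part concrete, here's a rough C++/Vulkan sketch (my own illustration, not anything from Remedy's engine) that checks whether a GPU's driver even advertises the cross-vendor mesh shader extension. It assumes a Vulkan SDK is installed; the extension string "VK_EXT_mesh_shader" is the cross-vendor one, and Nvidia also exposes an older NV-specific variant.

```cpp
// Sketch: list the system's GPUs and report whether each driver exposes
// the VK_EXT_mesh_shader device extension.
#include <vulkan/vulkan.h>
#include <cstring>
#include <iostream>
#include <vector>

int main() {
    // A bare-bones instance is enough just to enumerate devices.
    VkInstanceCreateInfo instanceInfo{};
    instanceInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;

    VkInstance instance;
    if (vkCreateInstance(&instanceInfo, nullptr, &instance) != VK_SUCCESS) {
        std::cerr << "No Vulkan driver available\n";
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasMeshShaders = false;
        for (const auto& e : exts) {
            if (std::strcmp(e.extensionName, "VK_EXT_mesh_shader") == 0) {
                hasMeshShaders = true;
                break;
            }
        }
        std::cout << props.deviceName << ": mesh shaders "
                  << (hasMeshShaders ? "supported" : "not supported") << "\n";
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

On a Turing-or-newer GeForce, an RDNA2-class Radeon, or an Arc GPU this should report "supported"; on a Pascal card like the 1080 Ti it won't, which is exactly the gap being described above.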

[-] uis@lemm.ee 1 points 5 months ago* (last edited 5 months ago)

Ah, mesh shaders. Cool stuff. AMD retroactively added them to their old GPUs in drivers. I think the same goes for Intel's post-Ivy Bridge GPUs (I think the send opcode can throw primitives into the 3D pipeline; if you're interested, you can go read the docs). I guess Nvidia can do something similar.

And even if they don't have such a straightforward way of implementing them, they could probably (my guess, I could be wrong) be emulated with geometry shaders.

What I don't like is the apparent removal of the vertex fetcher, but maybe there will be an extension that brings it back.
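
For anyone wondering what "removal of the vertex fetcher" looks like from the API side, here's a rough C++/Vulkan sketch of the two submission paths (my own illustration; it assumes headers that ship VK_EXT_mesh_shader and that the caller has already loaded the extension's entry point with vkGetDeviceProcAddr):

```cpp
#include <vulkan/vulkan.h>

// Classic path: the fixed-function vertex fetcher pulls attributes from the
// bound vertex/index buffers and feeds the vertex shader one vertex at a time.
void drawClassic(VkCommandBuffer cmd, VkBuffer vertexBuffer, VkBuffer indexBuffer,
                 uint32_t indexCount) {
    VkDeviceSize offset = 0;
    vkCmdBindVertexBuffers(cmd, 0, 1, &vertexBuffer, &offset);
    vkCmdBindIndexBuffer(cmd, indexBuffer, 0, VK_INDEX_TYPE_UINT32);
    vkCmdDrawIndexed(cmd, indexCount, 1, 0, 0, 0);
}

// Mesh path: no vertex or index buffers are bound at all. Each workgroup of
// the mesh shader reads whatever geometry it wants (e.g. meshlets in storage
// buffers bound through descriptors) and emits primitives directly, so whole
// meshlets can be culled before any per-vertex work happens.
void drawMeshlets(VkCommandBuffer cmd, PFN_vkCmdDrawMeshTasksEXT drawMeshTasks,
                  uint32_t meshletGroupCount) {
    drawMeshTasks(cmd, meshletGroupCount, 1, 1);
}
```

Nothing in the second path tells the GPU how to fetch vertices; the shader itself decides what to read, which is both the flexibility and the hardware requirement being discussed above.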

[-] micka190@lemmy.world 2 points 5 months ago

I could be wrong, but I'm pretty sure Nvidia has patched them into the GTX series; they're just really slow compared to RTX cards.

[-] Crashumbc@lemmy.world 1 points 5 months ago

I suspect that's the exception, and it will be for most games.
