We are reaching the limits of rendering technology with our current architectures. Most established practices in computer hardware/software/firmware started as a "cheat" or a weird innovation that used something in an ass-backwards way. Reducing the amount of data a GPU needs to render is a good way to get more out of both old and new hardware. It's not perfected yet, but the future of these features is very promising.
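To make the "reducing data" point concrete, here's a toy sketch of the simplest form of the idea: shade fewer pixels at a lower internal resolution, then reconstruct the display image. This is just an illustration with a plain bilinear upscaler I wrote for the example; real upscalers like DLSS/FSR/XeSS add motion vectors, temporal history, and ML or heuristic reconstruction on top of this.

```cpp
// Toy sketch: render at a lower internal resolution, then upscale to the
// display resolution, so the GPU only shades a fraction of the output pixels.
#include <cstdio>
#include <vector>

// Bilinearly upscale a single-channel low-res buffer to a high-res buffer.
void upscale_bilinear(const std::vector<float>& src, int sw, int sh,
                      std::vector<float>& dst, int dw, int dh) {
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            float fx = (x + 0.5f) * sw / dw - 0.5f; if (fx < 0) fx = 0;
            float fy = (y + 0.5f) * sh / dh - 0.5f; if (fy < 0) fy = 0;
            int x0 = (int)fx, y0 = (int)fy;
            int x1 = x0 + 1 < sw ? x0 + 1 : x0;
            int y1 = y0 + 1 < sh ? y0 + 1 : y0;
            float tx = fx - x0, ty = fy - y0;
            float a = src[y0 * sw + x0] * (1 - tx) + src[y0 * sw + x1] * tx;
            float b = src[y1 * sw + x0] * (1 - tx) + src[y1 * sw + x1] * tx;
            dst[y * dw + x] = a * (1 - ty) + b * ty;
        }
    }
}

int main() {
    // "Render" at 1280x720 internally, present at 2560x1440:
    // only a quarter of the output pixels are actually shaded.
    int sw = 1280, sh = 720, dw = 2560, dh = 1440;
    std::vector<float> lowres(sw * sh, 0.5f), output(dw * dh);
    upscale_bilinear(lowres, sw, sh, output, dw, dh);
    std::printf("shaded %d px, displayed %d px (%.0f%% of the work)\n",
                sw * sh, dw * dh, 100.0 * sw * sh / (dw * dh));
}
```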
Good thing the rendering engineers are willing to try different approaches instead of being stuck on this "real pixel" shit some YouTuber started. Even freaking Pixar, the granddaddy of CG tech, is doing ML global illumination and temporal denoising. Some of our current-gen realtime graphics literally took hours to render 10 years ago; hardware isn't improving that fast, it's the new algorithms and rendering methods that make it possible.
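For a rough feel of what a temporal denoiser does at its core, here's a toy sketch of temporal accumulation: blend each new noisy frame into a history buffer so the noise averages out over time. The scene values and blend weight are made up for the example; production denoisers (including the ML ones) add reprojection with motion vectors, variance clamping, and learned filters, none of which is shown here.

```cpp
// Toy sketch of temporal accumulation: each frame is cheap and noisy,
// but an exponential moving average over frames converges toward the
// noise-free result.
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    const int pixels = 4;         // tiny "image" for demonstration
    const float alpha = 0.1f;     // blend weight for the newest frame
    std::vector<float> history(pixels, 0.0f);

    // Simulate 100 noisy frames of a scene whose true value is 0.5 per pixel.
    for (int frame = 0; frame < 100; ++frame) {
        for (int i = 0; i < pixels; ++i) {
            float noisy = 0.5f + ((std::rand() / (float)RAND_MAX) - 0.5f) * 0.4f;
            // history = lerp(history, noisy, alpha): noise averages out
            // while each individual frame stays cheap to render.
            history[i] = history[i] * (1.0f - alpha) + noisy * alpha;
        }
    }
    std::printf("accumulated value ~%.3f (true value 0.5)\n", history[0]);
}
```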