It was rather obvious Nvidia was fudging performance results via framegen. Once those numbers are verified independently, we'll probably land on a typical generation-to-generation improvement. I wouldn't be surprised if the neural shaders demo was made by an actual demoscene coder, because it looked like something you could pull off in a 64k demo competition, not like something a developer could do at scale without a horde of artists.
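Rough sketch of what I mean about the counter being misleading (the numbers and helper functions here are my own illustrative assumptions, nothing Nvidia published): with multi-frame generation the displayed frame rate scales with the multiplier, but input latency stays pinned to the frames the GPU actually renders, since generated frames don't sample new input.

```python
# Toy model (my own assumptions, not Nvidia's numbers): each rendered frame
# yields `gen_factor` displayed frames, but only rendered frames sample input.

def displayed_fps(rendered_fps: float, gen_factor: int) -> float:
    """FPS an overlay would report with an N-x frame-generation multiplier."""
    return rendered_fps * gen_factor

def approx_input_latency_ms(rendered_fps: float, queued_frames: float = 1.0) -> float:
    """Very rough input-to-photon latency: one rendered frame time,
    plus frames held back for pacing/interpolation (assumed, not measured)."""
    return (1000.0 / rendered_fps) * (1.0 + queued_frames)

if __name__ == "__main__":
    rendered = 30.0  # what the GPU actually renders per second (made-up base rate)
    for factor in (1, 2, 4):
        print(f"{factor}x framegen: counter shows ~{displayed_fps(rendered, factor):.0f} FPS, "
              f"latency still ~{approx_input_latency_ms(rendered):.0f} ms")
```

So a "4x" number on a slide says very little about how the card actually feels to play on.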
I hope things like input lag, frame persistence and frame-time consistency become the metrics we use to judge GPUs as soon as possible, because the current ones are no longer representative of hardware performance.
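Metrics like that are cheap to compute from a frame-time capture. A minimal sketch of what I have in mind, using a synthetic trace and my own choice of stats rather than any particular benchmarking tool:

```python
# Minimal sketch: consistency metrics from a per-frame time trace in milliseconds.
# The trace below is synthetic (mostly 10 ms frames with occasional 40 ms stutters).
import statistics

def frame_metrics(frame_times_ms: list[float]) -> dict[str, float]:
    """Average FPS, 1% low FPS, and frame-time consistency for a trace."""
    times = sorted(frame_times_ms)
    worst = times[-max(1, len(times) // 100):]  # slowest 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.mean(times),
        "1%_low_fps": 1000.0 / statistics.mean(worst),
        "frame_time_stdev_ms": statistics.stdev(times),
        "worst_frame_time_ms": times[-1],
    }

if __name__ == "__main__":
    trace = [10.0] * 990 + [40.0] * 10
    for name, value in frame_metrics(trace).items():
        print(f"{name}: {value:.1f}")
```

A card that averages ~97 FPS on that trace but drops to ~25 FPS in its worst 1% of frames feels far worse than the headline number suggests, which is exactly what average-FPS marketing hides.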