NVIDIA has removed “Hot Spot” sensor data from GeForce RTX 50 GPUs
(videocardz.com)
Oh I'm sure the lower cards will run cool and fine for average die temps. The 5090 is very much a halo product with that ridiculous 600 W TBP. But as with any physical product, things do decay over time, or get assembled incorrectly, and that's exactly what hotspot temp reporting helps diagnose.
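A minimal sketch of the kind of check hotspot reporting enables: compare hotspot to average die temp and flag an unusually large gap. The readings and the 20 C threshold here are illustrative assumptions, not NVIDIA-specified values.

```python
def flag_thermal_issue(avg_die_temp_c: float, hotspot_temp_c: float,
                       max_delta_c: float = 20.0) -> bool:
    """Return True if the hotspot-to-average gap suggests a mounting,
    paste, or pad problem worth investigating."""
    return (hotspot_temp_c - avg_die_temp_c) > max_delta_c

# Example: a healthy card vs. one with a suspicious gap (made-up numbers).
print(flag_thermal_issue(70.0, 82.0))  # False: ~12 C delta is unremarkable
print(flag_thermal_issue(70.0, 95.0))  # True: ~25 C delta hints at a repaste/remount
```

Without the hotspot sensor exposed, that delta is exactly the signal you can no longer see.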
Isn't a GPU that pulls 600 watts in whackjob territory?
The engineers need to get the 6090 down to 400 watts. That would be a huge PR win that wouldn't need any marketing spin to sell.
It's not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I've seen reviews citing a ~25% increase in raw performance at the cost of ~20% more powah than the 4090.
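Rough perf-per-watt math from those quoted figures (~25% more performance for ~20% more power). The baseline numbers are normalized/illustrative, with the 4090's ~450 W TBP as the reference point, not measured data.

```python
baseline_perf, baseline_power = 1.00, 450.0  # 4090 as the reference, ~450 W TBP
new_perf = baseline_perf * 1.25              # ~25% faster
new_power = baseline_power * 1.20            # ~20% more power -> ~540 W

efficiency_gain = (new_perf / new_power) / (baseline_perf / baseline_power) - 1
print(f"Perf-per-watt change: {efficiency_gain:+.1%}")  # roughly +4%
```

So on those numbers the efficiency gain is only a few percent; almost all of the extra performance is bought with extra power.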