
A user on Reddit (ed: u/Yoraxx) posted on the Starfield subreddit about a problem that shows up when running the game on an AMD Radeon GPU. The issue is simple: the game won't render the star of any solar system when you are on the dayside of a moon or any other planetary body. It only occurs on AMD Radeon GPUs, with owners of both Radeon RX 7000 and RX 6000 series cards reporting the same thing.

The dayside of any planetary body or moon needs a light source to be lit up at all. That source is the system's star, and on any non-AMD GPU you can see the star/sun in the skybox illuminating the surface. On AMD cards, the star/sun simply isn't there, yet the planet/moon remains lit with no visible light source.
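(ed: a minimal sketch of how that can happen, assuming a typical renderer where surface lighting and the visible sun disc come from two separate passes. The types and function names below are hypothetical illustrations, not Creation Engine code.)

```cpp
#include <cstdio>

// Hypothetical illustration only. In many renderers the light that shades a
// planet's surface and the sun disc drawn into the skybox are produced by two
// independent code paths, so one can fail while the other keeps working.

struct Vec3 { float x, y, z; };

// Pass 1: surface shading uses a directional light. This is just a direction
// and an intensity; it never depends on the sun disc actually being drawn.
float shadeSurface(const Vec3& normal, const Vec3& lightDir, float intensity) {
    float ndotl = normal.x * lightDir.x + normal.y * lightDir.y + normal.z * lightDir.z;
    return ndotl > 0.0f ? ndotl * intensity : 0.0f;
}

// Pass 2: the sky pass draws the star as a bright disc/sprite. If this draw is
// skipped or its output is lost (for example because the engine relied on
// behaviour that differs between drivers), only the visible star disappears.
bool drawSunDisc(bool skyPassSucceeded) {
    return skyPassSucceeded;
}

int main() {
    Vec3 up{0.0f, 1.0f, 0.0f};       // surface facing straight up
    Vec3 sunDir{0.0f, 1.0f, 0.0f};   // star directly overhead
    float brightness = shadeSurface(up, sunDir, 1.0f);

    bool sunVisible = drawSunDisc(/*skyPassSucceeded=*/false);  // the reported failure

    std::printf("surface brightness: %.2f\n", brightness);               // 1.00
    std::printf("sun disc visible:   %s\n", sunVisible ? "yes" : "no");  // no
}
```

If the sky pass is the part that misbehaves on certain hardware, the directional light keeps doing its job, which matches the screenshots: a fully lit surface under an empty sky.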

Original Reddit post by u/Yoraxx

[-] BlemboTheThird@lemmy.ca 53 points 2 years ago

And here I was just reading that AMD GPUs showed much better performance in Starfield. Maybe it's because they're just not rendering stuff at all lmao

[-] DmMacniel@feddit.de 49 points 2 years ago

Waaaaait... it was a bug and not gross incompetence?

[-] e-ratic@kbin.social 28 points 2 years ago* (last edited 2 years ago)

"Bethesda's Bug", when you can't tell if something isn't working correctly or if it's just not implemented at all.

[-] geosoco@kbin.social 20 points 2 years ago

I don't think we know.

Makes me wonder if the dev team is on a much-needed vacation or if they only run Nvidia GPUs. lol

[-] Hildegarde@lemmy.world 12 points 2 years ago

The game runs better on AMD, and Bethesda partnered with AMD in some way for this PC release.

[-] violetraven 16 points 2 years ago

Does it run better by not rendering light emitting objects?

[-] Hildegarde@lemmy.world 8 points 2 years ago

That's one way to improve performance

[-] Frog-Brawler@kbin.social 3 points 2 years ago

Perhaps. Who needs stars anyway?

[-] booly@sh.itjust.works 3 points 2 years ago

All GPUs perform equally well at ray tracing when there are no rays to trace

[-] geosoco@kbin.social 9 points 2 years ago

That really just means AMD gave them a lot of money, and they just made sure FSR2 worked. lol

[-] Naz@sh.itjust.works 3 points 2 years ago* (last edited 2 years ago)

I've got a 7900XTX Ultra, and FSR2 does literally nothing, which is hilarious.

100% resolution scale, 128 FPS.

75% resolution scale .. 128 FPS.

50% resolution scale, looking like underwater potatoes ... 128 FPS.

I don't know how it's possible to make an engine this way; it seems CPU-bound. I'm lucky that I upgraded my CPU not too long ago, and I'm outperforming my friend who has an RTX 4090 in literally all scenes: indoor, ship, and outdoor/planet.

He struggles to break 70 FPS on 1080p Ultra, meanwhile I'm doing 4K Ultra.
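(ed: a back-of-the-envelope model of why the FSR2 slider can do nothing here, assuming frame time is roughly the slower of the CPU and GPU work per frame. The numbers are made up for illustration, not measurements from Starfield.)

```cpp
#include <algorithm>
#include <cstdio>
#include <initializer_list>

// If the CPU takes longer per frame than the GPU, shrinking the render
// resolution only shrinks the GPU's share, so the frame rate does not move.
int main() {
    const float cpuMs = 7.8f;          // ~128 FPS worth of CPU work per frame (assumed)
    const float gpuMsNative = 6.0f;    // GPU already finishes before the CPU (assumed)

    for (float scale : {1.00f, 0.75f, 0.50f}) {
        float gpuMs = gpuMsNative * scale * scale;   // GPU cost roughly tracks pixel count
        float frameMs = std::max(cpuMs, gpuMs);      // the slower side sets the frame time
        std::printf("scale %.0f%% -> %.0f FPS\n", scale * 100.0f, 1000.0f / frameMs);
    }
}
```

Every resolution scale prints the same 128 FPS because the CPU is the long pole, which lines up with the 100%/75%/50% numbers above.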

[-] redcalcium@lemmy.institute 4 points 2 years ago

Creation Engine has always been CPU-bound, going back to the Gamebryo era.

[-] Xperr7@kbin.social 2 points 2 years ago

I have noticed its anti-aliasing is better than the forced TAA (once I forced TAA off)

[-] geosoco@kbin.social 1 points 2 years ago* (last edited 2 years ago)

Some of the benchmarks definitely pointed out that it was CPU bound in many areas (eg. the cities).

I think the HUB one mentioned that some of the forested planets were much more GPU bound and better for testing.

I'm on a TV so capped at 60fps, but I do see a power usage difference between FSR 75% and FSR 100% that's pretty substantial on my 7900 XT.

[-] AnUnusualRelic@lemmy.world 3 points 2 years ago

#include "fsr2.h"

Ok, can we have the monies please?

[-] Hexarei@programming.dev 15 points 2 years ago

It can be both

[-] hoshikarakitaridia@sh.itjust.works 10 points 2 years ago

If it's down to very specific chipsets, that sounds like an unforeseeable bug.

[-] hoshikarakitaridia@sh.itjust.works 2 points 2 years ago* (last edited 2 years ago)

Correction: someone pointed out they are literally interfacing the graphics drivers the wrong way, so it's still on their devs.

[-] MooseLad@lemmy.world 5 points 2 years ago* (last edited 2 years ago)

I had no idea it was a problem on Radeon GPUs. I saw a few people complaining about not seeing the stars, but I didn't have a clue what they were talking about since it was always fine for my Nvidia card.

[-] darkeox@kbin.social 17 points 2 years ago

Can confirm it's the same on Proton / Linux. This game keeps being a joke on the technical side.

[-] ChrisLicht@lemm.ee 12 points 2 years ago

So fitting that this is posted in this Lemmy instance.

[-] Thebazilly@ttrpg.network 12 points 2 years ago

Now it is just Field.

[-] Pxtl@lemmy.ca 8 points 2 years ago

Ugh. A part of me wants to give AMD a chance for my next upgrade and push back against Nvidia's near-monopoly of GPUs but I really don't want to deal with how everything kinda-sorta works on Radeons.

[-] ruckblack@sh.itjust.works 34 points 2 years ago

I've exclusively been on AMD since like 2015 and my GPUs "kinda-sorta working" has not been my experience at all lol. Literally have never had brand-specific problems. The only brand-specific issues I've had were trying to get my laptop with an Nvidia GPU to work properly under Linux.

[-] Sharkwellington@lemmy.one 9 points 2 years ago* (last edited 2 years ago)

I have a suspicion that developers do less testing, optimization, and bugfixing for AMD cards due to reduced market share and that's why more of these brand-specific coding errors slip through for them. It's unfortunate but I can't deny I've seen some weird bugs in my time.

[-] darkeox@kbin.social 6 points 2 years ago

How can an AMD-sponsored game that literally runs better on every AMD GPU than on its NVIDIA counterpart, and doesn't include any tech that might disadvantage AMD GPUs, be less QA'd on AMD hardware because of market share?

This game IS better optimized on AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn't work for this game.

[-] Pxtl@lemmy.ca 4 points 2 years ago

Oh of course. I don't actually blame AMD for those kinds of bugs. But it's the reality as a user, at least in my experience... though it's been a stupidly long time since I've used a machine with an AMD card.

[-] Indicah@sh.itjust.works 3 points 2 years ago

Some games are built specifically for AMD from the ground up and have been optimized like crazy. Depends on the game and the devs mostly. And let's not forget that if devs want it to run well on PS5 and Xbox Series x/s, then they better have good AMD optimization.

[-] Squirrel@thelemmy.club 1 points 2 years ago

And, being Bethesda, it's not like bugs are unexpected.

[-] JJROKCZ@lemmy.world 9 points 2 years ago

I’ve been red only in my rig for over a decade and the only problems I’ve had are that I play the same games as everyone else perfectly fine and I have more money in my wallet from not spending as much on parts. That and the Bulldozer-generation CPUs heated my house like crazy, there’s no denying that lol

[-] XTornado@lemmy.ml 3 points 2 years ago* (last edited 2 years ago)

Ugh... the last part is still happening? Like, are the new CPUs also running that hot, or whatever you'd call it?

I am tempted to build a new all-AMD PC for cost alone, although AM4 probably won't last as long as AM3 did, sadly. But the summer is already terrible with my Intel... no need for more heat.

[-] JJROKCZ@lemmy.world 7 points 2 years ago

No, Bulldozer chips have been gone for like 6-7 years. The last two Ryzen generations have been far more energy/heat efficient than Intel. Ryzen is the better choice by far right now

[-] ninjan@lemmy.mildgrim.com 4 points 2 years ago

Current Intel runs hotter than current AMD on the CPU side, and Nvidia currently runs cooler than AMD on the GPU side. Also, we're on AM5 now. AM4 lived for a relatively long time, and there's no indication that AM5 won't be a long runner as well. Intel changes sockets more often, so for longevity AMD is almost always the best, except at the tail end of a socket.

[-] Resolved3874@lemdro.id 1 points 2 years ago

Huh. Didn't even know they replaced AM4 until this comment 😂 My AM4 Ryzen 5 paired with an RX 6700 XT still does everything I want it to do. And if it starts slacking I have plenty of upgrading left to do.

[-] Frog-Brawler@kbin.social 8 points 2 years ago

I’ve exclusively used AMD GPUs since building my first PC 27 years ago. I’m not aware of things “kinda-sorta” not working.

[-] vikingtons@lemmy.world 6 points 2 years ago

This issue also occurs on Intel Arc Alchemist and Nvidia Maxwell

[-] dudewitbow@lemmy.ml 6 points 2 years ago

You make it sound like nvidia has never pushed out a kinda sorta works driver.

[-] 8ender@lemmy.world 5 points 2 years ago

Funny I noticed this on New Atlantis and just chalked it up to the devs being lazy

this post was submitted on 10 Sep 2023
185 points (100.0% liked)
