[-] kingshrubb@lemmy.ml 17 points 3 days ago
[-] umbrella@lemmy.ml 11 points 3 days ago

amd exists, people.

please put your money where your mouth is?

[-] stevedice@sh.itjust.works 5 points 3 days ago

Not at the higher end, it doesn't.

[-] omarfw@lemmy.world 2 points 2 days ago

Very few people actually need or can make use of the power that nvidia's high end cards provide

[-] umbrella@lemmy.ml 3 points 3 days ago

dunno about this generation cause im not in the market for new parts, but amd usually comes within spitting distance at much lower prices.

it makes less and less sense to buy nvidia as time passes, yet people still do.

[-] stevedice@sh.itjust.works 1 points 1 day ago

Well, that's just not true. I've been using AMD exclusively for 10 years now and they haven't had a proper competitor to NVIDIA's high end since then. I'm willing to settle for less performance to avoid a shitty company but some people aren't. It can't make "less and less sense" to go with NVIDIA if they're the only ones making the product you want to buy. More importantly, though: it doesn't matter. The point is that people wanting performance above a 4080 are screwed because NVIDIA shat the bed with the 5000 series and AMD just doesn't exist at that price point, so the comment I'm replying to makes no sense.

[-] umbrella@lemmy.ml 1 points 1 day ago* (last edited 23 hours ago)

The crushing majority of people are not spending $3000 on a graphics card. Chasing the fastest for the sake of it is something most people are not doing.

All other price points are covered, and on average slightly better served, by AMD.

[-] stevedice@sh.itjust.works 1 points 20 hours ago* (last edited 20 hours ago)

Welp. That's another lie. Until very recently, if you wanted performance over NVIDIA's 60-tier on AMD, your only options were the Vega 56, 64 or the Radeon VII, which were all trash. It wasn't until the 6000 series that AMD was able to come close to NVIDIA's 80-tier and they've come out and said they're not doing that anymore, so we have a grand total of TWO high end Radeon cards. Your assertion that AMD has covered every price point below $3000 is pure fantasy.

[-] umbrella@lemmy.ml 1 points 19 hours ago

you took a specific period where, yes, AMD was struggling.

my assertion is correct because we are not at that time anymore.

[-] stevedice@sh.itjust.works 1 points 19 hours ago

Did you miss the part where I pointed out AMD said they were gonna stop trying with the high end? We were barely out of that slump for 2 generations and now we're right back in it.

[-] umbrella@lemmy.ml 1 points 18 hours ago

I explicitly said they don't have the most powerful GPU.

[-] stevedice@sh.itjust.works 1 points 14 hours ago

They don't have any powerful GPU. Stop it with the denial, brother.

[-] umbrella@lemmy.ml 1 points 13 hours ago* (last edited 13 hours ago)

wtf are you on about, they usually can beat all but the biggest one.

[-] stevedice@sh.itjust.works 1 points 10 hours ago

That's not true. Again, they have had 2 good cards in the last 10 years.

[-] umbrella@lemmy.ml 1 points 3 hours ago* (last edited 2 hours ago)

sure, overpay nvidia then, what can i say.

[-] stevedice@sh.itjust.works 1 points 2 hours ago

Literally anything would be better than plugging your ears and going "lalalala" which is what you're doing right now.

[-] umbrella@lemmy.ml 1 points 20 minutes ago
[-] endeavor@sopuli.xyz 3 points 3 days ago

Have you seen nvidia 5 series? AMD is accidentally higher end now.

[-] stevedice@sh.itjust.works 1 points 1 day ago

Yeah, they suck, but they at least exist, AMD flat out said they're not even gonna try anymore.

[-] circuitfarmer@lemmy.sdf.org 70 points 5 days ago

Vote with your wallets. DLSS and Ray Tracing aren't worth it to support this garbage.

[-] inclementimmigrant@lemmy.world 20 points 5 days ago* (last edited 4 days ago)

Wish more gamers would but that ship has long sailed unfortunately. I mean look at what the majority of gamers tolerate now.

[-] ZeDoTelhado@lemmy.world 20 points 4 days ago

For the people looking to upgrade: always check the used market in your area first. Right now the best move is quite obviously to try to pick up a 40 series card from the drones who absolutely must have the 50 series

[-] ogeist@lemmy.world 39 points 5 days ago

I've got the feeling that GPU development is plateauing: new flagships are consuming an immense amount of power and the sizes are humongous. I do give DLSS, local AI and similar technologies the benefit of the doubt, but they're just not there yet. GPUs should be getting more efficient and improving in other ways.

[-] PM_Your_Nudes_Please@lemmy.world 32 points 4 days ago

I’ve said for a while that AMD will eventually eclipse all of the competition, simply because their design methodology is so different compared to the others. Intel has historically relied on simply cramming more into the same space. But they’re reaching theoretical limits on how small their designs can be; They’re being limited by things like atom size and the speed of light across the distance of the chip. But AMD has historically used the same dies for as long as possible, and relied on improving their efficiency to get gains instead. They were historically a generation (or even two) behind Intel in terms of pure hardware power, but still managed to compete because they used the chips more efficiently. As AMD also begins to approach those theoretical limits, I think they’ll do a much better job of actually eking out more computing power.

And the same goes for GPUs. With Nvidia recently resorting to the “just make it bigger and give it more power” design philosophy, it likely means they’re also reaching theoretical limitations.

[-] Redredme@lemmy.world 13 points 4 days ago

AMD never used chips "more efficiently". They hit gold with the Ryzen design, but everything between the Athlon and Ryzen was horrible and more useful as a room heater. And before the Athlon it was even worse. The K6/K6-2 were funny little buggers that extended the life of Super Socket 7, but they lacked a lot of features, and don't get me started on their DX4/5 stuff, which frequently died in spectacular manners.

Ryzen works because of chiplets and the stacking of the cache. Add some very clever stuff in the pipeline which I don't presume to understand and the magic is complete. AMD is beating Intel at its own game: its ticks and tocks are way better and, most importantly, executable. And that is something Intel hasn't been able to really do for several years. It only now seems to be returning.

And let's not forget the USB problems with Ryzen 2/3 and the memory compatibility woes of Ryzen's past and, some say, present. Ryzen is good but it's not "clean".

In GPU design AMD clearly does the same but executes worse than Nvidia. The 9070 can't even match its own predecessor, the 7900 XTX is again a room heater and is anything but efficient. And let's not talk about what came before. The 6xxx series was good enough but troublesome for some, and the Radeon VII was a complete shitfest.

Now, with the 9070, AMD once again, for the umpteenth time, promises that the generation after will fix all its woes. That it can compete with Nvidia.

Trouble is, they've been saying that for over a decade.

Intel is the one looking at GPU design differently. The only question is: will they continue, or axe the division now that Gelsinger is gone? That would be monumentally stupid, but if we can count on one thing it's the horrible shortsightedness of corporate America. Especially when Wall Street is involved. And with Intel, Wall Street is heavily involved. Vultures are circling.

[-] endeavor@sopuli.xyz 3 points 3 days ago

And then you pick up a Steam Deck and play games that were originally meant to be played on cards the size of a Steam Deck.

[-] the_q@lemm.ee 7 points 3 days ago

Or, you know, buy an AMD card and quit giving your money to the objectively worse company.

[-] cyberpunk007@lemmy.ca 19 points 4 days ago

lol this reminds me of whatever that card was back in the 2000's or so, where you could literally make a trace with a pencil to upgrade the version lower to the version higher.

[-] christian@lemmy.ml 1 points 1 day ago

I barely remember this anymore but the downgrade had certain things deactivated. Something like my card had four "pipelines" and the high-end one had eight, so a minor hardware modification could reactivate them. It was risky though, because often imperfections came out of the manufacturing process, and then they would just deactivate the problem areas and turn it into a lower-end version.

After a little while, someone put out drivers that could simulate the modification without physically touching the card. You'd read about softmod and hardmod for the lower-end radeon cards.

I used the softmod and 90% of the time it worked perfectly, but there was definitely an issue where some textures in certain games would have weird artifacting in a checkerboard pattern. If I disabled the softmod the artifacting wouldn't happen.

[-] inclementimmigrant@lemmy.world 11 points 4 days ago

Yeah, those were the days when cost control was simply to use the same PCB but with some traces left out. There were also quite a few cards that used the exact same PCB, traces intact, where you could simply flash the next tier card's BIOS and get significant performance bumps.

Did a few of those mods myself back in the day, those were fun times.

[-] cyberpunk007@lemmy.ca 4 points 4 days ago

Ok now how do I turn my 2070s into a 5090? 😅

[-] CheeseNoodle@lemmy.world 4 points 3 days ago* (last edited 3 days ago)

Get 500 dollars then use AI to generate the other 3/4 of the money and buy a 5090.

[-] inclementimmigrant@lemmy.world 3 points 4 days ago* (last edited 4 days ago)

Well, if you ask Nvidia, it's now just the driver making frames for you.

[-] cyberpunk007@lemmy.ca 3 points 4 days ago

Ya, I'm sad to see AMD exit. Maybe in a year or two I'll get that Sapphire 7900 XTX or whatever it is.

[-] root@aussie.zone 4 points 3 days ago

I remember using the pencil trick on my old AMD Duron processor to unlock it. 🤣

[-] MyOpinion@lemm.ee 36 points 5 days ago

Nvidia is just straight up conning people.

[-] Majorllama@lemmy.world 21 points 4 days ago

Just like I rode my 1080ti for a long time, it looks like I'll be running my 3080 for a while lol.

I hope in a few years when I'm actually ready to upgrade that the GPU market isn't so dire... All signs are pointing to no, unfortunately.

[-] OminousOrange@lemmy.ca 10 points 4 days ago

I'm still riding my 1080ti...

[-] Cavemanfreak@lemm.ee 6 points 4 days ago

1060 6GB gang here... I will probably get a 3060 or 4060 next time I upgrade, unless I ditch Nvidia (thinking of moving to Linux).

[-] stevedice@sh.itjust.works 2 points 3 days ago

Just like I rode my 1080ti for a long time

You only skipped a generation. What are you talking about?

[-] Majorllama@lemmy.world 2 points 3 days ago

I had the 1080ti well after the 3080 release. I got a great deal and had recently switched to 1440p, so I pulled the trigger on a 3080 not long before the 4000 series cards dropped.

[-] Viri4thus@feddit.org 18 points 5 days ago

Where's the antitrust regulation?

[-] Sanctus@lemmy.world 36 points 5 days ago

Trump's America has none. But the FCC is suing public broadcast services. So that's what we get.

this post was submitted on 31 Jan 2025
213 points (100.0% liked)

PC Gaming

For PC gaming news and discussion. PCGamingWiki