[-] bassomitron@lemmy.world 16 points 1 day ago* (last edited 1 day ago)

Out of curiosity, why do you refuse to support Nvidia? AMD isn't some saint, they're a shitty corporation just like Nvidia. They got lucky when Jim Keller saved their asses with the Ryzen architecture in the mid-2010s. They haven't really innovated a god damn thing since then, and it shows.

Edit: I get it, I get it, Nvidia is a much shittier company and I agree. I was pretty drunk last night before bed, please pardon the shots fired

[-] domi@lemmy.secnd.me 47 points 1 day ago

Besides what was mentioned below, it's not about making competitive products but about Nvidia having been an absolute asshole since the 2000s, and they've gotten even worse since the crypto and AI craze started. AMD and Nvidia are both corporations, but they're not even playing the same game when it comes to being anti-competitive.

There's a reason why Wikipedia has a controversies section on Nvidia: https://en.m.wikipedia.org/wiki/Nvidia#Controversies

That list is far from exhaustive. There's so much more about Nvidia that you should remember vividly if you were a PC gamer in the 2000s and 2010s with an AMD GPU, like:

  • When they pushed developers to use an unnecessary amount of tessellation because they knew tessellation performed worse on AMD
  • When they pushed their Gameworks framework which heavily gimped AMD GPUs
  • When they pushed their PhysX framework which automatically offloaded to CPU on AMD GPUs
  • When they disabled their GPUs in their driver when they detected an AMD GPU is also present in the system
  • When they were cheating in benchmarks by adding optimizations specific to those benchmarks
  • When they shipped an incomplete Vulkan implementation but claimed they were compliant

Nvidia has been gimping gaming performance and visuals since forever, for both AMD GPUs and even their own customers, and we haven't even gotten to DLSS and raytracing yet.

I refuse to buy anything Nvidia until they stop abusing their market position at every chance they get.

[-] amorpheus@lemmy.world 50 points 1 day ago

they're a shitty corporation just like Nvidia

Neither of them are anyone's friend, but claiming they're the same level of nasty is a bit of a stretch.

[-] Crashumbc@lemmy.world 1 points 1 day ago

Not saying that supporting the underdog isn't good.

Just don't think AMD is any less "nasty"; the only thing stopping them is the lack of power to do the same.

[-] sugar_in_your_tea@sh.itjust.works 1 points 3 hours ago

Right, and since they're not dominant, they're less nasty. If they become dominant, consider switching to whoever is the underdog at that point.

[-] ElectroLisa@piefed.blahaj.zone 33 points 1 day ago

Not OC but I don't want to deal with Nvidia's proprietary drivers. AMD cards "just work" on Linux

[-] Redex68@lemmy.world 4 points 1 day ago

Except that AMD doesn't support HDMI 2.1 on Linux (not their fault to be fair, but still)

[-] i_am_hiding@aussie.zone 10 points 1 day ago

This may be an unpopular opinion but who cares? I'll use DVI if I have to.

[-] Redex68@lemmy.world 3 points 1 day ago

I personally don't have a need for it, but if someone has a 4K 120Hz TV or monitor without DisplayPort that they want to use as such, it's kinda stupid that they can't.

[-] WhyJiffie@sh.itjust.works 4 points 1 day ago

yeah, but that's the fault of the HDMI standards group. AMD cards could only support HDMI 2.1 if they closed their driver down. I guess this can't be fixed with a DP to HDMI adapter either, right?

my opinion: DisplayPort is superior, and if I have an HDMI-only screen with supposed 4K 120Hz support, I treat it as false info.

[-] naitro@lemmy.world 1 points 1 day ago

Is that the case on mobile APUs as well? I'm pretty sure my laptop with a 7840U does 4K 120Hz.

[-] Truscape 4 points 1 day ago

Intel cards do, I think, so that's a non-NVIDIA option.

[-] bluecat_OwO@lemmy.world 2 points 1 day ago

yeah intel (⊙_◎)

[-] bassomitron@lemmy.world 3 points 1 day ago

That's completely valid. I haven't had issues on Linux with Nvidia myself, but I know it's definitely a thing for a lot of people.

[-] frezik 3 points 1 day ago

Haven't innovated? 3D chip stacking?

CPU companies generally don't change their micro-architecture, especially when it works.

[-] Arcane2077@sh.itjust.works 1 points 1 day ago

Intel didn't for 7 years, but they started and ended that trend.

this post was submitted on 26 Aug 2025
621 points (100.0% liked)

Technology
