149 points · submitted 11 months ago by alessandro@lemmy.ca to c/pcgaming@lemmy.ca
[-] Norgur@kbin.social 115 points 11 months ago

Thing is: there is always the "next better thing" around the corner. That's what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

[-] Sigmatics@lemmy.ca 85 points 11 months ago

Exactly. The best time to buy a graphics card is never

[-] wrath_of_grunge@kbin.social 22 points 11 months ago

Really, my rule of thumb has always been to upgrade when it's a significant improvement.

For a long time I didn't really upgrade until it was a 4x increase over my old hardware, with certain exceptions occasionally made. Nowadays I'm a bit more opportunistic in my upgrades, but I still seek out 'meaningful' ones: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get for really cheap.

[-] schmidtster@lemmy.world 12 points 11 months ago* (last edited 11 months ago)

4x…? Even in older cards that’s more than a decade between cards.

A 4080 is only 2.5x as powerful as a 1080ti, those are 5 years apart.

[-] Sigmatics@lemmy.ca 10 points 11 months ago* (last edited 11 months ago)

What's wrong with upgrading once every 5-10 years? Not everyone plays the latest games on 4k Ultra

Admittedly 4x is a bit steep, more like 3-4x

[-] schmidtster@lemmy.world 3 points 11 months ago* (last edited 11 months ago)

Starfield requires at least a 1070 Ti to play. It's not just about fidelity; you just wouldn't be able to play any newer games.

[-] jmcs@discuss.tchncs.de 11 points 11 months ago* (last edited 11 months ago)

It depends on what you need. I think usually you can get the best bang for buck by buying the now previous generation when the new one is released.

[-] miketunes@lemmy.world 4 points 11 months ago

Yup just picked up a whole PC with rtx3090 for $800.

[-] massive_bereavement@kbin.social 10 points 11 months ago

Graphics card. Not even once.

[-] hydroel@lemmy.world 11 points 11 months ago

Yeah, it's always like that: "I want to buy the new shiny thing! But it's expensive, so I'll wait a while for the price to come down." You wait a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.

[-] Norgur@kbin.social 4 points 11 months ago

Yep. There will always be "just wait N months and there will be the bestest thing that beats the old bestest thing". You are guaranteed to get buyer's remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is the best time for you (or at the time your old component bites the dust), and then stop looking at any development on those components for at least a year. Just ignore any deals, new releases, whatever, and be happy with the component you bought.

[-] nik282000@lemmy.ca 7 points 11 months ago

I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the setup for the 20 series, which had launched the day before. FML

[-] Norgur@kbin.social 9 points 11 months ago

Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over... Thing is: your card didn't get any worse. You thought the card was a good value proposition for you when you bought it, and it hasn't lost any of that.

[-] alessandro@lemmy.ca 3 points 11 months ago

choose the best available option

"The" point. Which is the best available option?

The simplest answer would be "price per fps".
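As a rough illustration of that metric, here's a minimal sketch that ranks cards by price per average FPS; the card names, prices, and frame rates are made-up placeholders, not benchmark data:

```python
# Rank hypothetical cards by price per average frame rate (lower is better).
# All numbers below are placeholders, not real benchmark results.
cards = {
    "card_a": {"price_usd": 600, "avg_fps": 120},
    "card_b": {"price_usd": 1200, "avg_fps": 160},
    "card_c": {"price_usd": 300, "avg_fps": 70},
}

for name, c in sorted(cards.items(), key=lambda kv: kv[1]["price_usd"] / kv[1]["avg_fps"]):
    print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per FPS")
```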

[-] Norgur@kbin.social 6 points 11 months ago

Not always. I'm doing a lot of rendering and such. So FPS aren't my primary concern.

[-] the_q@lemmy.world 20 points 11 months ago
[-] zoe@jlai.lu 4 points 11 months ago* (last edited 11 months ago)

Just 10-15 years at least, for smartphones and electronics overall too. Process nodes are now harder than ever to shrink. Holding on to my 12nm ccp phone like there is no tomorrow...

[-] sederx@programming.dev 19 points 11 months ago

I saw a 4080 on Amazon for $1200, shit's crazy.

[-] gnuplusmatt@reddthat.com 18 points 11 months ago* (last edited 11 months ago)

As a Linux gamer, this really wasn't on the cards anyway

[-] BCsven@lemmy.ca 4 points 11 months ago

AMD is a better decision, but my nVidia works great with Linux. I'm on OpenSUSE, and nVidia hosts their own OpenSUSE drivers, so it works from the get-go once you add the nVidia repo.

[-] gnuplusmatt@reddthat.com 3 points 11 months ago

I had an nvidia 660 GT back in 2013. It was a pain in the arse being on a leading-edge distro; it used to break xorg for a couple of months every time there was an xorg release (which admittedly is really rare these days since it's in sunset mode). Buying an AMD was the best hardware decision: no hassles, and I've been on Wayland since Fedora 35.

[-] CeeBee@lemmy.world 3 points 11 months ago

A lot has changed in a decade.

[-] Schmuppes@lemmy.world 17 points 11 months ago

Major refresh means what nowadays? 7 instead of 4 percent gains compared to the previous generation?

[-] NOT_RICK@lemmy.world 8 points 11 months ago

The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

[-] massive_bereavement@kbin.social 6 points 11 months ago* (last edited 11 months ago)

For anything ML related, having the additional memory is worth the investment, as it allows for larger models.

That said, at these prices it raises the question of whether it's more sensible to just throw money at GCP or AWS for their GPU node time.
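On the memory point, a rough back-of-the-envelope sketch of why more VRAM means larger models; the 2 bytes per parameter (fp16) and the 1.2x overhead factor are assumptions, not measured values:

```python
# Rough rule of thumb: weights need (parameters x bytes per parameter),
# plus some overhead for activations / KV cache. Both the fp16 assumption
# and the 1.2x overhead factor here are guesses, not measurements.
def min_vram_gb(params_billion: float, bytes_per_param: float = 2.0, overhead: float = 1.2) -> float:
    # billions of params x bytes per param gives GB directly
    return params_billion * bytes_per_param * overhead

for size in (7, 13, 33):
    print(f"{size}B params @ fp16: ~{min_vram_gb(size):.0f} GB of VRAM")
```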

[-] GarytheSnail@programming.dev 17 points 11 months ago

All three cards are rumored to come with the same memory configuration as their base models...

Sigh.

[-] Fungah@lemmy.world 8 points 11 months ago

Give us more fucking vram you dicks.

[-] RizzRustbolt@lemmy.world 12 points 11 months ago

freezes

stands there with my credit card in my hand while the cashier stares at me awkwardly

[-] Kit 8 points 11 months ago

Meh I'm still gonna buy a 4070 Ti on Black Friday. Wish I could wait but my other half wants a PC for Christmas.

[-] joneskind@beehaw.org 8 points 11 months ago

It really is a risky bet to make.

I doubt a full-price RTX 4080 SUPER upgrade will be worth it over a discounted regular RTX 4080.

SUPER upgrades have never crossed +10%.

I’d rather wait for the Ti version

[-] wrath_of_grunge@kbin.social 3 points 11 months ago

Really, the RTX 4080 is going to be a sweet spot in terms of performance envelope. That's a card you'll see have some decent longevity, even if it's not being recognized as such currently.

[-] excel@lemmy.megumin.org 7 points 11 months ago

It would help if they had any competitors. AMD and Intel aren’t cutting it.

[-] Litany@lemmy.world 20 points 11 months ago
[-] Bratwurstboy@iusearchlinux.fyi 12 points 11 months ago

Pretty damn happy with my 7900XTX too.

[-] skizzles@lemmy.ml 5 points 11 months ago

Swapped over to a 7800 XT about 3 months ago. Better Linux performance; I tested a bit on Windows too and it worked fine. I'm more than satisfied with my decision to hop over from my 3060.

[-] Cqrd@lemmy.dbzer0.com 11 points 11 months ago

AMD is definitely pulling their weight, but more competitors are always better.

[-] CaptPretentious@lemmy.world 9 points 11 months ago

Intel is definitely catching up.

[-] UnspecificGravity@discuss.tchncs.de 9 points 11 months ago

For the vast majority of customers that aren't looking to spend close to a grand for a card that is infinitesimally better than a card for half the price, AMD has plenty to offer.

[-] Fridgeratr@lemmy.world 4 points 11 months ago

AMD is absolutely cutting it!! They may not do DLSS or ray tracing as well, but their cards still kick ass.

[-] state_electrician@discuss.tchncs.de 3 points 11 months ago

Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs, but I am wondering whether, long-term, something like an external GPU might be worth it.

[-] baconisaveg@lemmy.ca 6 points 11 months ago

A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I've seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
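If you do go the headless-rig route, here's a minimal sanity check (assuming a CUDA build of PyTorch is installed) that the box actually exposes the card and its full VRAM:

```python
# Confirm a headless box exposes the GPU and its full VRAM to PyTorch
# before relying on it for LLM / Stable Diffusion work.
# Assumes a CUDA-enabled PyTorch build; a 3090 should report ~24 GiB.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device visible - check the driver / nvidia-smi output")
```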

[-] AnotherDirtyAnglo@lemmy.ca 3 points 11 months ago

Generally speaking, buying outright is always cheaper than renting, because you can always continue to run the device potentially for years, or sell it to reclaim some capital.
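A minimal sketch of that buy-vs-rent break-even; the purchase price, resale value, power cost, and cloud hourly rate below are all assumptions to be swapped for real quotes:

```python
# Buy-vs-rent break-even point. Every number here is an assumption -
# plug in real quotes (hardware price, resale value, power cost, cloud rate).
purchase_usd = 800          # e.g. a used 3090 rig
resale_usd = 400            # what you might recoup selling it later
power_usd_per_hour = 0.05   # electricity while the card is under load
cloud_usd_per_hour = 1.10   # on-demand cloud GPU instance rate

breakeven_hours = (purchase_usd - resale_usd) / (cloud_usd_per_hour - power_usd_per_hour)
print(f"Buying pays off after roughly {breakeven_hours:.0f} GPU-hours of use")
```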

[-] dellish@lemmy.world 2 points 11 months ago

Perhaps this is a good place to ask now the topic has been raised. I have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card, and I am SO sick of 500MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

[-] vivadanang@lemm.ee 3 points 11 months ago

have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card and I am SO sick of 500MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

In a laptop? Practically none. There are some very rare 'laptops' out there - really chonk-tops - that have full-size desktop GPUs inside them. The vast majority, on the other hand, will have 'mobile' versions of these GPUs that are basically permanently connected to the laptop's motherboard (if not on the mobo itself).

One example of a laptop with a full-size GPU (legacy, these aren't sold anymore): https://www.titancomputers.com/Titan-M151-GPU-Computing-Laptop-workstation-p/m151.htm - note the THICK chassis; that's what you need to hold a desktop GPU.
