this post was submitted on 04 Feb 2025
98 points (100.0% liked)

PC Gaming

[–] [email protected] 10 points 1 month ago (2 children)

Okay so the “S” models stood for “Super” which was a slight step up from the base. What are “D” models? “Duper”?

[–] [email protected] 6 points 1 month ago* (last edited 1 month ago)

China exclusive models, labeled as such to get around regulations.

[–] [email protected] 9 points 1 month ago

Guess we'll have to see how they handle this. Are they going to be good and do a full recall, or pull an Intel and do everything they can to avoid it?

[–] [email protected] 9 points 1 month ago (3 children)

It feels like things are so powerful and complex that the failure rates of all these devices are much higher now.

[–] [email protected] 1 points 1 month ago

Yup. This nonsense (PCIe 5 burnout) should have been detected immediately during quality control.

[–] [email protected] 2 points 1 month ago

You're just short of needing a personal-sized nuclear reactor to power these damn things, so the logic follows that the failure rate is going to climb.

[–] [email protected] 10 points 1 month ago (2 children)

I don't have any stats to back this up, but I wouldn't be surprised if failure rates were higher back in the 90s and 2000s.

We have much more sophisticated validation technologies and the benefit of industry, process and operational maturity.

It would be interesting to actually analyze the real-world dynamics around this.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago) (3 children)

Not very many people had a dedicated GPU in the 90s and 2000s. And there's no way the failure rate was higher; not even LimeWire could melt down the family PC back then. It sure gave it the old college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.

[–] [email protected] 2 points 1 month ago

Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher; I know I had 3 cards die with less than 2 years' use on each card in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

[–] [email protected] 1 points 1 month ago

I was referring to PC components in general.

[–] [email protected] 2 points 1 month ago

We all did; they used to cost like 60 bucks.

[–] [email protected] 1 points 1 month ago

I'm going to guess the number made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places in the world. So maybe a lower percentage, but a higher total amount.

But I have no idea...
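(A back-of-the-envelope sketch of that point; the shipment volumes and failure rates below are made-up illustrative numbers, not real data:)

```python
# Made-up numbers: a lower failure *rate* on a much larger installed base
# can still mean more dead units in absolute terms.
units_2000s, rate_2000s = 10_000_000, 0.03  # 3% of 10M units shipped
units_2020s, rate_2020s = 50_000_000, 0.01  # 1% of 50M units shipped

print(int(units_2000s * rate_2000s))  # 300000 failed units then
print(int(units_2020s * rate_2020s))  # 500000 failed units now
```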

[–] [email protected] 16 points 1 month ago (1 children)

I'm sure replacement units are in plentiful supply. Right?

[–] [email protected] 4 points 1 month ago

Trust me bro!

[–] [email protected] 59 points 1 month ago (3 children)

Imagine paying $2K for a GPU and this shit happens

[–] [email protected] 0 points 1 month ago

Closer to $2500 for most models.

[–] [email protected] 25 points 1 month ago (1 children)

Scalpers have turned that into $6000 for the available units left.

[–] [email protected] 11 points 1 month ago

The stupidly minor marginal gains you'd get from one of these cards vs. a 40-series aren't even worth the time it would take to crack your case open, let alone 6 grand. World's lost its goddamn mind.

[–] [email protected] 8 points 1 month ago

I doubt they got them for 2k

[–] [email protected] 45 points 1 month ago (1 children)

Nvidia has been a garbage company for a few generations now. They got to the top and sat up there enshittifying everything because they have a monopoly on the market. Don't buy into this shit.

[–] [email protected] 18 points 1 month ago (4 children)

If you've got hardware, use it until it dies...

Fucking 1070 can still put out decent performance lol

[–] [email protected] 3 points 1 month ago

Running a 1060 in my desktop. Still does absolutely fine. I got my buddy's old 1080 Ti OC'd; I'm just waiting to get the water-cooling kit put together, and then the 1060 will go in my media server to take over transcoding from the old 980.
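(Side note for anyone setting up the same thing: handing transcodes to the GPU is usually just a matter of pointing ffmpeg at the NVENC encoder. A minimal sketch, assuming an ffmpeg build with CUDA/NVENC support; the filenames are illustrative:)

```python
import subprocess

# Minimal sketch: hand a transcode off to the GPU via ffmpeg's NVENC encoder.
# Assumes an NVIDIA card with NVENC (a 1060 qualifies) and an ffmpeg build
# compiled with CUDA support. Input/output names are made up.
subprocess.run([
    "ffmpeg",
    "-hwaccel", "cuda",      # decode on the GPU
    "-i", "input.mkv",       # source file
    "-c:v", "h264_nvenc",    # encode with the NVENC hardware encoder
    "-c:a", "copy",          # pass the audio through untouched
    "output.mp4",
], check=True)
```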

[–] [email protected] 4 points 1 month ago

I'm still using a VIC-II without any lag or drop in FPS

[–] [email protected] 4 points 1 month ago (1 children)

2070S here; it doesn't work that well with AAA (or even AAAA! /s) games at 3440x1440 :S. But I can easily survive turning the graphics down.

[–] [email protected] 7 points 1 month ago (1 children)

3440x1440

Suffering from Success 🐸

[–] [email protected] 3 points 1 month ago (1 children)

Haha, not so successful when your fps is total garbage.

[–] [email protected] 1 points 1 month ago

temporarily embarrassed millionaire vibes

[–] [email protected] 3 points 1 month ago

For real. I've been rocking a 1070 for years, and the only games that don't get decent performance are new-release open-world survival sandbox titles that tend to suffer from a lack of optimization anyway.