this post was submitted on 15 Jun 2025

PC Gaming


Well I am shocked, SHOCKED I say! Well, not that shocked.

all 50 comments
[–] [email protected] 0 points 2 weeks ago

GPU prices are what drove me back to consoles. It was time to overhaul my PC as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile a whole ass PS5 that plays the same games was $500.

It's been 2 years since and I don't regret it. I miss mods, but not nearly as much as I thought. It's also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but... you can use an M/KB on PS5 and guess what? I do just fine! Turns out the problem was never controllers, it was the cheaters.

But then there is that. The controller. Oh my lord it's so much more comfortable than even the best gaming mouse. I've done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:

Use gaming equipment for gaming and leave office equipment in the office.

[–] [email protected] 0 points 2 weeks ago

For me it's the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw) and, importantly, being fed up with AAA games. Most games I played recently were a couple years old, indie titles, or a couple years old indie titles. And I don't need a high-powered graphics card for that. I've been playing far more on my Steam Deck than my desktop PC, despite the latter having significantly more powerful hardware. You can't force fun through sheer hardware performance.

[–] [email protected] 0 points 2 weeks ago (1 children)

It literally costs $3000.

That's almost 4 times the cost of my 3090.

[–] [email protected] 0 points 2 weeks ago

That's almost a year of wages in my country lol...

[–] [email protected] 0 points 2 weeks ago

I don't buy every generation and skip 1 if not 2. I have a 40xx series card and will probably wait until the 70xx (I'm assuming series naming here) before upgrading.

[–] [email protected] 0 points 2 weeks ago (1 children)

Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer and enthusiast market.

So my next card is probably gonna be an RX 9070XT.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Even the RX 9070 is running around $900 USD. I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kid's console games.

Every year we say "Gonna look into upgrading" but every year prices go up and wages stay the same (or disappear entirely as private-equity ravages the business world, digesting every company that isn't also a private equity predator) and the prices of just living and eating are insane, so at this rate, a lot of us might start reading again.

[–] [email protected] 0 points 2 weeks ago

It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than just a GPU, it'll be more tempting.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago)

I just looked up the price and I was like "Yikes!". You can get a PS5 Pro + optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

[–] [email protected] 0 points 2 weeks ago (1 children)

I have a 4090. I don't see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I don't think the 5090 has averaged $4K in months in terms of sale price. $4K was basically March. $3K is pretty common now as a listed scalp price, and completed sales on fleabay seem to commonly be $2600-2800 now.

The problem is that $2K was too much to begin with. It should be cheaper, but they are selling ML cards at such a markup, with literally endless demand currently, that there's zero reason to put any focus on the gaming segment beyond a token offering that raises their margin, so business-wise they are doing great, I guess?

As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia's stranglehold via an artificial monopoly on code compatibility makes the hardware irrelevant.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago)

One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they'd be swimming in market share if they used their own fabs instead (and kept the bigger die).

I feel like another is... marketing?

Like, many buyers just impulse buy, or go with what some shill recommended in a feed. Doesn't matter how competitive anything is anymore.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol

[–] [email protected] 0 points 2 weeks ago

I ran VR on one of those. Not well, but well enough.

[–] [email protected] 0 points 2 weeks ago (2 children)

Not surprised. Many of these high end GPUs are bought not for gaming but for bitcoin mining and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.

My GPU, which is an RTX 2060, is getting a little long in the tooth and I'll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I'm at it.

[–] [email protected] 0 points 2 weeks ago

bitcoin mining

That's a thing of the past, not profitable anymore unless you use ASIC miners. Some people still GPU mine it on niche coins, but it's nowhere near the scale as it was during the bitcoin and ethereum craze a few years ago.

AI is driving up prices or rather, it's reducing availability, which then translates into higher prices.

Another thing is that board manufacturers, distributors and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They'll sell less volume but they'll make more profit per unit.

[–] [email protected] 0 points 2 weeks ago (1 children)

My kid got the 2060, I bought a RX 6400, I don't need the hairy arms any more.

Then again I have become old and grumpy, playing old games.

[–] [email protected] 0 points 2 weeks ago

Hell, I'm still rocking with a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2, what more do I need?

[–] [email protected] 0 points 2 weeks ago

All I want is more VRAM, it can already play all the games I want.

[–] [email protected] 0 points 2 weeks ago

Uhhh, I went from a Radeon 1090 (or whatever they're called, it's an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console generations effectively do that as well. It's normal to not buy a GPU every year.

[–] [email protected] 0 points 2 weeks ago (1 children)

The progress is just not there.

I got an RX 6800 XT for €400 in May 2023, when it was already an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and a bit better RT performance, which I couldn't care less about.

[–] [email protected] 0 points 2 weeks ago

bit better RT performance which I couldn’t care less about

Yeah raytracing is not really relevant on these cards, the performance hit is just too great.

The RX 9070 XT is the first AMD GPU where you can consider turning it on.

[–] [email protected] 0 points 2 weeks ago (2 children)

I have a 3080 and am surviving lol. Never had an issue.

[–] [email protected] 0 points 2 weeks ago (1 children)

Pretty wise, that's the generation before the 12VHPWR connectors started burning up.

[–] [email protected] 0 points 2 weeks ago

AFAIK the 2080 was the last FE with a regular PCIe power connector.

[–] [email protected] 0 points 2 weeks ago

Still running a 1080, between nvidia and windows 11 I think I'll stay where I am.

[–] [email protected] 0 points 3 weeks ago (2 children)

I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it'll do me right for at least 10 years.

But it was expensive.

[–] [email protected] 0 points 2 weeks ago

It seemed like horrible value at the time, but in hindsight a 4090 was not the worst investment, hah.

[–] [email protected] 0 points 2 weeks ago

Also built a dream machine in 2022. I have a 4090, a 7700X, 32GB of DDR5-6000, and 8TB of NVMe storage. It's got plenty of power for my needs; as long as I keep getting 90+ FPS @ 4K and programs keep opening instantly, I'm happy. And since I bought into the AM5 platform right at the beginning of it, I can still upgrade my CPU in a few years and have a brand new, high-end PC again for just a few hundred bucks.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago)

It's just because I'm not impressed, like the raster performance bump for 1440p was just not worth the price jump at all. On top of that they have manufacturing issues and issues with their stupid 12 pin connector? And all the shit on the business side not providing drivers to reviewers etc. Fuuucccckk all that man. I'm waiting until AMD gets a little better with ray tracing and switching to team red.

[–] [email protected] 0 points 3 weeks ago (1 children)

I stopped maintaining an AAA-capable rig in 2016. I've been playing indies since and haven't felt left out whatsoever.

[–] [email protected] 0 points 2 weeks ago (2 children)

Don't worry, you haven't missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles...

[–] [email protected] 0 points 2 weeks ago

The majority sure, but there are some gems though.

Baldur's Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, ... for example

You can always wait for a couple of years before playing them, but saying they didn't miss anything is a gross understatement.

[–] [email protected] 0 points 2 weeks ago* (last edited 2 weeks ago)

It's funny, because often they aren't prettier. Well-optimized and well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and the disk size is still 10x what it was. They are just unrefined and unoptimized, and try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

[–] [email protected] 0 points 3 weeks ago

Unfortunately gamers aren't the real target audience for new GPUs, it's AI bros. Even if nobody buys a 4090/5090 for gaming, they're always out of stock as LLM enthusiasts and small companies use them for AI.

[–] [email protected] 0 points 3 weeks ago

Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

Yeah no shit, what a weird fucking take

[–] [email protected] 1 points 3 weeks ago (2 children)

For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It's crazy that anyone would even consider buying it unless they're rich or actually need it for something important.

[–] [email protected] 0 points 3 weeks ago (1 children)

unless they're rich or actually need it for something important

Fucking youtubers and crypto miners.

[–] [email protected] 0 points 2 weeks ago (1 children)

Crypto mining with GPUs is dead, the only relevant mining uses ASICs now, so it would be more accurate to say:

Fucking youtubers and AI.

[–] [email protected] 0 points 2 weeks ago

Fuck I'm old.

[–] [email protected] 0 points 3 weeks ago

Plus, I have a 3060, and it's still amazing.

Don't feel the need to upgrade at all.

[–] [email protected] 1 points 3 weeks ago (3 children)

The good games don't need a high end GPU.

[–] [email protected] 0 points 2 weeks ago

Clair Obscur runs like shit on my 3090 at 4K :(

[–] [email protected] 0 points 3 weeks ago (1 children)

Problem is preordering has been normalized, as has releasing games in pre-alpha state.

[–] [email protected] 0 points 2 weeks ago

Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.

[–] [email protected] 0 points 3 weeks ago

Absolutely. Truly creative games are made by smaller dev teams that aren't forcing ray tracing and lifelike graphics. The new Indiana Jones game isn't a system seller, and it's the only game I've personally had poor performance on with my 3070 Ti at 1440p.