this post was submitted on 23 Nov 2024
97 points (90.1% liked)

PC Gaming

[–] [email protected] 3 points 15 hours ago (2 children)

I've seen a lot of complaints online about the game's performance, but my 'okay' computer is handling the game at max settings just fine. I'm kind of confused. Is it because I'm using Linux?

[–] [email protected] 0 points 13 hours ago

I'm using Linux with a Radeon 7900 XTX and I can't get over 120 fps

[–] [email protected] 3 points 13 hours ago

But my ‘okay’ computer is handling the game at max settings just fine.

Yeah, that's the issue.

Your comp is running maxed settings at what you consider a serviceable framerate, while you admit your PC is just "okay".

Everyone with a better comp than you is also running at max settings, seeing the same graphics you are, at probably close to the same average framerate and dips. But we're used to better graphics at higher frame rates with zero stutter/dips.

I've talked about this issue in the past, and it's hard to explain. But a properly optimized game shouldn't really run with everything maxed out on release except on the very top hardware setups.

What's currently the max setting should be the "medium" setting, because lots of people can handle it.

Your experience wouldn't change at all, there'd just be the higher graphical settings available for people who could run them.

Think of it like buying the game on PS5 Pro, and then finding out that it plays exactly the same on the PS4. It's not that you'd be mad that the PS4 people get a playable version; it's that you wouldn't understand why that's all the newest-gen console delivers. And compared to games that use your PS5 Pro's full power, it's going to seem bad.

People (myself included) just assumed since it was UE5, they'd be at least giving us the options that UE5 was updated to support.

It seems they did it for future proofing the game, which 100% makes sense. Hopefully they add that stuff in with updates later.

Like, it doesn't support hardware ray tracing...

And it doesn't have non-ray-based lighting either. It forces everything to software ray tracing, which is a huge performance hit for people with hardware that can do ray tracing, but is completely unnoticeable to people with hardware that can't. They may even see better graphics than in a game that uses traditional lighting.
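For what it's worth, UE5 does ship stock console variables for switching Lumen from software to hardware ray tracing, and on PC players can sometimes force them through Engine.ini. The cvar names below are standard UE5 ones, but whether Stalker 2 actually honors them (and ships the RT shaders needed) is an assumption:

```ini
; Engine.ini, [SystemSettings] section (stock UE5 cvars; a game can ignore or override these)
[SystemSettings]
r.Lumen.HardwareRayTracing=1
r.Lumen.HardwareRayTracing.LightingMode=2
```

If the game wasn't packaged with hardware ray tracing support, these settings simply do nothing, which would match what we're seeing.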

Like, I'm just a hobbyist nerd; I don't really know all the ins and outs of what's going on with Stalker 2. But it seems like this is just a game that caters to the average PC gamer, to the point that everyone with an above-average PC wasn't even an afterthought.

I'm sure there are going to be a lot of people who know more than me looking a lot closer at why the reaction to this game has been so varied.