this post was submitted on 15 Jun 2025
PC Gaming
4K is an outrageously high resolution.
If I were conspiratorially minded, I would say that 4K was normalized as the next step above 1440p in order to create demand for many generations of new graphics cards, because it was introduced long before there was hardware able to drive it without serious compromises. (I don't actually think it's a conspiracy, though.)
For comparison, 1440p has 78% more pixels than 1080p. That's quite a jump in pixel density and required performance.
4K has 125% more pixels than 1440p (300% more than 1080p). The step up is massive, and the additional performance required is as well.
There is a resolution missing in between them: 3200x1800 is the natural next step above 1440p*. At 56% more pixels it would be a meaningful improvement without an outrageous jump in required performance. Yet for some reason it barely exists outside of a few laptop panels.
*All these resolutions are multiples of 640x360. 720p is 2x, 1080p is 3x, 1440p is 4x, and 4K is 6x. 1800p is the missing 5x.
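The percentages above are easy to verify. A quick sketch (resolution names and scale factors as described in the comment):

```python
# Common 16:9 resolutions, all integer multiples of a 640x360 base.
resolutions = {
    "720p":  (1280, 720),   # 2x base
    "1080p": (1920, 1080),  # 3x
    "1440p": (2560, 1440),  # 4x
    "1800p": (3200, 1800),  # 5x (the "missing" step)
    "4K":    (3840, 2160),  # 6x
}

def pixels(name):
    """Total pixel count of a named resolution."""
    w, h = resolutions[name]
    return w * h

def pct_more(a, b):
    """How many percent more pixels resolution a has than resolution b."""
    return (pixels(a) / pixels(b) - 1) * 100

print(f"1440p vs 1080p: {pct_more('1440p', '1080p'):.0f}% more pixels")  # 78%
print(f"4K vs 1440p:    {pct_more('4K', '1440p'):.0f}% more pixels")     # 125%
print(f"4K vs 1080p:    {pct_more('4K', '1080p'):.0f}% more pixels")     # 300%
print(f"1800p vs 1440p: {pct_more('1800p', '1440p'):.0f}% more pixels")  # 56%
```

Note that pixel count grows with the *square* of the scale factor, which is why the 4x-to-6x jump (1440p to 4K) feels so much bigger than the earlier steps.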