I have a 4K 120 Hz OLED TV. The difference is quite drastic compared to my old 1080p LED. It's certainly sharper, and probably the practical limit. I've also seen 8K, and, meh. I don't even care if it's noticeable, it's just too expensive to be worthwhile. We should just push more frames and lower latency for now, or, the Gods forbid, optimise games properly.
I feel like resolution wasn't much of an issue even at 1080p. It was plenty. Especially at normal viewing distances.
The real advantages are things like HDR and higher framerates including VRR. I can actually see those.
I feel like we're going to have brighter HDR introduced at some point, and we'll be forced to upgrade to 8K in order to see it.
1080P VA panels FTW!
One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a 'smart' TV and neither connects to the internet.
I will use them until they can no longer be used.
CRT for life
We are at a point where 4k rtx is barely viable if you have a money tree.
Why the fuck would you wanna move to 8k?
I'm contemplating getting 1440p for my setup, as it seems a decent obtainable option.
8k 15fps will be glorious.
lol
How many Ks is real life resolution and at how many fps does it run?
Whatever the resolution of 'real life', what matters is the point at which our little eyes and brains can no longer perceive a difference.
For average scenery, the general consensus is about 60 pixels per degree of vision. For something more synthetic, like a white dot in empty space, that sort of small, high-contrast detail might take around 200 pixels per degree to ensure the dot is as visible on the display as it would be directly. A 75" display 2 meters away at 4K works out to about 85 pixels per degree, which is comfortable enough for a display.
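The pixels-per-degree figure above is easy to sanity-check. A rough sketch in Python (assuming a 16:9 panel and measuring the horizontal viewing angle; the function name is just illustrative):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle."""
    aw, ah = aspect
    # Diagonal in metres, then horizontal width from the aspect ratio
    diag_m = diagonal_in * 0.0254
    width_m = diag_m * aw / math.hypot(aw, ah)
    # Total horizontal viewing angle subtended by the screen, in degrees
    angle_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horizontal_px / angle_deg

# 75" 4K panel viewed from 2 m
print(round(pixels_per_degree(75, 3840, 2.0)))  # ~85
```

Move the same panel to 1 m and you're down around 50 pixels per degree, which is why viewing distance matters as much as the resolution itself.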
Similar story with 'frames per second'. Move something back and forth really fast and you'll see a blurry smear of the object rather than observing its discrete movement. So if you accurately match the blurring you'd naturally see and use a low-persistence backlight/display, you'll probably get away with something like 60 FPS. If you are stuck with discrete representations and are unable to blur or blank between meaningful frames, you might have to go a bit further up, to something like 120 or 144 FPS.
The question isn't how high the resolution of reality is, but how well we can process it. There is an upper limit to visual acuity, but I'd have to calculate what an arc-minute at 6 meters works out to and I'm too lazy right now. Regarding fps, some people can notice artefacts up to 800 Hz, but I'd think 120 Hz would be OK. Remember, you'd have to generate stereoscopic output.
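For the lazy: the arc-minute figure is a one-liner. One arc-minute (the detail size resolvable at standard 20/20 acuity) subtended at 6 meters works out to roughly 1.75 mm:

```python
import math

# Physical size subtended by one arc-minute (1/60 of a degree) at 6 metres,
# using the chord formula: size = 2 * d * tan(angle / 2)
distance_m = 6.0
angle_rad = math.radians(1 / 60)
size_mm = 2 * distance_m * math.tan(angle_rad / 2) * 1000

print(round(size_mm, 2))  # ~1.75 mm
```

So a pixel smaller than about 1.75 mm at 6 meters is, in principle, already at the acuity limit for a 20/20 eye.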
I feel like it's kinda infinite, because you can zoom in to the quantum level and then looking at things sorta fails you... But I'm no scientist.
I think it's about 10^44^ fps, give or take.