this post was submitted on 28 Feb 2025
1 points (100.0% liked)

Gaming

[–] [email protected] 0 points 3 weeks ago (1 children)
[–] [email protected] 0 points 3 weeks ago (1 children)

A cutscene isn't the best representation. This shows off the 8-bit vs. 16-bit difference better.

[–] [email protected] 0 points 3 weeks ago (1 children)

I mean, the original image is a cutscene, so...

But hey, I'll split the difference. Instead of SMB 1, which was a launch game and literally wasn't running on the same hardware (because mappers), we can do Mario 3 instead.

Or, hear me out, let's not do a remaster at all for current gen leaps. Here's a PS4 vs PS5 sequel one.

It doesn't work as well, though, since taking the absolutely ridiculous shift from 2D to 3D, which has happened once and only once in all of gaming history, is a bit of a cheat anyway.

Oh, and for the record, and I can't believe I'm saying this only now, LttP looks a LOT better than OoT. Not even close.

[–] [email protected] 0 points 3 weeks ago (1 children)

Oh, I don't care about leap comparisons; I was just interested in how graphics have evolved over time. To be honest, graphics in big games have been going downhill for a few years now, thanks to lazy development chasing "good" graphics. Fucking TAA...

[–] [email protected] 0 points 3 weeks ago (1 children)

I agree that it's a meme comparison anyway. I just found it pertinent to call out that remasters have been around for a long time.

I don't know that I agree on the rest. I don't think I'm aware of a lazy game developer. That's a pretty rare breed. TAA isn't a bad thing (how quickly we forget the era when FXAA's vaseline smearing was considered valid antialiasing for 720p games), and sue me, but I do like good visuals.

I do believe we are in a very weird quagmire of a transitional period, where we're using what is effectively now a VFX suite to make games that aren't meant to run in real time on most of the hardware being used to run them and that are simultaneously too expensive and large and aiming at waaay too many hardware configs. It's a mess out there and it'll continue to be a mess, because the days of a 1080Ti being a "set to Ultra and forget it" deal were officially over the moment we decided we were going to sell people 4K monitors running at 240Hz and also games made for real time raytracing.

It's not the only time we've been in a weird interaction of GPUs and software (hey, remember when every GPU had its own incompatible graphics API? I do), but it's up there.

[–] [email protected] 0 points 3 weeks ago

TAA is absolutely a bad thing, I'm sorry, but it's way worse than FXAA, especially when combined with the new ML upscaling shit.
It's only really a problem in big games, or more specifically UE5 games, since temporal AA is baked into the engine.
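Since "temporal" is what's doing the work in that complaint, here's a toy sketch of why it ghosts. This is plain Python and deliberately simplified: real engine TAA also reprojects the history buffer with motion vectors and clamps history samples against the current frame's neighborhood, which this leaves out. The point is just that blending each new frame with an accumulated history buffer leaves residual energy behind anything that moves:

```python
# Toy illustration of temporal accumulation ghosting. NOT any engine's
# real TAA (those also reproject and clamp history); just the core blend.
# A bright pixel moves one cell per frame; each output frame is a blend
# of the current frame and the accumulated history buffer.

ALPHA = 0.1  # weight of the new frame; lower alpha = smoother but blurrier


def taa_blend(history, current, alpha=ALPHA):
    """Exponential blend of the current frame into the history buffer."""
    return [alpha * c + (1 - alpha) * h for h, c in zip(history, current)]


width = 8
history = [0.0] * width
for frame in range(4):
    current = [0.0] * width
    current[frame] = 1.0  # the moving bright pixel
    history = taa_blend(history, current)

# The pixel is now at index 3, but indices 0-2 still hold leftover
# brightness: the "trail" people describe as ghosting.
print([round(v, 3) for v in history])
```

With a naive blend like this, the trail only decays by a factor of (1 - alpha) per frame, which is why real implementations have to clamp or reject stale history, and why a bad rejection heuristic shows up as smearing.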

Yeah, there was that perfect moment in time where you could just put everything on max, have some nice SMAA on, and be happy with >120fps. The 4K chase started, yeah, but the hardware we have now is ridiculously powerful and could run 4K 120fps natively, no problem, if the time were spent achieving that rather than throwing in more lighting effects no one asked for, speed-running development, and then slapping DLSS on at the end to try and reach playable framerates, making the end product a blurry, ghosting mess. Ugh.