this post was submitted on 16 Feb 2024
[–] [email protected] 10 points 7 months ago (3 children)

In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are somewhat well optimized these days*.

*At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.

[–] [email protected] 3 points 7 months ago

On PS5 Hogwarts Legacy runs at 60fps but the cutscenes are 30fps.

[–] [email protected] 4 points 7 months ago

I am playing through Rise of Tomb Raider in 4K and having a similar experience. I think the cut scenes are in 1080p.

[–] [email protected] 3 points 7 months ago (1 children)

Wait, you mean the game's gameplay looks better than the actual cutscenes in the game?

But how? Does the game use FMV for the cutscenes or something?

[–] [email protected] 5 points 7 months ago

The cutscenes were rendered using certain graphics settings that you could exceed if you maxed out your own. Plus, because it was a pre-rendered video, there must have been some compression involved; you could just tell when you were in a cutscene: it was grainier, with a smidge of artifacting. Don't quote me on this, but I believe the cutscenes were rendered at something like 1080p, so if you were playing at 4K it would be a very noticeable downgrade. (Note that I did not and still do not have a 4K monitor.)
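For a rough sense of why a 1080p video looks soft on a 4K screen, here's a quick pixel-count sketch (the 1080p cutscene resolution is just my guess from above; the display figures are the standard 16:9 ones):

```python
# Back-of-the-envelope comparison of an assumed 1080p pre-rendered
# cutscene versus native 4K gameplay rendering.
cutscene = (1920, 1080)   # assumed pre-rendered video resolution
gameplay = (3840, 2160)   # 4K UHD display resolution

cutscene_px = cutscene[0] * cutscene[1]
gameplay_px = gameplay[0] * gameplay[1]

# Each cutscene pixel has to cover this many screen pixels:
scale = gameplay_px / cutscene_px
print(f"{cutscene_px:,} -> {gameplay_px:,} pixels ({scale:.0f}x upscale)")
```

So every cutscene pixel gets stretched over four screen pixels, before video compression makes it any worse.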

Although, thinking about it again, I do vividly remember some in-engine cutscenes in Arkham Knight. I'll have to replay that game sometime to jog my memory.