[–] [email protected] 112 points 3 days ago* (last edited 3 days ago) (4 children)

Gee, we've had over half a century of computer graphics at this point. Yet suddenly, when a technology arises that requires an obscene amount of GPU power to generate results, a GPU manufacturer is here to tell us that all computer graphics without that new technology is dead for... reasons. I cannot see any connection between these points.

[–] [email protected] -2 points 3 days ago

I think what he means is that AI is needed to keep making substantial improvements in graphics quality, and he phrased it badly. Your interpretation kind of presumes he's not only lying, but that he thinks we're all idiots. Given that he's not running for office as a Republican, I think that's a very flawed assumption.

[–] [email protected] 11 points 3 days ago (2 children)

What do you mean "suddenly"? I was running path tracers back in 1994. It's just that they took minutes to hours to generate a 480p image.

The argument is that we've gotten to the point where new rendering features rely on a lot more path tracing and light simulation than used to be feasible in real time. Pair that with the fact that displays have gone from 1080p60 vsync to 4K at arbitrarily high framerates and... yeah, I don't think you realize how much additional processing power we're requesting.

But the good news is that if you were happy with 1080p60, you can absolutely render modern games like that on a modern GPU without needing any upscaling.
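To put rough numbers on that resolution jump (my own back-of-the-envelope arithmetic, assuming a 1080p60 baseline and a 4K120 target, and ignoring how much heavier each individual pixel has become):

```python
# Raw pixel throughput comparison -- assumed targets, not anyone's benchmark.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels that have to be shaded every second at a given resolution and framerate."""
    return width * height * fps

baseline = pixels_per_second(1920, 1080, 60)    # 1080p60: ~124 million px/s
target = pixels_per_second(3840, 2160, 120)     # 4K120:   ~995 million px/s

print(f"4K120 asks for roughly {target / baseline:.1f}x the pixel throughput of 1080p60")
# -> roughly 8.0x, before you even count path tracing making each pixel more expensive
```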

[–] [email protected] 12 points 3 days ago (1 children)

I think you just need to look at the PS5 Pro as proof that more GPU power doesn't translate linearly to better picture quality.

The PS5 Pro has a 67% beefier GPU than the standard PS5 - with a price to match - yet can anyone say the end result is 67% better? Is it even 10% better?

We've been hitting diminishing returns on raw rasterising for years now; a different approach is definitely needed.
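As a rough sketch of why that happens (assuming raster cost scales roughly with pixel count and that perceived sharpness tracks per-axis resolution, which is a big simplification):

```python
import math

# Toy diminishing-returns arithmetic, not a measurement of the actual consoles.
extra_compute = 1.67                      # PS5 Pro GPU throughput vs. base PS5
extra_pixels = extra_compute              # assume pixels rendered scale with compute
extra_per_axis = math.sqrt(extra_pixels)  # per-axis resolution scales with sqrt(pixels)

print(f"{(extra_pixels - 1):.0%} more pixels is only about "
      f"{(extra_per_axis - 1):.0%} more resolution on each axis")
# -> 67% more pixels is only about 29% more resolution on each axis
```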

[–] [email protected] 3 points 3 days ago

Yeah, although I am always reluctant to quantify visual quality like that. What is "67% better" in terms of a game playing smoothly or looking good?

The PS5 Pro reveal was a disaster, partially because if you're trying to demonstrate how much nicer a higher resolution, higher framerate experience is, a heavily compressed, low bitrate YouTube video that most people are going to watch at 1080p or lower is not going to do it. I have no doubt that you can tell how much smoother or less aliased an image is on the Pro. But that doesn't mean the returns scale linearly, you're right about that. I can tell a 4K picture from a 1080p one, but I can REALLY tell a 480p image from a 1080p one. And it's one thing to add soft shadows to a picture and another to add textures to a flat polygon.

If anything, gaming as a hobby has been a tech thing for so long that we're not ready for it to shift to being limited by money and artistic quality rather than processing power. Arguably this entire conversation is pointless in that the best looking game of 2024 is Thank Goodness You're Here, and it's not even close.

[–] [email protected] 5 points 3 days ago (1 children)

Yeah, there's a reason any movie attempting 3D CG with any budget at all has used path tracing for years. It's objectively massively higher quality.

You don't need upscaling or denoising (the "AI" they're talking about) to do raster stuff, but realistic lighting does a hugely better job, regardless of the art style you're talking about. It's not just photorealism, either. Look at all Disney's animated stuff. Stuff like Moana and Elemental aren't photorealistic and aren't trying to be, but they're still massively enhanced visually by improving the realism of the behavior of light, because that's what our eyes understand. It takes a lot of math to handle all those volumetric shots through water and glass in a way that looks good.
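For a sense of what "a lot of math" means in practice, here's a toy Monte Carlo estimate of the light reaching a single surface point from an area light (my own illustrative sketch, nowhere near a production renderer):

```python
import math
import random

def sample_area_light():
    """Random point on a hypothetical 1x1 light panel hanging at height 5."""
    return (random.uniform(-0.5, 0.5), 5.0, random.uniform(-0.5, 0.5))

def direct_light(point, normal, samples=256, intensity=50.0):
    """Average many random light samples; more samples = less noise = more math."""
    total = 0.0
    for _ in range(samples):
        lx, ly, lz = sample_area_light()
        dx, dy, dz = lx - point[0], ly - point[1], lz - point[2]
        dist2 = dx * dx + dy * dy + dz * dz
        dist = math.sqrt(dist2)
        # Lambertian shading: contribution falls off with angle and squared distance.
        cos_theta = max(0.0, (dx * normal[0] + dy * normal[1] + dz * normal[2]) / dist)
        total += intensity * cos_theta / dist2
    return total / samples

# One shading point with an upward-facing normal. A path tracer repeats this kind of
# estimate (plus shadow rays, bounces, volumetrics) for millions of pixels per frame,
# which is exactly why the denoising step exists.
print(direct_light(point=(0.0, 0.0, 0.0), normal=(0.0, 1.0, 0.0)))
```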

[–] [email protected] 3 points 3 days ago

Yep. The thing is, even if you're on high-end hardware doing offline CGI, you're using these techniques for denoising. If you're doing academic research, you're probably upscaling with machine learning.

People get stuck on the "AI" nonsense, but ultimately you need upscaling and denoising of some sort to render a certain tier of visuals. You want the highest quality version of that you can fit in your budgeted frame time. If that is using machine learning, great. If it isn't, great as well. It's all tensor math anyway; it's about using your GPU compute in the most efficient way you can.
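As a concrete, non-ML example of what that denoising step does, here's a minimal temporal accumulation pass of my own, simplified well past anything a shipping game actually uses:

```python
import numpy as np

def accumulate(history: np.ndarray, new_frame: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Blend the noisy new frame into the running history (exponential moving average)."""
    return (1.0 - alpha) * history + alpha * new_frame

# Pretend a static 1080p scene is lit with one noisy sample per pixel each frame.
rng = np.random.default_rng(0)
clean = np.full((1080, 1920), 0.5)
history = clean + rng.normal(0.0, 0.2, clean.shape)

for _ in range(60):  # one second of frames at 60 fps
    noisy = clean + rng.normal(0.0, 0.2, clean.shape)
    history = accumulate(history, noisy)

print(f"noise before: 0.200  noise after: {history.std():.3f}")
# ML-based denoisers/upscalers replace the fixed blend with a learned one, but the goal
# is the same: a stable image from sparse samples inside the frame-time budget.
```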

[–] [email protected] -5 points 3 days ago* (last edited 3 days ago) (2 children)

And I don't get why they would use it for graphics instead of using an AI coprocessor to do interesting stuff in games, like generating dialogue, complex missions, smarter NPCs, maps, etc.

You could build worlds where what happens isn't just governed by random rolls and scripted triggers.

[–] [email protected] 4 points 3 days ago

Because as complex and hard to decipher as it is, all AI does is inference. Doing inference for graphics is something that has been refined over decades and is still heavily worked on.

Quests and dialogue rely more on the creative thinking of writers; making a satisfying side quest is quite hard, and leaving that task to a text generation engine is a huge pitfall. It has its uses in generating some bulk text that can then be proofread, but generating text live? Hell no. Plus, what about voice acting? Will you also steal actors' voices so that the garbage generated text is read by "not Scarlett Johansson" v3? If you think AI will generate smarter behaviours, your definition of smart and mine really differ. Will AI be able to rig animations and infer correct rig positions so that the NPC's features line up with what they are saying? Big productions lean heavily on mocap and voice acting to get that.

[–] [email protected] 10 points 3 days ago

Oh, now you're wrong. AI upscaling is demonstrably more accurate than plain old TAA, which is what we used to use in the previous generation. I am NOT offloading compelling NPC dialogue to a crappy chatbot. Every demo I've seen for that application has been absolutely terrible.

[–] [email protected] 6 points 3 days ago (2 children)

Devil's advocate: splatting, DLSS, and neural codecs, to name a few things that will change the way we make games.

[–] [email protected] 5 points 3 days ago

Ray tracing actually will directly change the way games are made. A lot of time is spent by artists placing light sources and baking light maps to realistically light scenery - with ray tracing, you get that realism "for free".

DF did a really interesting video on the purely path traced version of Metro Exodus, and as part of that the artists talked about how much easier and faster it was to build that version.
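To make that workflow difference concrete, here's a toy illustration of my own (no real engine API, just inverse-square falloff from a single point light):

```python
def light_at(point, light_pos, intensity=10.0):
    """Inverse-square falloff from one point light -- the simplest possible lighting model."""
    d2 = sum((p - l) ** 2 for p, l in zip(point, light_pos))
    return intensity / d2

surface_points = [(x * 0.5, 0.0, 0.0) for x in range(1, 5)]

# Baked workflow: lighting is computed offline and stored; artists must re-bake
# every time a light or a wall moves, which is the time sink described above.
lightmap = {p: light_at(p, light_pos=(0.0, 2.0, 0.0)) for p in surface_points}

# The artist moves the light while iterating on the level...
moved_light = (2.0, 2.0, 0.0)
stale = lightmap[surface_points[0]]               # baked value no longer matches the scene
fresh = light_at(surface_points[0], moved_light)  # a ray traced renderer re-evaluates per frame

print(f"baked (now stale): {stale:.2f}   evaluated live: {fresh:.2f}")
```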

[–] [email protected] 7 points 3 days ago (1 children)

DLSS doesn’t work that well. I’m not looking forward to AI replacing artists’ work.

[–] [email protected] 9 points 3 days ago (1 children)

I'm not sure I agree with you on the former; DLSS is pretty remarkable in its current iteration.

[–] [email protected] 9 points 3 days ago

Agreed, things like DLSS are the right kind of application of AI to games, same with frame generation. The wrong kind is trying to figure out how to replace developers, artists of every kind, actors, etc. in the production process with AI. That being said, companies like Nvidia absolutely can and will profit off making sure that a game cannot run well on anything but the latest hardware they sell, so the whole "you need to buy our stuff to play games because it has the good ai and now all games require the good ai" is capitalist bullshit.