Ray tracing is still probably a decade away from being mainstream too.
Tech has been nothing but fake promises at least since COVID.
Practically nothing has really changed for gaming since its peak around 2015.
Just more MTX and scamming
Oh, believe me, I agree... I agree so hard that it's worthy of a whole different post, lol.
It's capable of making pretty screenshots. But ultimately it's a pointless tax that serves no real purpose besides artificially increasing the price of GPUs... because what better way to increase the price of a GPU than to start tacking extra features onto it, right, Nvidia?
Trust me bro, a few hundred more billion dollars worth of R&D spread out between 37 companies that keep taking turns to buy each other out in hopes of making a trillion dollars on some slop somebody vomited out of the dark recesses of their souls over the next 59 years and it will get better I promise trust me bro
Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.
That's already been proven false back when DLSS 2.0 released.
No it hasn't, you are just regurgitating Nvidia's marketing.
You can't stretch a picture and have it look just as good as natively rendering it at that higher resolution.
You cannot create something from nothing. No matter how much AI guesswork you put into filling in the gaps, it will never be as good as just rendering at the larger res. It will never look as good as the original resolution, pre-AI stretching, either.
You are just regurgitating Nvidia's marketing.
No, this is not only the general consensus, it's measurably better when comparing SNR.
You can personally hate it for any reason you want.
But it doesn't change the fact that AI up-scaling produces a more accurate result than native rendering.
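If it helps, here's the kind of comparison people usually mean by "measurably better": both the native frame and the upscaled frame get scored against a supersampled ground-truth render. Just a minimal sketch of the methodology, with hypothetical frame captures, not any vendor's tool or real benchmark numbers:

```python
# Sketch of the usual methodology: score each candidate frame against a
# supersampled "ground truth" render. The arrays would come from captured
# frames; nothing here is real benchmark data.
import numpy as np

def psnr(reference: np.ndarray, candidate: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    diff = reference.astype(np.float64) - candidate.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

# Usage (hypothetical 8-bit RGB frame captures, all the same size):
#   print(psnr(ssaa_ground_truth, native_render))
#   print(psnr(ssaa_ground_truth, dlss_quality_output))
```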
I don't understand. This isn't really a subject I care much about, so forgive my ignorance.
Are you saying that an AI generated frame would be closer to the actual rendered image than if the image rendered natively? Isn't that an oxymoron? How can a guess at what the frame will be be more 'accurate' than what the frame would actually be?
They did in fact say that, and that is in fact nonsense, verifiable in many ways.
Perhaps they misspoke, perhaps they are misinformed but uh...
Yeah, it is fundamentally impossible to do what they actually described.
Intelligent temporal frame upscaling is getting better and better at producing a frame that is almost as high quality as a natively rendered frame, for less rendering time, i.e., higher fps... but it's never going to be 'better' quality than an actual native render.
It's using information from multiple frames, as well as motion vectors, so it's not just blind guesses.
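To give a rough idea of what "using motion vectors" means mechanically, here's a toy sketch of temporal accumulation: reproject where each pixel was last frame and blend it with the newly rendered sample. All names are made up and this is nowhere near the real DLSS internals, just the general idea:

```python
# Toy sketch of temporal accumulation with per-pixel motion vectors.
import numpy as np

def temporal_accumulate(history, current, motion_vectors, blend=0.1):
    """Blend the previous frame (reprojected along motion vectors) with the
    newly rendered sample. history/current: (H, W, 3) float arrays,
    motion_vectors: (H, W, 2) pixel offsets (dx, dy)."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Follow the motion vector backwards: where was this pixel last frame?
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    reprojected = history[src_y, src_x]
    # Exponential blend: mostly history, a little new information each frame.
    return (1.0 - blend) * reprojected + blend * current
```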
And no, it's not as good as a 'ground truth' image, but that's not what it's competing against. FXAA and SMAA don't look great, and MSAA has a big performance penalty while still not eliminating aliasing. And I think DLSS quality looks pretty damn good. If you want something closer to perfect, there's DLAA, which is comparable to SSAA, without nuking your framerate. DLSS can match or exceed visual fidelity at every level, while offering much better performance.
Frame gen seems like much more of a mixed bag, but I think it's still good to have the option. I haven't tried it personally, but I could see it being nice in single player games to go from 60 -> 240 fps, even if there's some artifacting. I think latency would become an issue at lower framerates, but I don't really consider 30 fps to be playable anyway, at least for first person games.
And yes, it has been used to excuse poor optimization, but so have general hardware improvements. That's an entirely separate issue, and doesn't mean that upscaling is bad.
Also I think Nvidia is a pretty anti-consumer company, but that mostly has to do with business stuff like pricing. Their tech is quite good.
Eh... The latest versions of DLSS and FSR are getting much better image quality in stills...
But they are still not as good, image-quality-wise, as actually rendering the same thing natively at full resolution, which is what the quote you are disputing said.
Further, the cards that can run these latest upscaling techs and reach 4K 60fps or 4K 90fps in very demanding games, without (fake) frame gen?
It's not as bad with AMD, but they also don't yet offer as high-calibre a GPU as Nvidia's top-end stuff (though apparently 9080 XT rumors are starting to float around)...
But like, the pure wattage draw of a 5080 or 5090 is fucking insane. A 5090 draws up to 575 watts, on its own.
You can make a pretty high-powered 1440p system if you use the stupendously efficient (performance per watt) 9745hx or 9745hx3d CPU + mobo combos that Minisforum makes... and the PSU for the entire system shouldn't need to exceed 650 watts.
... A 5090 alone draws nearly as much power as that entire system, one resolution step down.
This, to me, is completely absurd.
Whether or not you find the power draw difference between an 'ultra 1440p' build and an 'ultra 4K' build ridiculous... the price difference between those PCs and monitors is somewhere between 2x and 3x, and hopefully we can agree that that is, in fact, ridiculous, and that 4K, high-fidelity gaming remains far out of reach of the vast majority of PC gamers.
EDIT:
Also, the vast majority of your comment is comparing native + some AA algo to... rendering at 75% to 95% of native resolution and then upscaling.
For starters, again the original comment was not talking about native + some AA, but just native.
Upscaling introduces artefacts and inaccuracies: smudged textures, weird ghosting that resembles older, crappy motion blur techniques, loss of LOD-style detail on distant objects, and it sometimes gets confused between HUD elements and the 3D rendered scene and warps them together...
Just because intelligent temporal upscaling also produces something that sort of looks like, but isn't actually, AA... doesn't mean it avoids these other costs of achieving that 'AA' in a relatively sloppy manner that also degrades other elements of the finished render.
It's a tradeoff: an end result at the same res that is worse, to some degree, but rendered faster, to some degree.
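To put rough numbers on the "rendered faster" part: shading cost tracks roughly with pixel count, which falls with the square of the render scale. Toy arithmetic only, not benchmark data:

```python
# Pixels actually shaded at a given render scale vs. native 4K.
native_px = 3840 * 2160
for scale in (0.75, 0.85, 0.95):
    shaded = int(native_px * scale ** 2)
    print(f"{scale:.0%} render scale: {shaded:,} px (~{shaded / native_px:.0%} of native)")
```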
Again, the latest versions of intelligent upscalers are getting better at getting the quality closer to a native render while maintaining a higher fps...
But functionally, what this is, is an overall 'quality' slider that sits basically outside of, or on top of, all of a game's other, actual quality settings.
It is a smudge factor bandaid that covers up poor optimization within games.
And that poor optimization is, in almost all cases... real time ray tracing/path tracing of some kind.
A huge chunk of what has driven and enabled the development of higher fidelity, high frame rate rendering in the last 10 or 15 years has been figuring out clever tricks and hacks in your game design, engine design, and rendering pipeline that make it so real-time lighting is only used where it absolutely needs to be used, in a very optimized way.
Then, about 5 years ago, most AAA game devs/studios just stopped doing those optimizations and tricks, as a cost cutting measure in development... because 'now the hardware can optimize automagically!'
No, it cannot, not unless you think all PC gamers have a $5,000 rig.
A lot of this is tied to UE 5 being an increasingly popular, but also increasingly shit optimized engine.
To concretize:
And about 4K gaming... play old games with new GPUs. RDR2 works quite well on my 7800 XT. More than 60 FPS, which is more than enough if you aren't a speedrunner or similar. No FSR, of course.
One important thing: upscaling does help with low-spec/low-power gaming (especially on smaller screens). Obviously it's a double-edged sword (it promotes pushing games out quicker), but it has some really cool uses. Now, forced TAA on the other hand...
Both are tolerable, but only if they're not forced, and for some reason companies have a hard-on for forcing them. Kinda like how a 103° FOV limit somehow became standard even in fast-paced competitive games.
Absolutely true. I never bother to turn these options on if a game offers them because in the best case it doesn't do a whole lot and in the worst case it makes the game look awful. I'd rather just play with real frames even if it means playing at a lower frame rate.