Citizen Cook Sep 23, 2023 @ 12:41am
Nvidia — “Fake Frames” is dead!
In the latest DF video, Nvidia’s Catanzaro says: “Moore's Law is dead. We don't know as a civilisation how to keep turning the crank on traditional ways of doing things. We have to be smarter.
You fundamentally realise you have to be more intelligent about the graphics rendering process. Brute force—let's re-render every frame 120 times a second at 2160p output—that is wasteful because we know that there are a lot of correlations in the output of any rendering process.
We know that there are a lot of opportunities to be smarter, to reuse compute. And then deliver transformational image quality benefits, things like Cyberpunk, that we could never have imagined doing before.
DLSS 3.5 makes Cyberpunk even more beautiful than native rendering. The reason is that the AI is able to make smarter decisions about how to render the scene than we knew how to without AI. I would say that Cyberpunk frames using DLSS and Frame Generation are much realer than traditional graphics frames.
Raster is a bag of fakeness. We get to throw that out and start doing path tracing and actually get real shadows and real reflections. And the only way we do that is by synthesising a lot of pixels with AI.”

https://www.youtube.com/watch?v=Qv9SLtojkTU

This is all very interesting. One of the things I thought about is: how on earth can AMD compete? They are light years behind Nvidia in everything beyond simple raster performance. You can’t even benchmark the two against each other, because one team’s card displays visuals completely differently and achieves its frame rate differently.
Showing 31-45 of 52 comments
wesnef Sep 24, 2023 @ 8:30am 
Originally posted by Citizen Cook:
Originally posted by Tiberius:

Not everyone lives in your alternate universe where every game has dlss support

You’re missing the point. How do you compare an Nvidia card with all those amazing features against an AMD card without them? And you can’t just test the Nvidia card with those features turned off because no one who would buy them would do that.

Why can't you? Doing standard testing of the hardware without any 'special features' turned on is exactly how tests of other machines & physical products work.
Tiberius Sep 24, 2023 @ 8:34am 
Originally posted by wesnef:
Originally posted by Citizen Cook:

You’re missing the point. How do you compare an Nvidia card with all those amazing features against an AMD card without them? And you can’t just test the Nvidia card with those features turned off because no one who would buy them would do that.

Why can't you? Doing standard testing of the hardware without any 'special features' turned on is exactly how tests of other machines & physical products work.

The RTX 4060 Ti is the best budget card, because it has DLSS 3.0

Said no one ever
Ocelote.12 Sep 24, 2023 @ 9:46am 
Originally posted by Preacher:
I go outside if I want realistic graphics and it doesn't cost any money

I want a realistic tropical jungle but without the heat, the pests, and the venomous, dangerous wild animals, so I can't just "go outside" — I run Crysis Remastered instead.

Also, it's cheaper to spend money once every several years on a graphics card and visit the virtual jungle any day you want than to regularly spend money on tickets/lodging in tropical locations.
Last edited by Ocelote.12; Sep 24, 2023 @ 9:59am
Azza ☠ Sep 24, 2023 @ 3:24pm 
Have you ever heard about 'smear frames' in animation / cartoons?

In animation, a smear frame is a frame used to simulate motion blur. A character might turn quickly, so a frame in between the turn is actually a blur of that character.

https://www.youtube.com/watch?v=vMvJiKu-93g

I consider Nvidia's AI "fake frames" a similar but much better version of that. A human brain doesn't even notice them unless it's actively seeking them out.

Anyways, DLSS 3.0 was a requirement to get double the FPS without a performance loss at 4K resolution and up. Without it, I wouldn't be able to get 60 FPS, let alone up to 120 FPS, at 4K with ultra-high settings. The software improves over time with the latest Nvidia drivers for the latest games, to the point it can even look better than the original frame. Over time it becomes a free hardware upgrade, as such.

And believe me, it has been improving greatly over time!

https://www.youtube.com/watch?v=uVCDXD7150U
Last edited by Azza ☠; Sep 24, 2023 @ 3:30pm
Citizen Cook Sep 25, 2023 @ 9:40am 
Originally posted by PopinFRESH:
Originally posted by Tiberius:

Imagine buying a GPU for DLSS 3.5. A feature that's only available for one game currently. And I'm the one who's missing the point here?

L O L

whooosh...

😂 I feel like he hasn’t watched the video or read what was said. The world has changed around him and he refuses to look at the evidence of it happening.
Citizen Cook Sep 25, 2023 @ 9:42am 
Originally posted by wesnef:
Originally posted by Citizen Cook:

You’re missing the point. How do you compare an Nvidia card with all those amazing features against an AMD card without them? And you can’t just test the Nvidia card with those features turned off because no one who would buy them would do that.

Why can't you? Doing standard testing of the hardware without any 'special features' turned on is exactly how tests of other machines & physical products work.

Good grief. We’re way beyond that now for the reasons I said and you quoted but fail to comprehend.
Azza ☠ Sep 25, 2023 @ 1:06pm 
Originally posted by Citizen Cook:
Originally posted by wesnef:

Why can't you? Doing standard testing of the hardware without any 'special features' turned on is exactly how tests of other machines & physical products work.

Good grief. We’re way beyond that now for the reasons I said and you quoted but fail to comprehend.

Last time I counted, there were over 750 games supported so far, around 240 of them being DLSS 3.0+, and the number increases with each and every Nvidia game driver release. Previous ones keep improving over time too, as they are retrained 24/7 on an Nvidia AI upscaling server farm of supercomputers.

Both Deep Learning Super Sampling (DLSS) and Ray Tracing are being advanced greatly in the Nvidia RTX series, not only via hardware but via software on AI servers, with the results pumped back into driver updates, if you only care to install them and tend to play the latest games.

You can even rent an Nvidia enterprise server yourself to do deep learning training. Game developers can add to it themselves, if desired, and tweak the quality/performance.

Branched off from that is the Nvidia GeForce NOW cloud gaming service, where you can also rent streamed gaming from a high-end server with RTX 3080 graphics cards to your potato PC, if you have the bandwidth (45 Mbps) for up to 4K @ 120 FPS (max).
Last edited by Azza ☠; Sep 25, 2023 @ 1:15pm
PopinFRESH Sep 25, 2023 @ 2:13pm 
Originally posted by Azza ☠:
...
Branched off from that is the Nvidia GeForce NOW cloud gaming service, where you can also rent streamed gaming from a high-end server with RTX 3080 graphics cards to your potato PC, if you have the bandwidth (45 Mbps) for up to 4K @ 120 FPS (max).

Technically they aren't "high-end servers with an RTX 3080". They are virtualized environments running with "3080-class" power, which is actually now "4080-class" power. They are actually running on partitioned GPUs within Nvidia's "SuperPods" (e.g. clusters of DGX A100 units).

But your point regarding GeForce Now and being able to play current gen games on a potato is accurate.
Azza ☠ Sep 25, 2023 @ 2:21pm 
Originally posted by PopinFRESH:
Originally posted by Azza ☠:
...
Branched off from that is the Nvidia GeForce NOW cloud gaming service, where you can also rent streamed gaming from a high-end server with RTX 3080 graphics cards to your potato PC, if you have the bandwidth (45 Mbps) for up to 4K @ 120 FPS (max).

Technically they aren't "high-end servers with an RTX 3080". They are virtualized environments running with "3080-class" power, which is actually now "4080-class" power. They are actually running on partitioned GPUs within Nvidia's "SuperPods" (e.g. clusters of DGX A100 units).

But your point regarding GeForce Now and being able to play current gen games on a potato is accurate.

I personally disliked the concept at first with the new RTX 4090, till I actually tried it via my work and purchased one at wholesale price. It makes so much sense now for the leap to 4K HDR with Ray Tracing quality. It's pretty much a requirement to get there unless you want to wait years longer. 4K is four times the pixels of 1080p to drive, and real-time ray tracing would cut your FPS down to around 21 FPS without DLSS 3 to bring it back up to 60-120 FPS.

For people still happy at 1080p or 1440p resolution, it wouldn't make much difference or sense. Yet for future generations, I can understand the path. The only other way to go about it is SLI, linking multiple graphics cards together, which is a power drain and a high cost.
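The arithmetic above can be sketched as a quick back-of-envelope in Python. Note the 21 FPS baseline and the clean 4x upscaling / 2x frame-generation factors are illustrative assumptions from the post, not measured data; real per-frame overhead lowers both:

```python
# Back-of-envelope for the "4K + ray tracing needs DLSS" argument.
# All scaling factors are idealised; actual gains are smaller.

def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p4k = pixels(3840, 2160)    # 8,294,400 pixels
assert p4k == 4 * p1080     # 4K shades exactly four times the pixels of 1080p

# Hypothetical path-traced baseline at native 4K:
native_fps = 21

# DLSS-style upscaling renders internally at 1080p and reconstructs 4K,
# so shading cost drops roughly 4x (ignoring fixed per-frame overhead):
upscaled_fps = native_fps * (p4k // p1080)  # 84 FPS

# Frame generation then inserts one AI frame between each rendered pair,
# roughly doubling displayed FPS:
displayed_fps = upscaled_fps * 2            # 168 FPS displayed
```

Under these idealised factors the 21 FPS baseline lands comfortably above the 60-120 FPS range claimed; in practice upscaling overhead and CPU limits eat into both multipliers.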
Last edited by Azza ☠; Sep 25, 2023 @ 2:24pm
Tiberius Sep 25, 2023 @ 3:07pm 
Originally posted by Azza ☠:
Originally posted by Citizen Cook:

Good grief. We’re way beyond that now for the reasons I said and you quoted but fail to comprehend.

Last time I counted, there were over 750 games supported so far, around 240 of them being DLSS 3.0+, and the number increases with each and every Nvidia game driver release. Previous ones keep improving over time too, as they are retrained 24/7 on an Nvidia AI upscaling server farm of supercomputers.

Both Deep Learning Super Sampling (DLSS) and Ray Tracing are being advanced greatly in the Nvidia RTX series, not only via hardware but via software on AI servers, with the results pumped back into driver updates, if you only care to install them and tend to play the latest games.

You can even rent an Nvidia enterprise server yourself to do deep learning training. Game developers can add to it themselves, if desired, and tweak the quality/performance.

Branched off from that is the Nvidia GeForce NOW cloud gaming service, where you can also rent streamed gaming from a high-end server with RTX 3080 graphics cards to your potato PC, if you have the bandwidth (45 Mbps) for up to 4K @ 120 FPS (max).

ahh my favorite fairytale: another alternate universe where 240+ games support dlss3.0
Tiberius Sep 25, 2023 @ 3:21pm 
literally the same ppl 3 months ago

https://youtu.be/rGBuMr4fh8w?si=7Hf1uqj9lseqM4QP

but yea.. dlss3.0++ is the bestest eva :steambored:
Last edited by Tiberius; Sep 25, 2023 @ 3:23pm
Citizen Cook Sep 26, 2023 @ 12:28am 
Originally posted by Tiberius:
literally the same ppl 3 months ago

https://youtu.be/rGBuMr4fh8w?si=7Hf1uqj9lseqM4QP

but yea.. dlss3.0++ is the bestest eva :steambored:

Wow, you should work for the government with spin like that.
The card is extremely overpriced and low on VRAM. That has nothing to do with DLSS 3.5 and the game-changing benefits it brings.
Tiberius Sep 26, 2023 @ 2:36am 
Originally posted by Citizen Cook:
Originally posted by Tiberius:
literally the same ppl 3 months ago

https://youtu.be/rGBuMr4fh8w?si=7Hf1uqj9lseqM4QP

but yea.. dlss3.0++ is the bestest eva :steambored:

Wow, you should work for the government with spin like that.
The card is extremely overpriced and low on VRAM. That has nothing to do with DLSS 3.5 and the game-changing benefits it brings.

It's as if they're trying to say that ultimately, rasterization performance is what matters, not some proprietary features that (very, very) few games have. There are only so many fake frames that can be made from 10 FPS gaming. 🤯

But then again, I'm talking to someone who thinks the gaming world has fully embraced frame generation. You should browse your own library and see how many games actually support it.
Citizen Cook Sep 26, 2023 @ 3:03am 
Originally posted by Tiberius:
Originally posted by Citizen Cook:

Wow, you should work for the government with spin like that.
The card is extremely overpriced and low on VRAM. That has nothing to do with DLSS 3.5 and the game-changing benefits it brings.

It's as if they're trying to say that ultimately, rasterization performance is what matters, not some proprietary features that (very, very) few games have. There are only so many fake frames that can be made from 10 FPS gaming. 🤯

But then again, I'm talking to someone who thinks the gaming world has fully embraced frame generation. You should browse your own library and see how many games actually support it.

Please, watch this video. Start at 11:37

https://www.youtube.com/watch?v=T_vbDx-sBg8&t=676s

You’ll see why their thinking is outdated.
Tiberius Sep 26, 2023 @ 3:39am 
Originally posted by Citizen Cook:
Originally posted by Tiberius:

It's as if they're trying to say that ultimately, rasterization performance is what matters, not some proprietary features that (very, very) few games have. There are only so many fake frames that can be made from 10 FPS gaming. 🤯

But then again, I'm talking to someone who thinks the gaming world has fully embraced frame generation. You should browse your own library and see how many games actually support it.

Please, watch this video. Start at 11:37

https://www.youtube.com/watch?v=T_vbDx-sBg8&t=676s

You’ll see why their thinking is outdated.


There are so many presumptions in that video. They talk about a feature that a game might or might not get, or an update (for a future DLSS version) that a game might or might not get (or even the card, i.e. DLSS 3.0 on RTX 2xxx & 3xxx). At that point, the reviews are just selling Nvidia promises.

Date Posted: Sep 23, 2023 @ 12:41am
Posts: 52