This topic has been locked
Napoleonic S Sep 29, 2022 @ 8:02am
Does DLSS reduce VRAM usage?
Compared to native res... I'm confused... I assume it should, since it makes your GPU render at a lower res... But I haven't seen any definitive answer on this matter.

Is there anyone on the internet that tested this extensively?
nullable Sep 29, 2022 @ 8:10am 
It seems plausible.

But sometimes, when you're asking a question that no one has bothered to answer, it raises questions about the value of the question. At least beyond the random curiosity of the thing.

After all, if it doesn't lower VRAM usage, so what? DLSS probably isn't a way to, say, ignore the minimum requirements for a game. And if it does, so what? You already had enough to run the game without DLSS.

After all, the specifics would depend on the game AND what level of DLSS you're using. So there wouldn't be a concrete answer. And the answer may end up being that it uses less, except when it doesn't.

If you have a DLSS-enabled card, what's stopping you from testing a few games to see if you can spot obvious differences that would make it an interesting enough subject to pursue further? It may just not be an interesting enough subject for most content producers.
Last edited by nullable; Sep 29, 2022 @ 8:12am
Cathulhu Sep 29, 2022 @ 9:05am 
Theoretically it should reduce VRAM usage, but the reduction should be negligible compared to the other stuff that requires VRAM.

DLSS exists to make rendering an image less computationally intensive.
Napoleonic S Sep 29, 2022 @ 5:42pm 
Originally posted by Snakub Plissken:
It seems plausible.

But sometimes, when you're asking a question that no one has bothered to answer, it raises questions about the value of the question. At least beyond the random curiosity of the thing.

After all, if it doesn't lower VRAM usage, so what? DLSS probably isn't a way to, say, ignore the minimum requirements for a game. And if it does, so what? You already had enough to run the game without DLSS.

After all, the specifics would depend on the game AND what level of DLSS you're using. So there wouldn't be a concrete answer. And the answer may end up being that it uses less, except when it doesn't.

If you have a DLSS-enabled card, what's stopping you from testing a few games to see if you can spot obvious differences that would make it an interesting enough subject to pursue further? It may just not be an interesting enough subject for most content producers.
Well the thing is, I often hit my 8 GB VRAM wall with my 2070 Super playing at 3440x1440, so I wonder if using DLSS would lower the VRAM usage in games that support it. I think it lowers it a bit, although I didn't really pay attention that much; after all, I'm trying to play games, not benchmark them...

With Nvidia being stingy about VRAM on their high-end GPUs, I thought this would've been a more popular topic...
nullable Sep 29, 2022 @ 6:43pm 
Well if you look at the primary display resolution section, https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

You can kinda see that the common resolutions above 2560x1440, combined, have less share than 2560x1440 itself, and even that is about 1/6th of 1080p's share.

2560x1440 (which is what I run) = 3,686,400 pixels

3440x1440 = 4,953,600 pixels

Basically you're rendering ~34% more pixels than I am, and 2.39 times as many pixels as 1080p. And basically a lot more pixels than something like 92% of gamers. So it's not really a mainstream issue.

And perhaps running a 2070 at maybe a smidge above its weight class (for some games) isn't an appealing enthusiast issue. Should the 2070 have had more RAM? I dunno.

I went from a 1080 Ti 11GB to a 2080 Super 8GB and yeah, on some level, the idea of having less VRAM feels irksome. I had a GeForce 970 too, and yeah, the 3.5GB/500MB fast/slow RAM split felt irksome as well. But in either case I wasn't running at a high enough resolution or demanding enough games (I guess) for VRAM to be an issue, so it felt more like an edge-case problem. Something people could complain about but would rarely be affected by.

And arguably the 2070's RAM amount would be sufficient for a majority of gamers. I didn't have RAM issues on my 2080 Super at 2560x1440 and maybe that's the line, 80% of people are running a resolution lower than that, a significant majority. But again, should the 2070 have had more RAM? I dunno for sure, and my experience presents some biases.
Last edited by nullable; Sep 29, 2022 @ 6:46pm
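If anyone wants to reproduce that pixel arithmetic, here's a minimal sketch in Python; the resolution list just mirrors the ones mentioned above.

```python
# Pixel counts and ratios for the resolutions discussed above.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "UW 1440p (3440x1440)": (3440, 1440),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["1080p (1920x1080)"]

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / base:.2f}x 1080p)")

# 3440x1440 vs 2560x1440: ~1.34x, i.e. roughly a third more pixels to shade.
print(pixels["UW 1440p (3440x1440)"] / pixels["1440p (2560x1440)"])
```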
Napoleonic S Sep 29, 2022 @ 8:32pm 
Originally posted by Snakub Plissken:
Well if you look at the primary display resolution section, https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

You can kinda see that the common resolutions above 2560x1440, combined, have less share than 2560x1440 itself, and even that is about 1/6th of 1080p's share.

2560x1440 (which is what I run) = 3,686,400 pixels

3440x1440 = 4,953,600 pixels

Basically you're rendering ~34% more pixels than I am, and 2.39 times as many pixels as 1080p. And basically a lot more pixels than something like 92% of gamers. So it's not really a mainstream issue.

And perhaps running a 2070 at maybe a smidge above its weight class (for some games) isn't an appealing enthusiast issue. Should the 2070 have had more RAM? I dunno.

I went from a 1080 Ti 11GB to a 2080 Super 8GB and yeah, on some level, the idea of having less VRAM feels irksome. I had a GeForce 970 too, and yeah, the 3.5GB/500MB fast/slow RAM split felt irksome as well. But in either case I wasn't running at a high enough resolution or demanding enough games (I guess) for VRAM to be an issue, so it felt more like an edge-case problem. Something people could complain about but would rarely be affected by.

And arguably the 2070's RAM amount would be sufficient for a majority of gamers. I didn't have RAM issues on my 2080 Super at 2560x1440 and maybe that's the line, 80% of people are running a resolution lower than that, a significant majority. But again, should the 2070 have had more RAM? I dunno for sure, and my experience presents some biases.
Well, 8GB is pushing the limits for some games like Forza Horizon 5 and Flight Simulator 2020 for me, and I'm not even trying to run either game at max graphics settings...

FH5 has yet to receive DLSS, although it's recently been rumored to be coming thanks to an Nvidia RTX 4000 marketing slide, while MSFS has just received its DLSS addition...

In FH5 I tried the current in-game res scaling and the FPS definitely went up, same with DLSS in MSFS. However, I can't draw my own conclusion about the VRAM usage in either game. That's the thing: measuring VRAM usage is tricky because it fluctuates while the application is running... And I'm not professional enough to actually try to measure it...

That said, I definitely did not see my VRAM usage cut by any significant margin in either game, certainly nothing noticeable from OSD tools, like going from 7+ GB down to 5 GB or something like that...

I agree that, in theory, the contemporary image upscaling and reconstruction tech in the PC gaming world should reduce the VRAM usage of these games. I just want to know if there's conclusive evidence for that, from both journalists and the PC gaming community...
Last edited by Napoleonic S; Sep 29, 2022 @ 8:34pm
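On the "VRAM fluctuates and is hard to pin down" point: one way to get numbers you can actually compare is to log them over a play session. A minimal sketch using the nvidia-ml-py (pynvml) bindings, assuming the game runs on GPU index 0 and a 1-second sample interval; note that both the device-level figure and the per-process figure are allocations, not memory the game strictly needs.

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: game is on GPU 0

try:
    while True:
        # Whole-device memory (used = allocated by all processes + driver).
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        line = f"device: {mem.used / 2**20:,.0f} / {mem.total / 2**20:,.0f} MiB"

        # Per-process allocations (may be unreported on some drivers/OSes).
        for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            if proc.usedGpuMemory is not None:
                line += f" | pid {proc.pid}: {proc.usedGpuMemory / 2**20:,.0f} MiB"

        print(line)
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```

Run it in the background, toggle DLSS on and off at the same spot in the game, and compare the logged peaks rather than single readings.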
xSOSxHawkens Sep 30, 2022 @ 1:05am 
So I cannot speak for DLSS, but FSR does indeed reduce VRAM use when enabled, with the more aggressive modes (Performance) showing the biggest trim in VRAM versus the higher-quality modes, which only trim a bit.

I have used FSR scaling to keep certain games within the 2GB VRAM limit on a number of older laptops.
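For a feel of how much smaller the internal render target gets per preset, here's a small sketch using the per-axis scale factors published for FSR 2 and DLSS 2 (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); individual games can and do deviate from these.

```python
# Internal render resolution implied by common upscaler presets.
# Per-axis scale factors as published for FSR 2 / DLSS 2; games may differ.
PRESET_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    scale = PRESET_SCALE[preset]
    return round(out_w / scale), round(out_h / scale)

for preset in PRESET_SCALE:
    w, h = internal_resolution(3440, 1440, preset)
    print(f"{preset}: renders at {w}x{h} for a 3440x1440 output")
```

Only the resolution-dependent buffers shrink with that internal resolution; textures and meshes stay the same size, which is why the VRAM savings are modest.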
Soulreaver Sep 30, 2022 @ 1:12am 
If it does reduce VRAM, it doesn't seem to be noteworthy.

For example, if you get into the 4K texture range, it still uses the 4K textures even with Ultra Performance, and they cost at least an almost equal amount of VRAM. I never tested it in detail, but I had instances where a ~2 million pixel count with DSR 3x and DLSS Ultra Performance (equal to 1080p) maxed out my 11GB of VRAM.

That's all I know so far.
Last edited by Soulreaver; Sep 30, 2022 @ 1:15am
Napoleonic S Oct 2, 2022 @ 7:51pm 
Originally posted by Soulreaver:
If it does reduce VRAM, it doesn't seem to be noteworthy.

For example, if you get into the 4K texture range, it still uses the 4K textures even with Ultra Performance, and they cost at least an almost equal amount of VRAM. I never tested it in detail, but I had instances where a ~2 million pixel count with DSR 3x and DLSS Ultra Performance (equal to 1080p) maxed out my 11GB of VRAM.

That's all I know so far.
Weird, isn't it? Now if only we could raise awareness of this "issue" with the mainstream media... so that they can do proper tests and even explain what actually happens between DLSS and VRAM usage.
Cathulhu Oct 3, 2022 @ 12:34am 
What "issue"?
There is no issue, and DLSS was never meant to be a way to reduce VRAM usage.
Komarimaru Oct 3, 2022 @ 2:09am 
Curiously, are you confusing the game allocating VRAM with actual usage? You can get RTSS through Afterburner to show true memory usage by enabling the GPU.DLL plugin (the three dots above the monitoring choices), then enable monitoring for Allocated vs Used.
Mittens Oct 3, 2022 @ 4:55am 
Rendering at a lower internal resolution only reduces the screen buffers, G-buffers and such, but the vast majority of VRAM usage is from textures and meshes. And textures are not dependent on rendering resolution.
So VRAM won't drop much by using DLSS and other similar techniques.

2560 * 1440 * 4 = 14.7 MB. This is for a 32-bit ARGB screen buffer.
1920 * 1080 * 4 = 8.3 MB.
So a mere 6.4 MB reduction for a single screen buffer.
Games often use double/triple buffering, a Z-buffer and several G-buffers - but this will still only result in maybe a 100 MB reduction or so, if not less. Depends on the game engine.

This is simplified a bit, because you also typically need other resolution-dependent buffers, for stuff like storing temporary data when doing the post-process effect chain and such. But yeah... the VRAM reduction is not significant at all.
Last edited by Mittens; Oct 3, 2022 @ 5:02am
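A minimal sketch of the buffer math above, assuming 4 bytes per pixel for every target and an illustrative (made-up) count of ten resolution-dependent render targets; real engines vary a lot, this only shows the order of magnitude.

```python
BYTES_PER_PIXEL = 4  # 32-bit ARGB, as in the example above

def buffers_mb(width: int, height: int, num_targets: int) -> float:
    """Rough size of num_targets full-resolution render targets, in MB."""
    return width * height * BYTES_PER_PIXEL * num_targets / 1e6

# Illustrative: 3 swap-chain buffers + 1 depth + 6 G-buffer/post targets.
NUM_TARGETS = 10

native = buffers_mb(2560, 1440, NUM_TARGETS)    # ~147 MB at native 1440p
internal = buffers_mb(1920, 1080, NUM_TARGETS)  # ~83 MB at a 1080p internal res
print(f"native: {native:.0f} MB, internal: {internal:.0f} MB, "
      f"saved: {native - internal:.0f} MB")     # ~64 MB, vs gigabytes of textures
```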
Napoleonic S Oct 3, 2022 @ 6:07am 
Originally posted by Mittens:
Rendering at a lower internal resolution only reduces the screen buffers, G-buffers and such, but the vast majority of VRAM usage is from textures and meshes. And textures are not dependent on rendering resolution.
So VRAM won't drop much by using DLSS and other similar techniques.

2560 * 1440 * 4 = 14.7 MB. This is for a 32-bit ARGB screen buffer.
1920 * 1080 * 4 = 8.3 MB.
So a mere 6.4 MB reduction for a single screen buffer.
Games often use double/triple buffering, a Z-buffer and several G-buffers - but this will still only result in maybe a 100 MB reduction or so, if not less. Depends on the game engine.

This is simplified a bit, because you also typically need other resolution-dependent buffers, for stuff like storing temporary data when doing the post-process effect chain and such. But yeah... the VRAM reduction is not significant at all.
Thanks for the explanation, but then why have GPU reviewers shown us for decades now that increasing the rendering resolution from sub-HD res to 4K or even 8K results in a significant increase in VRAM usage?
A&A Oct 3, 2022 @ 6:41am 
Originally posted by Mittens:
2560 * 1440 * 4 = 14.7 MB. This is for a 32-bit ARGB screen buffer.
1920 * 1080 * 4 = 8.3 MB.
So a mere 6.4 MB reduction for a single screen buffer.
What do you mean by * 4?

Originally posted by Napoleonic S:
why have GPU reviewers shown us for decades now that increasing the rendering resolution from sub-HD res to 4K or even 8K results in a significant increase in VRAM usage?
It is not that big. Without anti-aliasing, going from 1080p to 1440p is something like 100-200 MB. The problem is when you are using anti-aliasing like TXAA, SSAA (not a big problem at higher resolutions), MSAA, or TAA (not very expensive, but still).
Last edited by A&A; Oct 3, 2022 @ 7:07am
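To illustrate why MSAA/SSAA is where resolution-dependent memory actually starts to hurt, here's a rough sketch that scales the same single-buffer math by the sample or supersample count; real costs depend on compression and the engine, so treat the numbers as ballpark only.

```python
BYTES_PER_PIXEL = 4  # 32-bit color; a depth buffer would add a similar amount

def aa_buffer_mb(width: int, height: int, samples: int) -> float:
    """Uncompressed color buffer size with `samples` samples per pixel, in MB."""
    return width * height * samples * BYTES_PER_PIXEL / 1e6

for label, samples in [("no AA", 1), ("4x MSAA", 4), ("8x MSAA", 8), ("2x2 SSAA", 4)]:
    print(f"1440p {label}: {aa_buffer_mb(2560, 1440, samples):.0f} MB per color target")
# no AA: ~15 MB, 4x: ~59 MB, 8x: ~118 MB -- and that multiplies across every
# render target that carries the extra samples.
```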
Mittens Oct 3, 2022 @ 8:06am 
32 bits / 8 = 4 bytes per pixel.
