But sometimes, when you're asking a question that no one has bothered to answer, it raises questions about the value of the question, at least beyond satisfying random curiosity.
After all, if it doesn't lower VRAM usage, so what? DLSS probably isn't a way to say "ignore the minimum requirements for a game." And if it does, so what? You already had enough to run the game without DLSS.
After all, the specifics would depend on the game AND what level of DLSS you're using, so there wouldn't be a concrete answer. And the answer may end up being "it uses less, except when it doesn't."
If you have a DLSS-enabled card, what's stopping you from testing a few games to see if there are obvious differences that would make it an interesting enough subject to pursue further? It may just not be an interesting enough subject for most content producers.
DLSS exists to make rendering an image less computationally intensive.
With Nvidia being stingy about VRAM on their high-end GPUs, I thought this would've been a more popular topic...
You can kinda see that the common resolutions above 2560x1440, combined, have a smaller share than 2560x1440 itself, and even that is only about 1/6th of 1080p's share.
2560x1440 (which is what I run) = 3,686,400 pixels
3440x1440 = 4,953,600 pixels
Basically you're rendering ~34% more pixels than I am, about 2.39 times as many pixels as 1080p, and a lot more pixels than something like 92% of gamers. So it's not really a mainstream issue.
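If you want to sanity-check the math, here's the quick arithmetic (a throwaway Python snippet, using just the resolutions mentioned above):

    # Pixel counts for the resolutions discussed above
    resolutions = {
        "1920x1080": 1920 * 1080,   # 2,073,600
        "2560x1440": 2560 * 1440,   # 3,686,400
        "3440x1440": 3440 * 1440,   # 4,953,600
    }
    base = resolutions["2560x1440"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x of 2560x1440)")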
And perhaps running a 2070 at maybe a smidge above its weight class (for some games) isn't an appealing enthusiast issue. Should the 2070 have had more RAM? I dunno.
I went from a 1080 Ti 11GB to a 2080 Super 8GB and yeah, on some level, the idea of having less VRAM feels irksome. I had a GeForce 970 too, and yeah, the 3.5GB/500MB fast/slow RAM split felt irksome as well. But in either case I wasn't running at a high enough resolution or demanding enough games (I guess) for VRAM to be an issue, so it felt more like an edge-case problem: something people could complain about but would rarely be affected by.
And arguably the 2070's RAM amount would be sufficient for a majority of gamers. I didn't have RAM issues on my 2080 Super at 2560x1440, and maybe that's the line: 80% of people are running a resolution lower than that, a significant majority. But again, should the 2070 have had more RAM? I dunno for sure, and my experience carries some bias.
FH5 has yet to receive DLSS, although it's recently been rumored to be coming because of an Nvidia RTX 4000 marketing slide, while MSFS has just received its DLSS addition...
In FH5 I tried the current in-game resolution scaling and FPS definitely went up, same with MSFS and DLSS. However, I can't draw my own conclusion about the VRAM usage for either game. That's the thing: measuring VRAM usage is tricky because it fluctuates while the application runs... and I'm not professional enough to actually try to measure it...
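One crude way to at least log the fluctuation over a play session (a rough sketch, assuming an Nvidia card with nvidia-smi on the PATH; note this reports total allocated VRAM on the GPU, not what the game strictly needs):

    # Poll total VRAM usage once per second; run it during a DLSS-on
    # session and a DLSS-off session and compare the logs.
    import subprocess, time

    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
            capture_output=True, text=True,
        )
        print(time.strftime("%H:%M:%S"), out.stdout.strip(), "MiB")
        time.sleep(1)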
Also, from that I definitely do not see my VRAM usage being cut by any significant margin in either game, certainly not something noticeable from OSD tools, like going from 7+ GB to 5 GB or anything like that...
I agree that, in theory, the contemporary image upscaling and reconstruction tech in the PC gaming world should reduce games' VRAM usage. I just want to know if there's conclusive evidence for that, from both journalists and the PC gaming community...
I have used FSR scaling to keep certain games within the 2GB VRAM limit on a number of older laptops.
For example, if you get into the 4K texture range, the game still uses the 4K textures even with Ultra Performance, and that costs at least an almost equal amount of VRAM. I never tested it in detail, but I had instances where a ~2-million pixel count from DSR 3x combined with DLSS Ultra Performance (equal to 1080p) maxed out my 11GB of VRAM.
That's all I know so far.
There is no issue, and DLSS was never meant to be a viable way to reduce VRAM usage.
Full article: https://www.tweaktown.com/articles/9532/death-stranding-benchmarked-at-8k-dlss-gpu-cheat-codes/index.html
Note: DLSS doesn't affect the textures' resolution!
So VRAM won't drop much by using DLSS and other similar techniques.
2560 * 1440 * 4 = 14.7 MB. This is for a 32-bit ARGB screen buffer.
1920 * 1080 * 4 = 8.3 MB.
So a mere 6.4 MB reduction for a single screen buffer.
Games often use double/triple buffering, a Z-buffer, and several G-buffers, but this will still only result in maybe a 100 MB reduction or so, if not less. It depends on the game engine.
This is simplified a bit, because you also typically need other resolution-dependent buffers, for things like storing temporary data in the post-process effect chain. But yeah, the VRAM reduction is not significant at all.
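As a very rough illustration (Python, with made-up assumptions: 4 bytes per pixel and 6 resolution-dependent buffers; real engines differ, and with DLSS the final output/UI buffers stay at native size anyway):

    # Rough estimate of resolution-dependent buffer memory
    def buffers_mb(width, height, bytes_per_pixel=4, count=6):
        # count ~ swap-chain buffers + Z-buffer + a few G-buffers (assumption)
        return width * height * bytes_per_pixel * count / 1e6

    native = buffers_mb(2560, 1440)    # ~88 MB
    dlss_q = buffers_mb(1707, 960)     # DLSS Quality renders at ~2/3 per axis
    print(f"native-res buffers: {native:.0f} MB")
    print(f"render-res buffers: {dlss_q:.0f} MB")
    print(f"difference:         {native - dlss_q:.0f} MB")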
It is not that big. Without anti-aliasing, going from 1080p to 1440p is something like 100-200 MB. The problem is when you are using anti-aliasing like TXAA, SSAA (it is not a big problem at higher resolutions), MSAA, or TAA (not very expensive, but still).
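To put rough numbers on the MSAA part (a simplified sketch with assumed 4-byte color and depth formats; real engines and formats vary):

    # Rough MSAA cost for a single color + depth render target
    def msaa_mb(width, height, samples):
        color = width * height * 4 * samples
        depth = width * height * 4 * samples
        return (color + depth) / 1e6

    print(f"1440p, no MSAA: {msaa_mb(2560, 1440, 1):.0f} MB")   # ~29 MB
    print(f"1440p, 4x MSAA: {msaa_mb(2560, 1440, 4):.0f} MB")   # ~118 MB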