So basically it's a potentially useful technique for playing at 120-240 Hz. It can be combined with DLSS 2 to raise the base fps, but DLSS 2 isn't available in CS2 either. On top of that, DLSS 3 also requires a very significant amount of VRAM, which doesn't help here.
What would be great is DLSS 2. That's way better than regular upscaling or FSR 1 if you have an NVIDIA GPU.
Same.
I've used DLSS frame generation at 40-60 fps with no issues. It just means your input latency corresponds to roughly half the displayed framerate. Latency always increases as fps drops, and the input device and display can add more latency than DLSS itself.
As for VRAM, DLSS itself needs some, but you also free some up by rendering at a smaller internal resolution. The question is how well the game is optimized to use lower-resolution textures in that case.
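To put rough numbers on that (pure illustration, the fps values are made up, not measurements), here's a quick sketch of how latency tracks the base framerate when frame generation doubles the output:

```python
# Rough illustration (assumed fps values): with frame generation the
# input latency tracks the base framerate, i.e. half of what's displayed.

def frametime_ms(fps: float) -> float:
    """Milliseconds per frame at a given framerate."""
    return 1000.0 / fps

for displayed_fps in (80, 120, 240):
    base_fps = displayed_fps / 2  # frame gen roughly doubles the framerate
    print(f"{displayed_fps} fps displayed -> latency tracks {base_fps:.0f} fps "
          f"(~{frametime_ms(base_fps):.1f} ms per rendered frame)")
```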
I think you're confusing DLSS 2 with DLSS 3. With DLSS 3 you're not rendering at a lower internal resolution; that's DLSS 2. Internal resolution also has nothing to do with the texture resolution the game uses: you could render internally at 400p or lower and still use 4K textures.
Buffering alone doesn't increase latency. The time needed to interpolate a frame is nowhere near the time needed to render one, which makes it a negligible factor compared to the fps themselves.
You're confusing DLSS version numbers with the upscaling vs. frame generation features. This game has neither. Enabling both together might even result in a lower net VRAM cost.
DLSS 2 is AI-powered upscaling and DLSS 3 is frame generation. The first doesn't add significant latency and can reduce VRAM consumption because you're rendering at a lower resolution. The second increases latency and graphical artifacts, especially at lower framerates, and considerably increases VRAM consumption.
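To make the upscaling side concrete, here's a back-of-the-envelope sketch; the bytes-per-pixel and buffer counts below are assumptions for illustration, not actual engine figures:

```python
# Back-of-the-envelope sketch of the upscaling side: per-pixel render
# targets shrink with the internal resolution. Bytes per pixel and
# buffer count are assumed values, not real engine data.

def render_targets_mb(width, height, bytes_per_pixel=8, num_buffers=6):
    return width * height * bytes_per_pixel * num_buffers / 1024**2

native = render_targets_mb(3840, 2160)    # 4K native rendering
internal = render_targets_mb(2560, 1440)  # e.g. a "Quality"-style input res
print(f"4K native targets:  ~{native:.0f} MB")
print(f"1440p internal:     ~{internal:.0f} MB")
print(f"freed up:           ~{native - internal:.0f} MB "
      "(minus DLSS's own model and buffers)")
```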
You lose half a frametime if you try to pace the frames evenly. In theory, you have two frames available at the same moment the next frame would be available without frame gen, but the second (real) frame has to be delayed to make room for the interpolated one.
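Here's a timing sketch of that half-frametime argument, assuming a constant 50 fps base rate (the numbers are illustrative, not measured):

```python
# Timing sketch, assuming a constant 50 fps base (20 ms per rendered
# frame). The interpolated frame between N-1 and N can't be shown before
# frame N exists, so with evenly paced output every real frame ends up
# displayed half a base frametime late.

base_ms = 20.0  # 50 fps base render rate (assumed)
for n in range(1, 4):
    rendered = n * base_ms               # frame N finishes rendering here
    show_interp = rendered               # earliest slot for the in-between frame
    show_real = rendered + base_ms / 2   # frame N shown one output slot later
    print(f"frame {n}: done at {rendered:.0f} ms; interpolated frame shown at "
          f"{show_interp:.0f} ms, real frame at {show_real:.0f} ms "
          f"(+{base_ms / 2:.0f} ms late)")
```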
DLSS version 3.x includes both frame generation and upscaling. You're confusing the versions with the respective features they introduced; DLSS 2.x only offers upscaling. Sure, you can enable only upscaling and benefit from the lower VRAM use without enabling frame gen. But that neglects that VRAM isn't an end in itself: having less free VRAM will limit fps somewhat, while frame gen will roughly double the fps. It's highly unlikely that frame gen requires so much VRAM that the internal fps are cut in half.
You were right. I had understood DLSS 3 as only the frame generation part.
When I looked into frame generation (from NVIDIA and AMD) a while ago, the VRAM consumption from FG alone was quite significant, and the input lag increase was very noticeable at low base fps in a couple of games (Aveum and Forspoken).
So essentially, the fewer base frames you have, the worse frame generation gets, and on top of that it competes for VRAM with other uses, like custom assets and mods once those become available.