AMD recently made it open source, so there is a good chance for improvement, but in the new Avatar game the FSR upscaling and frame generation are about as good as DLSS, in my personal opinion.
Just like what's marketed as DLSS3, funny that.
People with a GTX 750 Ti are happy with FSR...
High-end GPUs do not mean infinite performance. So if you want more FPS, or to run a game on a 4K screen, or just to run a demanding game satisfactorily, then DLSS/FSR is a nice feature and a decent trade-off to achieve it. It's not just a thing for old/low-end cards.
You can only shrink a node and brute force your way to performance so far before you hit a wall, and the days of doing both of those are coming to an end.
Unless there's some huge architectural advancement in how GPU cores are designed and function in the next couple of generations that brings a massive uplift in raw compute, expect upscaling (and likely some sort of frame gen) to pick up the slack.
Even on a GTX 750 Ti, say you get 15-20 FPS without FSR; with frame generation you might get 60, but it still feels like crap. So it depends on the games you are going to play; it may be useful.
I have a high-end GPU, I was just curious why people care about such features.
Because it looks identical and sometimes even better (DLSS is unintentionally the best anti-aliasing of them all), and it uses less power, which is particularly important during summer.
To OP: try both, see which one gives more FPS and use that.
My 3080 still uses 380+W under load with or without DLSS enabled.
The CPU uses more power with DLSS enabled, since the game renders internally at a lower resolution, which raises the framerate and shifts more of the load onto the CPU before each frame is handed to the GPU to be upscaled and drawn.
I play on a TV at 4K resolution. I can only do that in Cyberpunk with ray tracing and all settings at ultra by using DLSS. And even then I can only hit 35 FPS.
I'd rather drop to 1440p, or to high settings, or turn off ray tracing to hit 60+ FPS, but to each their own I guess.
That's exactly what DLSS does. 4K with DLSS (Quality) performs about the same as 1440p and looks almost as good as real 4K. Although if the framerate is low even with DLSS, then he may need to use the DLSS (Performance) setting, which renders at ~1080p or maybe even less and may look blurry.
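For anyone curious where those numbers come from, here's a rough sketch of the arithmetic. The per-axis scale factors are the commonly cited DLSS preset defaults (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); individual games can use slightly different values, so treat this as an approximation, not the exact behavior of any one title.

```python
# Rough sketch: approximate internal render resolution for DLSS presets.
# Scale factors are the commonly cited per-axis defaults; games may differ.

PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(output_w: int, output_h: int, preset: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a given preset."""
    scale = PRESETS[preset]
    return round(output_w * scale), round(output_h * scale)

if __name__ == "__main__":
    for preset in PRESETS:
        w, h = internal_resolution(3840, 2160, preset)
        print(f"4K output, {preset}: renders at ~{w}x{h}")
    # 4K + Quality lands near 2560x1440 and Performance near 1920x1080,
    # which is why 4K DLSS Quality performs roughly like native 1440p.
```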