I trust you; your name makes me trust you on this matter.
You've got it wrong. It's simply this: where companies used to aim for 30 FPS without scaling, they now aim for 30 FPS with DLSS/FSR. You didn't really think devs would do more than that, as they should, did you?
You're welcome.
On consoles, maybe, sure... But if you are already getting 60 FPS native without DLSS, then you turn DLSS on in quality mode, you get more FPS and the image quality is basically the same. What's not to like...?
Or if your PC is a little older/weaker and you are struggling to hit 60 FPS, again, turn on DLSS and now you might get 60 FPS. How is that a bad thing...?
It's not going away anytime soon, so why not embrace it...?
And that's again not correct. DLSS means scaling. I'm a graphic designer, and I learned very quickly that any form of upscaling looks worse than native resolution. Maybe I'm too old to understand how DLSS upscaling supposedly enhances the image, because I start thinking, "why don't they do this at native resolution to make it look crisper?" In my opinion it's the same dream bubble as back in the day, when they were trying to sell us 30 FPS as the better option, a "cinematic experience". In the end, it's all advertising.
Upscaling/downscaling is NOT more detail and never will be. Anyone in graphic design or graphics programming understands this. What it allows companies to do is market that their card can do 4K at good FPS using filtering techniques. But if you read between the lines, what it means is: our card cannot handle native 4K, so we use gimmicks to make it look like it can. Then the marketing department uses advertising speak and gaslighting to sell it as a better feature than native 4K. It isn't. It does allow a card that cannot handle a certain resolution to handle it via 'tricks', but it is not better.
It does allow cards to handle resolutions they otherwise might not, because they can't handle the native resolution, so parts of the frame buffer are rendered below pure native resolution. If DLSS changes texels off in the distance, you probably won't notice at all. But if you use DLSS on any kind of text display right in front of you, such as those in Cyberpunk, you may be unable to read them because of the filtering and blurring. Is it a really big deal? Nope. Use it if it helps and you can handle the change in image quality. My eyes cannot: I see portions of the frame buffer popping in and out of various levels of detail and pixel resolution, and it bothers me.
But people think it gives better FPS because of better detail. It does give better FPS, but because of less detail, applied to strategic portions of the frame buffer. Most of the time people might not notice it at all; it also depends on how the game uses it. I find it quite jarring in most games and cannot use it. Of course I have an AMD card, so DLSS isn't an option for me, but I also can't use FSR. I try to turn off all the frame gimmicks and just run the game as is, including motion blur and depth of field.
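To make the "filtering, not detail" point concrete, here is a minimal sketch of plain bilinear upscaling in Python. It has nothing to do with DLSS's actual neural network; it only illustrates the basic fact being argued: every output pixel of a spatial upscaler is a weighted average of existing input pixels, so no value outside the source data can appear.

```python
# Minimal sketch (NOT DLSS): plain bilinear upscaling interpolates
# between existing pixels, so it cannot invent detail that was never
# rendered at the lower internal resolution.

def bilinear_upscale(img, out_w, out_h):
    """img: list of rows of grayscale floats; returns upscaled image."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for y in range(out_h):
        # Map the output coordinate back into the source image.
        sy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
        y0 = int(sy)
        y1 = min(y0 + 1, in_h - 1)
        fy = sy - y0
        row = []
        for x in range(out_w):
            sx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            x0 = int(sx)
            x1 = min(x0 + 1, in_w - 1)
            fx = sx - x0
            # Weighted average of the four surrounding source pixels:
            # every output value lies between existing input values.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

small = [[0.0, 1.0],
         [1.0, 0.0]]
big = bilinear_upscale(small, 4, 4)
flat = [v for row in big for v in row]
# All upscaled values stay within the range of the source pixels.
print(min(flat), max(flat))  # prints: 0.0 1.0
```

(DLSS adds temporal history and a learned reconstruction on top of this idea, but the spatial-interpolation limitation sketched here is what the post above is describing.)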
I never said it looks better or even the same. I said you can hardly tell the difference, or at least I can't, and that's using DLSS 2.0 on quality mode... And I know what DLSS does, lol... Perhaps re-read what I actually wrote before replying again...
Sorry, I didn't mean that you don't understand. I just caught that phrase and responded quickly, tagging you, but I had different things in mind.
P.S. Yeah, better go back to playing GTA V for another 1.2k hours...
Anyway, regarding DLSS's ability to produce results more detailed than the native image: it absolutely can, and it does so frequently. I literally provided a real example a couple of posts ago.
First off, they do. It's called DLAA (for whatever reason), which is literally DLSS rendering at native resolution internally. And secondly, there are a bunch of considerations stemming from the temporal nature of DLSS, where quality increases with FPS, making it a balancing act. In fact, in certain cases you can paradoxically get a decrease in overall quality despite a higher internal resolution, depending on the frame's raw computational complexity. Not to mention that higher performance is simply always desirable.
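The "temporal nature" point can be sketched in code. The following is a hypothetical toy model, not Nvidia's actual algorithm: temporal upscalers render each frame at low resolution with a different sub-pixel jitter, so successive frames sample scene positions that any single frame misses. Accumulating frames therefore pins down scene features (here, a hard edge in a 1-D "scene") more precisely, which is one reason reconstruction quality scales with frame rate.

```python
# Hypothetical sketch (NOT Nvidia's actual algorithm) of jittered
# temporal accumulation: more frames = more distinct sample positions
# = more information about the underlying scene.

def scene(x):
    """'Ground truth' being rendered: a hard edge at x = 0.5."""
    return 1.0 if x >= 0.5 else 0.0

def render_frame(jitter, samples=4):
    """One low-res frame: (position, value) pairs at jittered centres."""
    xs = [(i + 0.5 + jitter) / samples for i in range(samples)]
    return [(x, scene(x)) for x in xs]

def edge_bracket(sampled):
    """Width of the interval the samples pin the edge down to."""
    zeros = [x for x, v in sampled if v == 0.0]
    ones = [x for x, v in sampled if v == 1.0]
    return min(ones) - max(zeros)

one_frame = render_frame(0.0)

# Accumulate four frames, each with a distinct sub-pixel jitter,
# as a temporal upscaler's history buffer effectively does.
accumulated = []
for j in (-0.375, -0.125, 0.125, 0.375):
    accumulated.extend(render_frame(j))

print(edge_bracket(one_frame))    # prints: 0.25   (edge known to within 1/4)
print(edge_bracket(accumulated))  # prints: 0.0625 (edge known to within 1/16)
```

This is also why the balancing act mentioned above exists: the accumulated history is only valid while successive frames agree, so low frame rates (or fast motion) leave fewer usable samples and the reconstruction degrades.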
It doesn't matter what you can see; it matters what others can see. I never pretend I can look through another person's eyes. When I was young, my vision was 20/10, as tested by a doctor. I can and do enjoy playing video games at 30-60 FPS just fine; a 60 Hz monitor is good enough for me.
DLSS in The Witcher 3 Next-Gen update made a very noticeable improvement in that game's visuals, especially set to Quality and using RT. The low-FPS dips (spikes below 30 FPS) were fewer, so FPS in the 50-60 range (I had it capped at 60 FPS) was clearly much more consistent.