My gut tells me that since Nvidia rides hard on CP2077's coattails to sell tons of hardware, and they've made massive changes to their proprietary tech for the worse in this game, they're most likely ready to hype up brand new shiny GPUs that solve it all, yet again.
RT is awful in most games: at worst it kills performance and adds a ♥♥♥♥♥♥ bloom effect, at best it gives some marginal graphical enhancements that you're not very likely to ever actually notice in any action-oriented game. Now, if you play games as a walking sim and just gawk at graphics, sure, I bet it looks cool.
For actual gamers, I really don't understand what the fuss is all about.
For what it's worth, a lot of this ish' is marketing nonsense, like so many other flashy gizmos and fly-by-night features GPU manufacturers put out to entice potential buyers. Ray tracing isn't some new magical thing; the concept originates as far back as the 16th century, and it has been used in film for a while too.
"For decades, global illumination in major films using computer-generated imagery was faked with additional lights. Ray tracing-based rendering eventually changed that by enabling physically-based light transport. Early feature films rendered entirely using path tracing include Monster House (2006), Cloudy with a Chance of Meatballs (2009),[14] and Monsters University (2013).[15]"
https://en.wikipedia.org/wiki/Ray_tracing_(graphics)
It's a tool for devs to use, but what we've got now is a marketing buzzword being shoehorned into computing environments that still struggle to run it at any reasonable framerate. This is nothing new, and it has nothing to do with the last patch.
Anyway, just because you can't get good perf with RT is no reason to get all salty about it. It's also not a twitch shooter where you need 100500 fps; it was designed to play fine on consoles at 30 fps, after all.
Either way, CP77 looks great with or without RT, and has a bunch of settings that scale well enough for anyone to get the experience they want out of it.
I think part of the issue is that at Medium/partial settings (and without an HDR monitor), RT is useless.
Sure, if you can push everything to max + path tracing + have a great monitor, it might be great. But not everyone has that.
(I've got a basic 1080p monitor and an RX 6650 XT. I can turn on a couple of RT effects at medium with decent performance. But I need to take on/off screenshots and swap between them several times just to try to see what the difference is.)
rt is new tech for gpus, and to get decent perf/quality you need a cutting-edge gpu right now, yes
but that has always been the case for graphics; people simply started to forget during the xbox360/xb1 generations, when there was barely any advance for over a decade (rare exceptions on pc don't count). on the other hand, the tech is catching up fast: lower-mid tier nvidia gpus are already on par with the first-gen rt hardware, and now that all three major vendors are invested, it'll progress even faster
DLSS (and the other upscaler options offered) improves performance. It will not worsen performance unless you have some other issue taking place. There is an occasional issue in the game where, after changing settings repeatedly, performance takes a hit and you'll need to save and reload before it returns to normal. But DLSS should never worsen performance outside of something like that. It exists precisely to provide better performance than native by rendering at a lower internal resolution and then using AI to upscale the image.
DLAA hits performance harder than DLSS, as it uses the same sort of AI technique but without the reduced internal resolution. This means it can provide high-quality antialiasing and better image quality in general, but at a performance cost.
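To make that trade-off concrete, here's a minimal sketch (Python, purely illustrative) of how the internal render resolution relates to the output resolution in each mode. The per-axis scale factors are the commonly cited approximate values, not official numbers; note DLAA is just the same pipeline with no downscaling:

    # Why DLSS gains performance and DLAA costs some: the internal render
    # resolution, which dominates shading cost, is scaled down before the
    # AI pass upscales back to the output resolution.
    # Per-axis scale factors below are illustrative approximations.
    SCALE = {
        "DLAA":                   1.0,    # native internal res: best AA, no perf win
        "DLSS Quality":           0.667,
        "DLSS Balanced":          0.58,
        "DLSS Performance":       0.5,
        "DLSS Ultra Performance": 0.333,
    }

    def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
        """Resolution the GPU actually shades before the upscale pass."""
        s = SCALE[mode]
        return round(out_w * s), round(out_h * s)

    for mode in SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        relative_cost = (w * h) / (3840 * 2160)  # shading cost ~ pixel count
        print(f"{mode:23s} -> {w}x{h}  (~{relative_cost:.0%} of native shading cost)")

Running it for a 4K output shows why "Performance" mode is so much cheaper: it shades only about a quarter of the native pixels.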
Frame generation, on the GPUs it's available for, in theory improves framerate and motion clarity when upscaling tech like DLSS, FSR, etc. is enabled, although your mileage may vary depending on your hardware and even things like variable refresh rate display support.
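As a rough sketch of what frame generation changes (assuming one generated frame between every pair of rendered frames, as in DLSS 3 Frame Generation; the numbers are illustrative):

    # Frame generation doubles the presented framerate, but input is only
    # sampled for the real rendered frames, so responsiveness tracks the
    # rendered rate, not the presented one.
    def with_frame_generation(rendered_fps: float) -> tuple[float, float]:
        """Return (presented_fps, input_sample_rate) under the 2x assumption."""
        presented_fps = rendered_fps * 2    # one generated frame per rendered frame
        input_sample_rate = rendered_fps    # input read only on real frames
        return presented_fps, input_sample_rate

    for fps in (30, 45, 60):
        presented, sampled = with_frame_generation(fps)
        print(f"rendered {fps} fps -> presented {presented:.0f} fps, "
              f"input still sampled at ~{sampled:.0f} Hz")

This is why frame generation feels best when the base framerate is already decent: the motion looks smoother, but the game only responds at the rendered rate.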
If you do encounter performance hits that don't make sense given the above, you may have some other issue and may benefit from saving and restarting. But generally speaking, enabling DLSS should always confer better performance than native, not worse.
But I would rather increase the resolution than use RT/PT (except for photos). I see a big difference between upscaled and native (shimmering).