I've seen issues in earlier versions of the game where the GPU would underclock because performance was cut in half whenever entering combat, caused by a bug with the Target Analysis mod for Kiroshi optics combined with passive effects from quickhacks like legendary-grade Cripple Movement.
I mean, how else should the GPU handle it?
Usually it would adjust by running at a higher FPS, and if it can't raise FPS because of a limit, it adjusts by reducing clock speeds instead. But 500 MHz clock jumps are probably very stressful for the GPU, hence the crash.
Haven't done further testing yet, but I understand it's a very niche problem.
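The 500 MHz jumps described above can be spotted from a log of clock readings. A minimal sketch, assuming `nvidia-smi` is on your PATH (its query flags below are real); the `detect_jumps` helper and the 500 MHz threshold are illustrations based on this post, not an official tool:

```python
# Sketch: poll the GPU core clock and flag large jumps between samples.
import subprocess

def detect_jumps(samples, threshold_mhz=500):
    """Return (index, delta) pairs where the clock moved >= threshold_mhz
    between consecutive samples."""
    jumps = []
    for i in range(1, len(samples)):
        delta = samples[i] - samples[i - 1]
        if abs(delta) >= threshold_mhz:
            jumps.append((i, delta))
    return jumps

def read_sm_clock_mhz():
    """Read the current SM clock in MHz (requires an NVIDIA GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.sm", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Offline demo on a canned trace (MHz); a live run would instead append
    # read_sm_clock_mhz() to the list once per second in a loop.
    trace = [1800, 1790, 1200, 1750]
    print(detect_jumps(trace))  # [(2, -590), (3, 550)]
```

A jump report like `(2, -590)` means the clock dropped 590 MHz between samples 1 and 2, which matches the kind of swing the poster blames for the crash.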
Within any game, set its FPS limit to unlimited and disable its VSync. The exception is Rockstar games, which are bugged such that the game might act up with in-game VSync off, often where cut-scenes are involved.
In the NVIDIA Control Panel...
Max Frame Rate = set to what you want for games, e.g. 60 for a 60 Hz display. If you have a 144 Hz display but can't hold a steady 140+ FPS all the time, set the max to 72. Pair this with VSync set to Adaptive or Fast, or enable G-Sync if you have a G-Sync or FreeSync capable display.
Background Max Frame Rate = set this to 30, which should be fine as you usually won't notice.
Low Latency Mode = set this to On. But if you use G-Sync, set this to Ultra.
Threaded Optimization = On
Triple Buffering = try it On vs. Off and see what works best. This only applies to OpenGL/Vulkan games.
Shader Cache = On. However, if you are doing a lot of testing, benchmarks and such, turn this off. Also, if a game keeps crashing shortly after launch, the cause can be as simple as a corrupted on-disk GPU shader cache; it is usually stored in an NVIDIA folder within the AppData structure. And since NVIDIA has this option, it is a good idea to disable the shader cache option in the Steam client as well.
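On the shader-cache point: before deleting anything, it helps to see what is actually in the cache folder. A dry-run sketch; the Windows location mentioned in the comment is an assumption about common NVIDIA cache paths, so the runnable demo uses a throwaway folder instead:

```python
# Sketch: inspect (and optionally clear) an on-disk shader cache folder.
# On Windows the DirectX shader cache is commonly found under
# %LOCALAPPDATA%\NVIDIA\DXCache -- treat that path as an assumption and
# verify it on your own system before clearing anything for real.
import tempfile
from pathlib import Path

def cache_summary(cache_dir):
    """Return (file_count, total_bytes) for every file under cache_dir."""
    files = [p for p in Path(cache_dir).rglob("*") if p.is_file()]
    return len(files), sum(p.stat().st_size for p in files)

def clear_cache(cache_dir, dry_run=True):
    """Delete cache files; with dry_run=True only count what would go."""
    removed = 0
    for p in Path(cache_dir).rglob("*"):
        if p.is_file():
            if not dry_run:
                p.unlink()
            removed += 1
    return removed

# Self-contained demo on a throwaway folder so the sketch runs anywhere:
demo = Path(tempfile.mkdtemp())
for name in ("a.bin", "b.bin", "c.bin"):
    (demo / name).write_bytes(b"\0" * 1024)
demo_count, demo_bytes = cache_summary(demo)
would_remove = clear_cache(demo, dry_run=True)
print(demo_count, demo_bytes, would_remove)  # 3 3072 3
```

Drivers rebuild the cache automatically on the next launch, so clearing it mainly costs you some shader recompilation stutter the first time around.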
I don't think this solution fits my problem.
Answer: set Power Management Mode to Prefer Maximum Performance.
I'm not sure if this is what you want. You don't want the clocks to fluctuate so much, correct? You want your GPU to operate at its highest boost clock whenever possible? I would toggle it "On" if setting Power Management Mode to Prefer Maximum Performance doesn't do what you want.
Personally, when gaming, especially in a game like Cyberpunk, I want my GPU running at its optimal boost clock. Toggling the Boost Clock option "On" in Precision does exactly that.
The only way to prevent it from idling is to turn off any form of VSync.
There is no difference there, and having RivaTuner do it has been pointless ever since NVIDIA added the same feature to the NVCP.
But yes, in the NVIDIA CP, just like in the AMD Radeon software, I can honestly say the defaults stink. If you care about gaming even a tiny bit, you want to change those settings. Power Management should always be set to Prefer Maximum Performance. If you allow games to use any other power mode, it can cause all sorts of stutter issues when the GPU power and/or clocks change drastically and quickly, especially if the GPU keeps jumping between very different voltages and clocks in a constant up-down, up-down situation.
If for some reason your GPU is drawing more watts than it really needs while using "Prefer Max Performance", simply bring up MSI Afterburner and lower the Power Limit %.
It's really that simple.
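For the Afterburner step, the Power Limit slider is just a percentage of the board's default power limit, so the resulting wattage cap is easy to compute. A sketch, assuming `nvidia-smi` is available (its flags below are real); `power_cap_watts` is a made-up helper name and the 250 W figure is only an example, not your card's value:

```python
# Sketch: translate an Afterburner-style "Power Limit %" slider setting
# into an absolute wattage cap.
import subprocess

def power_cap_watts(default_limit_w, percent):
    """Afterburner's slider is a percentage of the default power limit."""
    if not 0 < percent <= 200:
        raise ValueError("percent outside a plausible slider range")
    return round(default_limit_w * percent / 100, 1)

def query_default_limit_w():
    """Read the board's default power limit in watts (needs an NVIDIA GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.default_limit",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

# Example: a 250 W card with the slider pulled down to 80%:
print(power_cap_watts(250, 80))  # 200.0
# Applying that cap directly would be: nvidia-smi -pl 200  (needs admin rights)
```

The point of the thread holds either way: Prefer Max Performance keeps the clocks pinned, and the power limit is the knob for reining in wattage afterwards.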
I think I love you now :D
Back when I first started reading about performance mode, I always read things akin to
"it helps the system better use its resources", which is why I never paid attention to it.
That it holds boost clocks is a game changer. xD
Thanks man.
I know RivaTuner produces flatline frametimes, but for me the Control Panel is fine too.
I'm not too sensitive to frametimes.
The adaptive or power-saving modes are fine for non-gaming stuff. But for games, where your FPS and the overall smoothness of the gameplay experience matter, or when doing 3D rendering, benchmarks, or video encoding, you will want the NVIDIA CP set to Prefer Maximum Performance.
Again, you can do that on the Global tab and just adjust over time as you see fit, or configure it on an app-by-app basis on the Program tab.
Yeah, that's why I said it makes total sense to me that it behaves that way.
I mean, it still does that; just instead of using clocks and wattages, it now uses only wattages.
Might be that RivaTuner is better. I don't use the Riva overlay, so I don't see my frametimes. I'm pragmatic: if it feels smooth, I'm fine. :D
One day I'll probably research the ♥♥♥♥ out of frametimes and how they exactly work, but not in the near future.