You're right though, in most games I get better FPS with DX11, but DRG is the exception.
DX12 utilizes the CPU better than DX11 in Deep Rock, you mean?
I always thought it was the particle effects bringing down the frame rate on my end (thinking I was GPU-bound, thus I never bothered to even try DX12).
Never thought changing a graphics API would change the way it utilizes the CPU too.
No wonder, if you have an RTX 3080 I would expect nothing less, but not everyone has 2000+ euros lying about for a GPU these days...
Did you try playing a couple of missions to let DX12 compile all the shaders, and did that smooth out the spikes? DX12 was a mess with stuttering all over the place when I swapped to it at first too, but after playing the same biome a couple of times the stutters/spikes went away 99% on my end.
DX12 provides the most benefit in CPU-bound scenarios, and DRG happens to be more CPU-intensive than GPU-intensive, so even on a system where the CPU is stronger than the GPU, DRG will still probably be CPU-limited.
It's still not perfect though: DX12 causes shader compilation stuttering after game or driver updates, and some hardware still just doesn't get along with DX12 well.
My dream scenario is for them to one day implement Vulkan as well, which should perform even better.
Vulkan implementation would be amazing yeah.
And couldn't the developers implement some kind of shader precache system, where it loads all the shaders up the first time the game starts? I remember some older titles having that too, after major updates/driver updates.
Presumably they could, yeah. It's what you're supposed to do for a proper DX12 implementation (there's a rough sketch of the idea below), but I've never looked around to see if GSG have commented on why the game doesn't do this.
Fun fact though, if you play DRG on Linux it does actually precache shaders because Proton handles it, resulting in no DX12 stuttering.
https://www.youtube.com/watch?v=_DVFYHIW58w
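For the curious, here's a minimal sketch of the "compile everything up front and cache it" idea, with hypothetical shader names and paths; this is not GSG's actual pipeline. One caveat: in DX12 the expensive step is usually the driver turning shader bytecode into GPU-specific code when pipeline state objects are created, so a real implementation would pre-create and persist every PSO (e.g. via ID3D12PipelineLibrary). The sketch just shows the general precache pattern using the HLSL compiler:

```cpp
// Minimal sketch of a startup shader precache pass, assuming hypothetical
// HLSL files and entry points. Compile once, write bytecode to disk, and
// later runs load the cache instead of compiling mid-mission.
#include <d3dcompiler.h>   // MSVC: link with d3dcompiler.lib
#include <wrl/client.h>
#include <filesystem>
#include <cstdio>
#pragma comment(lib, "d3dcompiler.lib")

using Microsoft::WRL::ComPtr;

// Compile one HLSL entry point and persist the bytecode; the compile work
// done here is exactly what causes the in-game hitching when deferred.
static bool PrecacheShader(const wchar_t* sourceFile, const char* entryPoint,
                           const char* target, const char* cachePath)
{
    ComPtr<ID3DBlob> bytecode, errors;
    HRESULT hr = D3DCompileFromFile(sourceFile, nullptr,
                                    D3D_COMPILE_STANDARD_FILE_INCLUDE,
                                    entryPoint, target,
                                    0, 0, &bytecode, &errors);
    if (FAILED(hr)) {
        if (errors)
            std::fprintf(stderr, "%s\n",
                         static_cast<const char*>(errors->GetBufferPointer()));
        return false;
    }
    FILE* out = std::fopen(cachePath, "wb");
    if (!out) return false;
    std::fwrite(bytecode->GetBufferPointer(), 1,
                bytecode->GetBufferSize(), out);
    std::fclose(out);
    return true;
}

int main()
{
    std::filesystem::create_directories("shadercache");
    // Hypothetical shader list; a real game would walk every
    // material/permutation it ships, once per game or driver update.
    PrecacheShader(L"terrain.hlsl", "VSMain", "vs_5_0", "shadercache/terrain.vs.cso");
    PrecacheShader(L"terrain.hlsl", "PSMain", "ps_5_0", "shadercache/terrain.ps.cso");
}
```

On later runs the game would load the cached bytecode instead of recompiling, which is essentially what Proton's precaching achieves for you on Linux.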
More draw calls with less CPU load means more stuff on screen with fewer bottlenecks.
And everything Amble said, etc...
My game started stuttering pretty badly after the latest update. I'm talking drops as low as 40 FPS on my rig with low-medium settings, whereas before I could easily average more than 70 FPS with all settings on High.
Well, I decided to give DXVK a shot (it translates DirectX to Vulkan), replacing the game's own DX DLLs in the install folder (see the sketch below). BOY did things improve tremendously!
The rig now runs at >80 FPS nonstop with ULTRA settings, and AMD FidelityFX upscaling, which previously crashed the game, now works just fine!
The only minor inconvenience is that you need to build up a Shader Cache, meaning the first time you witness any kind of new asset or special effect, you get these tiny, brief stutters. After that, though, things are buttery smooth!
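If anyone wants to script the swap, here's a minimal sketch of the drop-in, assuming hypothetical paths for both DXVK and the game install. DXVK's x64 build ships d3d11.dll and dxgi.dll, and the game has to be running in DX11 mode for them to take effect, since plain DXVK doesn't translate DX12 (that's vkd3d-proton's job):

```cpp
// Minimal sketch: copy DXVK's D3D11 DLLs next to the game binary so the
// game's D3D11 calls go through Vulkan. You can of course do the same two
// copies by hand in Explorer; this is the identical operation.
#include <filesystem>
#include <iostream>

int main()
{
    namespace fs = std::filesystem;
    // Hypothetical paths -- adjust to where you extracted the DXVK release
    // and where your game's Win64 executable actually lives.
    const fs::path dxvk = "C:/Tools/dxvk/x64";
    const fs::path game = "C:/Program Files (x86)/Steam/steamapps/common/"
                          "Deep Rock Galactic/FSD/Binaries/Win64";
    for (const char* dll : {"d3d11.dll", "dxgi.dll"}) {
        std::error_code ec;
        fs::copy_file(dxvk / dll, game / dll,
                      fs::copy_options::overwrite_existing, ec);
        if (ec) {
            std::cerr << dll << ": " << ec.message() << "\n";
            return 1;
        }
        std::cout << "installed " << dll << "\n";
    }
    return 0;
}
```

Keep copies of anything you overwrite, or use Steam's "Verify integrity of game files" to restore the originals if something breaks.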
Either that, or you couldn't get past the shader compilation.