Specs: RTX 3060, AMD Ryzen 7 5800H @ 3.2 GHz
Well, this is why I asked the question in the first place; I wasn't sure about that. Thank you, guys.
Wouldn't near-100% GPU utilization just mean there's no reserve left when a more demanding scene needs to be rendered? If you walk around at 95% and then a battle starts with more characters and effects, you're going to get frame drops. If the GPU chills at 70%, there's still enough capacity.
Of course, you don't want a CPU bottleneck where the GPU sits at 70% but you can't reach 60 fps because the CPU is at its limit. But how would >90% GPU utilization most of the time be a desirable metric? It just means that for the next releases you'll soon have to dial down the settings to keep your GPU from capping out.
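To put rough numbers on the headroom argument, here's a back-of-the-envelope sketch. The 60 fps target, the +30% load spike, and the assumption that utilization maps roughly linearly onto per-frame render time are all mine for illustration, not measured from any game or profiler:

```python
# Back-of-the-envelope headroom math (illustrative numbers, not profiler data):
# treat GPU utilization as the fraction of the frame budget the GPU currently
# spends rendering, then see what happens when a heavier scene needs more work.

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def fps_after_load_spike(utilization: float, extra_load: float) -> float:
    """FPS once a scene needs `extra_load` (e.g. 0.30 = +30%) more GPU work."""
    render_time_ms = FRAME_BUDGET_MS * utilization      # per-frame GPU time now
    spiked_time_ms = render_time_ms * (1 + extra_load)  # time needed in the heavy scene
    # If the spiked render time still fits the frame budget, you hold 60 fps;
    # otherwise the frame rate drops to whatever the GPU can still deliver.
    return min(TARGET_FPS, 1000 / spiked_time_ms)

for util in (0.70, 0.95):
    print(f"{util:.0%} utilization, +30% load -> {fps_after_load_spike(util, 0.30):.0f} fps")
# 70% utilization, +30% load -> 60 fps   (spike absorbed by headroom)
# 95% utilization, +30% load -> 49 fps   (frame drops)
```

Under these assumptions, 70% utilization absorbs a 30% load spike with room to spare, while 95% utilization turns the same spike into a drop to roughly 49 fps.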
To be clear, I didn't mean that a GPU uses its spare resources to pre-render things you might need later.