The only reason it wouldn't be is one of the cases below. If you wanted it lower, you could change your graphics settings to low so the card doesn't have to work as hard.
But that's not recommended unless you're having FPS issues.
Either the game isn't demanding enough,
or there's a CPU or some other kind of bottleneck.
If your CPU usage is very high, that's a probable cause.
But as previously stated, it's normal for a GPU's usage to be 99/100% in demanding games.
GPU usage on its own is a fairly useless measurement and will usually point towards 90%+ in any game anyway. That's because the "100%" means "what is possible at the current power state", and the GPU changes its power state in accordance with how much is actually required of it.
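If you want to watch that happening on an NVIDIA card, here's a minimal sketch, assuming Python and nvidia-smi are on the PATH (an AMD card such as a 6950 XT needs different tooling). It just polls the performance state, clocks, utilization and power draw once a second:

```python
# Minimal sketch (assumes an NVIDIA GPU and nvidia-smi on the PATH).
# Polls performance state, clocks, utilization and power draw once per second,
# so you can see how "99% usage" relates to the current clock/power state.
import subprocess
import time

FIELDS = "pstate,clocks.current.graphics,clocks.max.graphics,utilization.gpu,power.draw"

while True:
    out = subprocess.run(
        ["nvidia-smi", "-i", "0", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # Example output: "P2, 1350 MHz, 2100 MHz, 99 %, 140.00 W"
    # The "99 %" is 99% of what the current state allows,
    # not 99% of the card's absolute maximum.
    print(out)
    time.sleep(1)
```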
But this game can max out a 2080 Ti, and my current 6950 XT, which is by all means a card that can play many games at 4K 120.
Here, do this experiment: use MSI Afterburner to limit your card to around 60% power usage. Somehow Middle-earth will still run fine, even though it now uses 60% of the power it used before.
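If you'd rather script that than click through Afterburner, a rough equivalent on an NVIDIA card looks like the sketch below (the 60% figure and GPU index 0 are just examples; setting the limit needs admin rights, and nvidia-smi clamps you to the card's allowed range):

```python
# Rough equivalent of the Afterburner experiment for an NVIDIA card.
# Reads the default power limit, then lowers the limit to ~60% of it.
# Run as admin/root; the values are illustrative, not a recommendation.
import subprocess

def query(field: str) -> float:
    out = subprocess.run(
        ["nvidia-smi", "-i", "0", f"--query-gpu={field}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return float(out)

default_limit = query("power.default_limit")  # e.g. 260.0 W on a 2080 Ti
target = round(default_limit * 0.6)           # ~60% power target

# Apply the new limit (it resets on reboot, or set it back manually).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(target)], check=True)
print(f"Power limit set to {target} W (default was {default_limit:.0f} W)")
```

Then launch the game and compare frametimes; to undo it, set the limit back to the default value.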
I think you should do more research on how things actually work
There is nothing wrong with a GPU having high power draw in one game but not in another; that isn't how things work.
USAGE is exactly that: it indicates whether the GPU is being properly utilized or not. Things that can affect it are a CPU bottleneck, VSync, graphics settings, and how powerful the GPU is as a whole (which you basically mentioned) compared to how graphically demanding the game is.
The fact of the matter is, SOW is more graphically demanding than The Dwarves, especially with the High Res pack installed.
All in all, I think you are seeing issues in how hardware is utilized from one game to another without fully understanding the differences. There is nothing wrong with what you're describing.
And considering the max power usage of a 2080 Ti is 260 W, it's not uncommon for it to be able to draw around 300 W depending on the power target, any overclocking, AND the game/benchmark used.
If you had done a search on a 2080 Ti's power usage, you would have seen that 300 W, again, is not uncommon.
There is absolutely nothing wrong with the game; you are simply not understanding the results you are seeing.
The Dwarves absolutely MELTED my 2080 Ti for no discernible reason, and that is why I compared it to this game: it did the same thing.
The idea was: How do games that have such wildly different graphical requirements both bring the GPU to its limit?
Utilization is one thing, but utilization also seems to depend on the power state the GPU is in, that is, on how much performance is actually available. But aside from that, utilization isn't really what I was having a problem with. It's power usage only.
Furthermore, I was able to make the Middle-earth game look identical, with identical frame timings, just by limiting the TDP of the GPU. So that additional power was being used by the engine for no discernible reason.
That's why I want to formulate a tip: if you have a very high-end GPU from the 2020s, try limiting the power draw of the GPU in MSI Afterburner and see if you still get identical performance.
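If you want numbers rather than eyeballing an overlay, a small helper like the one below (a hypothetical sketch, NVIDIA-only via nvidia-smi; pair it with whatever FPS/frametime overlay you already use, e.g. Afterburner/RTSS) lets you compare the average power draw of a run at the stock limit against a run with the limit lowered:

```python
# Hypothetical helper: average GPU power draw over a sampling window, so a run
# at the stock power limit can be compared against a run with a lowered limit.
# NVIDIA-only (nvidia-smi); use your usual overlay for FPS/frametime numbers.
import subprocess
import time

def average_power_draw(seconds: int = 60, interval: float = 1.0) -> float:
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "-i", "0",
             "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        samples.append(float(out))
        time.sleep(interval)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average draw over the window: {average_power_draw():.1f} W")
```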
And yes, I completely get that a 2080 Ti has a 260 W TDP. I was just surprised that this game would be able to completely use its available power, hell, that Shadow of Mordor, the predecessor, would be able to. I have no idea how this (proprietary) engine works, but it seems to do some calculations that behave almost like a power virus. Maybe they run async shaders decoupled from the framerate at the maximum performance possible, or something.
I've just installed Shadow of War again, but to be honest, I actually haven't experienced the same thing this time. It behaves, using 130 W at 4K with everything on ultra. By the way, I always limit the FPS in these kinds of cinematic games, because with current energy prices (and how well my tower manages to heat my room) it makes no sense to me to go all out here. Limited to 120 FPS (my TV's maximum refresh rate), the game uses around 260 W on my 6950 XT, which can max out at over 300 W. But limited to 60 FPS, it drops to around 120 W.
60 FPS is more than enough for a cinematic game like this, IMO. Also, I'm not sweltering.
I also know at least one more person who has observed the same thing; in their case, they noticed their GPU had coil whine in this game, which is what pointed them to the power usage. Limiting the GPU's max power didn't lose them any FPS either (neither average nor 1% lows, frametimes still the same).
Games which run in borderless windowed mode run at the same resolution and refresh rate as the desktop, meaning they get an implicit VSync and are capped at the refresh rate, instead of being uncapped by default the way exclusive fullscreen mode usually works.
Correct. There is rarely any benefit to running the GPU at a framerate greater than your monitor's refresh rate.