It takes a lot of resources to simulate and generate the cities.
Also, it's Early Access.
Correct me if I'm wrong: I thought GPU usage is mainly determined by what is currently on the screen, while CPU usage is determined by the things you're describing: generating levels, things happening/simulating in the background, etc.
Edit: typo
That was true in 2010, but GPUs have had CUDA cores and tensor cores for many years now; they no longer serve a purely graphical purpose. It's faster for the GPU than the CPU to simulate some aspects of the AI (Stable Diffusion uses the GPU's VRAM and CUDA cores, for example). For a game that needs to manage a city with hundreds of NPCs in real time, the GPU is the new king.
To be honest, I'm quite sure the CPU could not handle as many parameters as the GPU.
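To make the CPU-vs-GPU gap concrete, here's a minimal sketch that times the same large matrix multiply on both. It assumes PyTorch is installed and a CUDA-capable card is present, and the matrix size is arbitrary; it's a generic compute demo, not anything from this game:

```python
# Minimal sketch: time the same large matrix multiply on CPU vs GPU.
# Assumes PyTorch is installed and a CUDA-capable card is present;
# the matrix size is arbitrary, just big enough to show the gap.
import time
import torch

n = 4096
a = torch.rand(n, n)
b = torch.rand(n, n)

start = time.perf_counter()
a @ b
print(f"CPU: {time.perf_counter() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
    torch.cuda.synchronize()              # finish the transfers before timing
    start = time.perf_counter()
    a_gpu @ b_gpu
    torch.cuda.synchronize()              # wait for the kernel to really finish
    print(f"GPU: {time.perf_counter() - start:.3f}s")
```

On typical hardware the GPU side usually comes back one to two orders of magnitude faster, which is exactly why workloads like Stable Diffusion live on the GPU.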
The most likely reason is that, while you're capped at 60, you probably don't even reach 60 most of the time, or only barely. The game just isn't optimized yet.
Also, the fans and the temperature don't really matter. Your components are built to run at 100% load for two decades without failing, and if you did your due diligence when building the PC, you should never thermal throttle below what's on the spec sheet. Unless you have a laptop, in which case laptop manufacturers happily lie about it.
This game uses raycast lighting and tracks several hundred NPCs at all times, including their current action, held items, and pathing. It's going to be an inherently hard game to run, especially on older cards with poor support for raycasting.
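Just to illustrate, the per-NPC state described above might look something like this sketch; the class and field names are my guesses for illustration, not anything from the game's actual code:

```python
# Hypothetical sketch of the per-NPC state a game like this must update
# every tick. All names are invented for illustration (Python 3.9+).
from dataclasses import dataclass, field

@dataclass
class NPC:
    npc_id: int
    current_action: str = "idle"                       # e.g. "walk", "work", "sleep"
    held_items: list[str] = field(default_factory=list)
    position: tuple[float, float] = (0.0, 0.0)
    path: list[tuple[float, float]] = field(default_factory=list)  # pathfinding waypoints

    def tick(self, dt: float) -> None:
        """Advance one simulation step: move to the next waypoint, if any."""
        if self.path:
            self.position = self.path.pop(0)

# Several hundred of these get updated every single frame:
npcs = [NPC(npc_id=i) for i in range(500)]
for npc in npcs:
    npc.tick(1 / 60)
```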
A CPU is very good at juggling several different tasks at once, while a GPU is generally the opposite: built for huge numbers of identical operations in parallel.
Civs and their days, I imagine, are simulated on the CPU, whereas all of the graphics and rendering happen on the GPU. Both are bound to be VERY taxing in this game. But the CPU would be completely inefficient at what the GPU does, and vice versa.
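Roughly, that division of labor looks like the sketch below; every function here is a placeholder I made up, not this game's actual API:

```python
# Rough sketch of the usual simulate/render split in a game loop.
# All functions are placeholders, not this game's actual code.
def simulate_civs(civs, dt):
    # CPU side: branchy, stateful per-civ logic (daily schedules, jobs, pathing).
    for civ in civs:
        civ["time_of_day"] = (civ["time_of_day"] + dt) % 24.0

def render_frame(civs):
    # GPU side: in a real engine this submits draw calls and ray batches;
    # it is a stub here, since Python isn't doing any real rendering.
    pass

civs = [{"time_of_day": 6.0} for _ in range(500)]
for frame in range(3):            # stand-in for the real main loop
    simulate_civs(civs, 1 / 60)   # CPU-bound work
    render_frame(civs)            # GPU-bound work
```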
Let me use an example to demonstrate this:
Minecraft: Java Edition is rather easy to run, even on integrated graphics; this is because the game has both a very efficient rendering and chunk loading system, AND because every texture in the game is only 16x16. This means that for every visible face of a block, the game only needs to render a single 16x16 image as many times as it needs to, and that's not hard at all for even rudimentary GPUs to handle.
(Vanilla's textures are in fact exactly 16x16.)
Most of the time, a good CPU is actually what you need in Minecraft: for most people running vanilla Minecraft, the large majority of processing time goes to chunk updates and game logic, not graphics.
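As a back-of-the-envelope sketch (illustrative only, not Minecraft's real code): a single 16x16x16 chunk section is already 4,096 blocks to walk, the per-block logic is full of branches, and that's exactly the kind of work that lands on the CPU:

```python
# Illustrative sketch of why chunk updates are CPU work, not GPU work.
# One 16x16x16 section = 4096 blocks; real logic (lighting propagation,
# random ticks, etc.) is branchy per block. Not Minecraft's actual code.
SECTION = 16

def update_section(blocks):
    updated = 0
    for x in range(SECTION):
        for y in range(SECTION):
            for z in range(SECTION):
                if blocks[x][y][z] != 0:   # pretend nonzero blocks need work
                    updated += 1
    return updated

blocks = [[[1] * SECTION for _ in range(SECTION)] for _ in range(SECTION)]
print(update_section(blocks), "blocks visited in one section")  # 4096
```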
This game is like that, needing a beefier CPU, only instead of 16x16 textures there are now hundreds of variably sized textures at all angles and locations, with slightly janky culling, all being operated on by a raycast lighting system. Real-time lighting is one of the most intense things a GPU can process, and this game uses a system that works like raytracing's little brother.
On top of that, there are thousands of assets and large amounts of civ information being streamed to/from RAM, and (presumably) several hundred processes running on the CPU, all while the game individually casts a few thousand rays of light in all directions. Every single frame.
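Here's a toy 2D version of that per-frame ray casting, just to show where the cost comes from; a real system works in 3D with far more bookkeeping, and every name and number below is made up:

```python
# Toy 2D sketch of "cast a few thousand rays per frame".
# All names and numbers are illustrative, not from the game.
import math

walls = {(x, 20) for x in range(10, 50)}     # a hypothetical wall segment

def cast_ray(ox, oy, angle, max_steps=64):
    """Step along one ray until it hits a wall cell or runs out of range."""
    dx, dy = math.cos(angle), math.sin(angle)
    x, y = ox, oy
    for _ in range(max_steps):
        x += dx
        y += dy
        if (int(x), int(y)) in walls:
            return (int(x), int(y))          # this ray's light stops here
    return None

# One light source casting 2048 rays -- and that's every single frame:
hits = sum(cast_ray(32.0, 32.0, i * 2 * math.pi / 2048) is not None
           for i in range(2048))
print(hits, "rays hit a wall")
```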
Do you know a way for me to reduce GPU usage, other than waiting for an optimization update?
Performance issues beyond that are most likely either thermal throttling (which shouldn't happen on anything except laptops) or your GPU simply not being strong enough. The latter is very likely the case, given the indev nature of the game.
As an aside, if other modern games aren't using all of your GPU, you may want to figure out why. You're not running at peak performance in those games if you're only getting 70% usage (or, god forbid, 40%)
...unless you are on a laptop, where you may actually benefit from not running at 100%, simply because of thermal issues.
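As for actually reducing GPU usage in the meantime: the main lever you have is a frame cap, whether set in-game, in the driver, or conceptually like the sketch below (do_frame() is just a placeholder I made up for one simulate-and-render pass):

```python
# Minimal frame-limiter sketch: sleeping out the leftover frame budget keeps
# the GPU idle for part of each frame instead of rendering flat out.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def do_frame():
    pass                                  # placeholder: simulate + render once

for _ in range(10):                       # stand-in for the real main loop
    start = time.perf_counter()
    do_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # idle instead of drawing more frames
```

Lowering resolution or the lighting quality (if and when the game exposes such settings) cuts per-frame GPU work the same way.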