I can then understand why you think this way. Code processing alone very rarely caps a CPU, so this is likely a case of not having seen it first-hand in your work. 3D simulations, however, can very quickly bring even a top-of-the-line PC to its knees, and one of the first pieces of hardware that tends to cap is indeed the CPU.
I can go into the reasons for this but it would be massively off-topic.
I have experience running AoW3 on a laptop with an excellent CPU but an integrated video card, and then on a PC with a similar CPU but a dedicated GPU. The difference was huge.
without breaking a sweat
The GPU does all the heavy lifting of working out how things are to be displayed on the monitor. In the case of real-time games, the GPU calculates lighting, reflections, shading, etc.
The CPU, however, calculates the positions and movement of polygons (millions of them) and any other animation that happens. This is what tends to drag on a CPU during gameplay. The reason is that the CPU doesn't describe only the beginning point and the end point; it has to calculate the position of every in-between frame based on a fixed timestep, which, as you should know, introduces a lot of complexity into the equation even at a front-end code level.
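To make that concrete, here is a minimal, illustrative sketch in plain Python (nothing from any real engine; the entity count and timestep are made up) of the per-frame CPU work: advancing every moving entity by one fixed timestep. Scale the count up to millions and add skinning, physics and collision, and this kind of loop is real CPU load:

```python
import time

def step_entities(positions, velocities, dt):
    """Advance every entity by one fixed timestep on the CPU.
    With millions of entities, this per-frame loop is what loads the CPU."""
    return [(x + vx * dt, y + vy * dt, z + vz * dt)
            for (x, y, z), (vx, vy, vz) in zip(positions, velocities)]

# hypothetical scene: 100k moving points, 60 updates per second
N, dt = 100_000, 1 / 60
positions = [(0.0, 0.0, 0.0)] * N
velocities = [(1.0, 0.0, 0.0)] * N

start = time.perf_counter()
positions = step_entities(positions, velocities, dt)
elapsed = time.perf_counter() - start
# even this toy update has measurable cost, and a real engine
# runs far heavier math than one add-multiply per coordinate
```

And that's before any rendering happens; the GPU never sees this work.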
The CPU can also be used to calculate certain graphical elements that are ordinarily handled by the GPU, to balance the processing load between the two units; this all depends on how the devs built their game engine. Most non-real-time renderers don't actually use the GPU beyond displaying things on the monitor. The CPU is more flexible at complex, branching calculations than a GPU, which is why it has traditionally been preferred for things such as tracing light rays. Most Disney films are rendered on a CPU, not a GPU.
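As an illustration of the kind of math an offline renderer runs on the CPU, here's a toy ray-sphere intersection test in Python. The names and numbers are made up for the example; a real renderer runs billions of such tests per image:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Solve |o + t*d - c|^2 = r^2 for t.
    Returns the nearest positive hit distance, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# ray from the origin along +z toward a unit sphere centered at (0, 0, 5)
t = ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# hits the near surface at t = 4
```

The early-out branching (miss vs. hit, shadow rays, bounces) is exactly the kind of control flow CPUs handle comfortably.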
Additional Edit:
Whether a game needs a better CPU or a better GPU depends on the content of the game. Shooters like CoD rely on a good GPU rather than a CPU, because while a lot of post-process rendering is being done, not a lot of stuff is moving around and changing.
Strategy games like Age of Wonders, where there are dozens of animated models in the scene at once, are a different story.
Additional Edit in case of curiosity:
In the case of real-time video games:
The role of VRAM on a GPU is to store texture information (in the form of images) that gets called on by the GPU as the player turns to face an object that has those textures applied to it.
The role of RAM is to store variable information such as the locations of things, numbers (lots of numbers), and in general the "right now" state of things. In the case of non-real-time rendering engines, this also includes texture information, which is why 3D artists need a lot of RAM.
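A loose sketch of that split, with hypothetical names: the per-frame mutable state is small and lives in system RAM, while even a single uncompressed texture is tens of megabytes, which is what fills VRAM (or RAM, for offline renderers):

```python
from dataclasses import dataclass

@dataclass
class EntityState:
    """Lives in system RAM: the 'right now' state, updated every frame."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class TextureAsset:
    """Uploaded once to VRAM, then sampled by the GPU during rendering."""
    name: str
    width: int
    height: int
    bytes_per_pixel: int = 4  # uncompressed RGBA

    def vram_bytes(self) -> int:
        return self.width * self.height * self.bytes_per_pixel

tex = TextureAsset("rock_albedo", 4096, 4096)
# a single uncompressed 4K RGBA texture is 64 MiB on its own
assert tex.vram_bytes() == 64 * 1024 * 1024
```

A scene with a few dozen such textures eats gigabytes, which is why both VRAM and (for offline work) RAM fill up so fast.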
I assure you I am not underestimating the power of CPUs. While working, I run monitoring software in the background to measure my RAM and CPU usage, so I know when to throttle things back so as not to slow my PC down (or crash it).
I know that my i7 caps very often when working with animations and physics simulations.
I know that my 32 GB of RAM caps very often when I bake said simulations or animations, or have too many large texture files being previewed.
I also know my CPU caps when playing SQUAD on the highest settings.
You can measure this yourself as well; I don't know how accurate it is, but Windows has a built-in resource monitor.
In terms of AoW4, yeah, of course it's making heavy use of people's GPUs, but it's entirely possible that it's hogging a lot of CPU power as well. The devs would not have set the requirement as such without reason. If the game could run on a fridge, why would the devs list an i7 as the minimum?
EDIT:
Assigning some colors to a few pixels is one thing; calculating thousands of frames' worth of animations applied to millions of polygons, all conforming to real time at once, is a very different ball game.
I wouldn't say you're wrong, just misinformed due to a lack of understanding of how taxing 3D graphics really are. But I wouldn't expect a programmer to know the ins and outs of 3D rendering, in much the same way as I'd hope a programmer wouldn't expect me to know the ins and outs of coding.
I dunno if you'd be interested in this, but I can give a demonstration of exactly what I'm saying if you'd like through a discord call.
I think I understand what you are talking about; I didn't expect the CPU to be used that much for 3D rendering.