Age of Wonders 4

Can I run the game with this CPU? Intel(R) Core(TM) i5-4460 CPU @ 3.20GHz
Hello! I love Age of Wonders, but my computer is a bit old...

My CPU is an Intel(R) Core(TM) i5-4460 CPU @ 3.20GHz.

Can I run it with minimal settings, or will it crash even at minimal settings???

Thanks!!!

Edit: Graphic Card is a GeForce GTX 960
Last edited by 🐇 DriadiTa 🐇; May 2, 2023 @ 4:33am
Originally posted by Illusioneer:
Originally posted by sandman25dcsss:
Originally posted by 🐇 DriadiTa 🐇:


Hehe thanks, I hope I can play without a 10min loading screen or something like that :P
It depends mostly on hard drive and RAM, not CPU. I am going to put the game on SSD to speed it up.

Huh? That's not how it goes.

Whether you've got the game on an SSD or HDD will realistically only affect load times when launching the game and loading a save. The bottleneck there is how fast information can be read and transferred to your RAM and VRAM.

How much RAM you've got isn't that important so long as you have enough (and even if you don't have enough, the OS can page to your HDD/SSD if really needed, although that process is far too slow to be useful in games). What's more important is how fast information gets moved to and from your RAM, and that is usually bottlenecked by your CPU, which needs to calculate and sort said information.



Originally posted by 🐇 DriadiTa 🐇:
Hello! I love Age of Wonders, but my computer is a bit old...

My CPU is an Intel(R) Core(TM) i5-4460 CPU @ 3.20GHz.

Can I run it with minimal settings, or will it crash even at minimal settings???

Thanks!!!

Edit: Graphic Card is a GeForce GTX 960


From checking comparative benchmarks, I'd say you'll probably be able to run it, but you may have some FPS trouble that can't be fixed just by lowering the graphics. There may be a lot of CPU-heavy settings that you can disable, but that depends on what the devs have implemented in the options menu.
Showing 16-25 of 25 comments
Illusioneer May 2, 2023 @ 5:32am 
Originally posted by sandman25dcsss:
Originally posted by Illusioneer:

I'm 90% confident in what I stated.

Games tend to calculate A LOT of information at once, and it's not at all uncommon for modern games to cap out older or even current CPUs.

When a CPU gets capped, Windows does not suddenly stop functioning correctly; everything just becomes slower, as information is queued for processing rather than being done all at once.

I'm a 3d artist and am well versed in what happens when either your RAM or CPU gets capped which happens more often to me than I'd like to admit.
The minimum CPU on the Steam shop page is from 2011.
I am a software developer with a corresponding higher education, so I know a bit about this too.

I can then understand why you think this way. Code processing alone very rarely caps a CPU, so this is likely a case of not having seen it first hand in your work. 3d simulations, however, can very quickly bring even a top-of-the-line PC to its knees, and one of the first bits of hardware that tends to cap is indeed the CPU.

I can go into the reasons for this but it would be massively off-topic.
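As a rough sketch of the queueing behaviour I mean (toy Python, not anything from the game or Windows itself): when there are more work items than workers, the surplus simply waits its turn, so everything still finishes, just slower.

```python
# Minimal sketch of why a saturated CPU slows things down instead of
# crashing: with only a few workers ("cores"), extra tasks wait in a
# queue until a worker frees up. Hypothetical illustration only.
import time
from concurrent.futures import ThreadPoolExecutor

def busy_task(n):
    # Stand-in for one chunk of per-frame game work.
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:  # pretend we have 2 cores
    # Submit far more tasks than workers; the surplus is queued, not dropped.
    results = list(pool.map(busy_task, [100_000] * 8))
elapsed = time.perf_counter() - start

print(len(results))  # → 8 (every queued task still completes)
```

Same idea as a game frame budget: nothing errors out, the frame just takes longer.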
Last edited by Illusioneer; May 2, 2023 @ 5:32am
sandman25dcsss May 2, 2023 @ 5:34am 
Originally posted by Illusioneer:
Originally posted by sandman25dcsss:
The minimum CPU on the Steam shop page is from 2011.
I am a software developer with a corresponding higher education, so I know a bit about this too.

I can then understand why you think this way. Code processing alone very rarely caps a CPU, so this is likely a case of not having seen it first hand in your work. 3d simulations, however, can very quickly bring even a top-of-the-line PC to its knees.

I can go into the reasons for this but it would be massively off-topic.
My understanding is that most of that work is done by the GPU, not the CPU.
I have experience running AoW3 on a laptop with an excellent CPU but integrated graphics, and then on a PC with a similar CPU but a dedicated GPU. The difference was huge.
Last edited by sandman25dcsss; May 2, 2023 @ 5:36am
I'm learning a lot from your posts lol
Big Chewy™ May 2, 2023 @ 5:40am 
Originally posted by 🐇 DriadiTa 🐇:
Hello! I love Age of Wonders , but my computer is a bit old...

My CPU is an Intel(R) Core(TM) i5-4460 CPU @ 3.20GHz.

Can I run it with minimal settings, or will it crash even at minimal settings???

Thanks!!!

Edit: Graphic Card is a GeForce GTX 960

without breaking a sweat
Illusioneer May 2, 2023 @ 5:44am 
Originally posted by sandman25dcsss:
Originally posted by Illusioneer:

I can then understand why you think this way. Code processing alone very rarely caps a CPU, so this is likely a case of not having seen it first hand in your work. 3d simulations, however, can very quickly bring even a top-of-the-line PC to its knees.

I can go into the reasons for this but it would be massively off-topic.
My understanding is that most of that work is done by the GPU, not the CPU.

The GPU does all the heavy lifting of working out how things are displayed on the monitor. In the case of real-time games, the GPU calculates lighting, reflections, shading, etc.

The CPU, however, calculates the positions and movement of polygons (millions of them) and any other bits of animation that happen. This is what tends to drag on a CPU during gameplay. The reason is that the CPU doesn't describe only the beginning point and the end point; it has to calculate the positions of all the in-between frames based on a fixed time step, which, as you should know, introduces a lot of complexity into the equation even at a front-end code level.

The CPU can also be used to calculate certain graphical elements that are ordinarily done by the GPU, to balance the processing load between the two units; this all depends on how the devs made their game engine. Most non-real-time renderers don't actually use the GPU beyond just displaying things on the monitor. The CPU can handle more complex and varied calculations than a GPU, so it has traditionally been preferred for things such as tracing light rays. Most Disney films are rendered on CPUs, not GPUs.

Additional Edit:
Whether a game needs a better CPU or a better GPU depends on the content of the game. Shooters like CoD rely on a good GPU rather than a good CPU because, while a lot of post-process rendering is being done, not a lot of stuff is moving around and changing.

Strategy games like Age of Wonders, where there are dozens of animated models in the scene at once, are a different story.

Additional Edit in case of curiosity:
In the case of real-time video games:

The role of VRAM on a GPU is to store texture information (in the form of images) that gets called on by the GPU as the player turns to face the objects those textures are applied to.

The role of RAM is to store variable information such as the locations of things, numbers, lots of numbers, and in general the "right now" state of things. In the case of non-real-time rendering engines, this also includes texture information, which is why 3d artists need a lot of RAM.
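To make the in-between-frame work above concrete, here's a toy sketch (hypothetical Python, not engine code): given a start and end keyframe for every vertex, the CPU has to compute every vertex position at every intermediate frame on a fixed time step.

```python
# Toy sketch of the CPU-side work described above: for each in-between
# frame, the CPU computes an interpolated position for every vertex.
# Hypothetical illustration only; real engines do far more than lerp.

def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_frame(start_verts, end_verts, frame, total_frames):
    """Positions of all vertices at a given in-between frame."""
    t = frame / total_frames
    return [
        tuple(lerp(s, e, t) for s, e in zip(sv, ev))
        for sv, ev in zip(start_verts, end_verts)
    ]

# Two keyframes for a tiny 2-vertex "mesh"; real scenes have millions.
start = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
end   = [(0.0, 2.0, 0.0), (1.0, 2.0, 0.0)]

# A 1-second animation at 60 fps means 60 of these passes per mesh.
halfway = interpolate_frame(start, end, 30, 60)
print(halfway)  # → [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
```

Scale that inner loop up to millions of vertices across dozens of animated models per frame, and you can see where the CPU time goes.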
Last edited by Illusioneer; May 2, 2023 @ 5:55am
sandman25dcsss May 2, 2023 @ 5:53am 
Originally posted by Illusioneer:
Originally posted by sandman25dcsss:
My understanding is that most of that work is done by GPU, not CPU.

The GPU does all the heavy lifting of working out how things are displayed on the monitor. In the case of real-time games, the GPU calculates lighting, reflections, shading, etc.

The CPU, however, calculates the positions and movement of polygons (millions of them) and any other bits of animation that happen. This is what tends to drag on a CPU during gameplay. The reason is that the CPU doesn't describe only the beginning point and the end point; it has to calculate the positions of all the in-between frames based on a fixed time step, which, as you should know, introduces a lot of complexity into the equation even at a front-end code level.

The CPU can also be used to calculate certain graphical elements that are ordinarily done by the GPU, to balance the processing load between the two units; this all depends on how the devs made their game engine. Most non-real-time renderers don't actually use the GPU beyond just displaying things on the monitor. The CPU can handle more complex and varied calculations than a GPU, so it has traditionally been preferred for things such as tracing light rays. Most Disney films are rendered on CPUs, not GPUs.
Yet here we are talking about a game which uses the GPU for sure. I think you underestimate the power of modern CPUs. I worked on CPU-intensive projects 25+ years ago, drawing images without any GPU, and even then it was almost instant.
Illusioneer May 2, 2023 @ 6:02am 
Originally posted by sandman25dcsss:
Originally posted by Illusioneer:

The GPU does all the heavy lifting of working out how things are displayed on the monitor. In the case of real-time games, the GPU calculates lighting, reflections, shading, etc.

The CPU, however, calculates the positions and movement of polygons (millions of them) and any other bits of animation that happen. This is what tends to drag on a CPU during gameplay. The reason is that the CPU doesn't describe only the beginning point and the end point; it has to calculate the positions of all the in-between frames based on a fixed time step, which, as you should know, introduces a lot of complexity into the equation even at a front-end code level.

The CPU can also be used to calculate certain graphical elements that are ordinarily done by the GPU, to balance the processing load between the two units; this all depends on how the devs made their game engine. Most non-real-time renderers don't actually use the GPU beyond just displaying things on the monitor. The CPU can handle more complex and varied calculations than a GPU, so it has traditionally been preferred for things such as tracing light rays. Most Disney films are rendered on CPUs, not GPUs.
Yet here we are talking about a game which uses the GPU for sure. I think you underestimate the power of modern CPUs. I worked on CPU-intensive projects 25+ years ago, drawing images without any GPU, and even then it was almost instant.

I assure you I am not underestimating the power of CPUs. While working, I run monitoring software in the background to measure my RAM and CPU usage, so I know when to rein things in so as not to slow my PC down (or crash it).

I know that my i7 caps very often when working with animations and physics simulations.

I know that my 32 GB of RAM caps very often when I bake said simulations or animations, or have too many large texture files being previewed.

I also know my CPU caps when playing SQUAD on the highest settings.

You can measure this yourself as well; I don't know how accurate it is, but Windows has a built-in Resource Monitor.

In terms of AoW4, yeah, of course it's making heavy use of people's GPUs, but it's completely possible that it's hogging a lot of CPU power as well. The devs would not have set the requirement as such without reason. If the game could run on a fridge, why are the devs listing an i7 as the minimum?

EDIT:

Assigning some colors to a few pixels is one thing; calculating thousands of frames' worth of animation applied to millions of polygons in real time, all at once, is a very different ball game.
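Some back-of-envelope numbers for that comparison (hypothetical round figures, only meant to show the difference in magnitude):

```python
# Back-of-envelope scale comparison for the point above. All figures are
# hypothetical round numbers, just to illustrate the gap in magnitude.

pixels_640x480 = 640 * 480          # one old-school frame buffer
print(pixels_640x480)               # → 307200 pixel writes

polygons = 1_000_000                # "millions of polygons"
fps = 60                            # animation updates per second
per_frame_positions = polygons * fps
print(per_frame_positions)          # → 60000000 position updates per second

# Roughly two hundred times more work items per second than filling
# that whole old frame buffer once.
print(per_frame_positions // pixels_640x480)  # → 195
```

Even as crude arithmetic, it shows why "drawing images was almost instant 25 years ago" doesn't translate to modern real-time animation workloads.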
Last edited by Illusioneer; May 2, 2023 @ 6:08am
sandman25dcsss May 2, 2023 @ 6:09am 
Devs say processor generation is more important than whether it is an i5 or i7, but you basically brought me some new info, so maybe I am wrong indeed. Thank you for the discussion and detailed description, it was quite useful for me.
Last edited by sandman25dcsss; May 2, 2023 @ 6:11am
Illusioneer May 2, 2023 @ 6:15am 
Originally posted by sandman25dcsss:
Devs say processor generation is more important than whether it is an i5 or i7, but you basically brought me some new info, so maybe I am wrong indeed. Thank you for the discussion and detailed description, it was quite useful for me.

I wouldn't say you're wrong, just misinformed due to a lack of understanding of how taxing 3d graphics really are. But I wouldn't expect a programmer to know the ins and outs of 3d rendering, in much the same way as I hope a programmer wouldn't expect me to know the ins and outs of coding.

I dunno if you'd be interested, but I can give a demonstration of exactly what I'm saying over a Discord call if you'd like.
sandman25dcsss May 2, 2023 @ 6:18am 
Thank you for the kind words and the offer, but I feel I have already made you spend too much time on me ;)
I think I understand what you are talking about; I didn't expect the CPU to be used that much for 3d rendering.

Date Posted: May 2, 2023 @ 4:25am
Posts: 25