HELLDIVERS™ 2

This topic has been locked
Zen Mar 24, 2024 @ 10:25am
Will they improve optimization? It's not great
I have a very good PC, and I have to play on almost all low settings just to barely get 60 FPS.

Also, do you guys know which settings affect the green blood/bile of the Terminids? It's super pixelated for me; it looks like it's running at 240p.
junglecock Mar 24, 2024 @ 11:39am 
14700K, 4070 Ti Super. Ultra settings in native display mode nets me 110-120 FPS.

Maybe 45% CPU utilization at 58°C?
GPU runs at 97% utilization at 70°C, but I have an SFF case, so yeah...

Seems to be working alright, no complaints here.
Bunkerotter Mar 24, 2024 @ 11:39am 
Originally posted by Roxo:
Originally posted by Eagle_of_Fire:
This game is CPU heavy. If your CPU is not up to snuff then it is what is going to bottleneck you. It is not rocket science.

And as for "better looking game" that's simply your opinion.

You do realize that saying a game is CPU heavy is the same as saying it's badly made, right?
Graphics and games run on the GPU if made correctly. If a game runs on your CPU, it's a BAD sign.

It's programming 101...
How to say you have no clue how game engines work, without saying you have no clue how game engines work.

The CPU needs to feed data to the GPU; the GPU then just renders it.

Many things just can't be done on the GPU, and some are difficult to do or require proprietary solutions. The more that is happening in a scene, the more the CPU has to calculate.

Just take Factorio as an example. It's a 2D sprite game that runs on any integrated GPU from the past 15 years, but it can absolutely max out the fastest CPUs, even ones with a lot of cores, if there is too much going on in the game.

Another example: Dwarf Fortress. The original game doesn't even have graphics, it's just text based, and it can still max out any modern CPU.

So, tell me, are those games also badly made because they are mostly using the CPU?
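To make that concrete, here's a minimal, hypothetical sketch in plain Python (not any real engine, and nothing to do with how Helldivers 2 actually works) of why a simulation-heavy game ends up CPU-bound: the per-frame work grows with the number of active entities, while handing the frame to the GPU stays cheap.

import random
import time

class Enemy:
    def __init__(self):
        self.x, self.y = random.random() * 1000, random.random() * 1000

    def update(self, px, py):
        # "AI": steer toward the player; a stand-in for pathfinding/physics work
        dx, dy = px - self.x, py - self.y
        dist = (dx * dx + dy * dy) ** 0.5 or 1.0
        self.x += dx / dist
        self.y += dy / dist

def frame(enemies):
    start = time.perf_counter()
    for e in enemies:              # CPU cost scales with entity count
        e.update(500.0, 500.0)
    cpu_ms = (time.perf_counter() - start) * 1000
    gpu_submit_ms = 0.2            # stand-in for building/submitting draw calls
    return cpu_ms + gpu_submit_ms

if __name__ == "__main__":
    for count in (500, 5000, 50000):
        enemies = [Enemy() for _ in range(count)]
        print(f"{count} enemies -> ~{frame(enemies):.1f} ms per frame")

No matter how fast the GPU is, the frame can't finish before the simulation step does.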
Dredd Mar 24, 2024 @ 11:39am 
Originally posted by LuckyStrikes:
Originally posted by Roxo:

You do realize that saying a game is CPU heavy is the same as saying it's badly made, right?
Graphics and games run on the GPU if made correctly. If a game runs on your CPU, it's a BAD sign.

It's programming 101...
This just in: if a game is made well, it doesn't use the CPU at all. All the pretty lights and shiny textures are graphics card items, so that's all that matters. Tide goes in, tide goes out; can't explain that.
Holy what. You actually just typed that and posted it.
Eagle_of_Fire Mar 24, 2024 @ 11:41am 
Originally posted by Bunkerotter:
Originally posted by Roxo:

You do realize that saying a game is CPU heavy is the same as saying it's badly made, right?
Graphics and games run on the GPU if made correctly. If a game runs on your CPU, it's a BAD sign.

It's programming 101...
How to say you have no clue how game engines work, without saying you have no clue how game engines work.

The CPU needs to feed data to the GPU; the GPU then just renders it.

Many things just can't be done on the GPU, and some are difficult to do or require proprietary solutions. The more that is happening in a scene, the more the CPU has to calculate.

Just take Factorio as an example. It's a 2D sprite game that runs on any integrated GPU from the past 15 years, but it can absolutely max out the fastest CPUs, even ones with a lot of cores, if there is too much going on in the game.

Another example: Dwarf Fortress. The original game doesn't even have graphics, it's just text based, and it can still max out any modern CPU.

So, tell me, are those games also badly made because they are mostly using the CPU?
Or, basically... Any good strategy game or any simulation game ever made.
Bunkerotter Mar 24, 2024 @ 11:45am 
Originally posted by Eagle_of_Fire:
Or, basically... Any good strategy game or any simulation game ever made.
Fair point, things like Cities: Skylines 1/2 or the Anno series.
But yeah, he obviously has no clue what he's talking about, and probably just has an older quad-core Intel CPU that is 6+ years old.

Nothing wrong with that, those are still usable CPUs, but the world has advanced since then, and game developers are always pushing boundaries.

Remember, playing a fast-paced shooter like Quake on a Pentium III with a Voodoo 3 at a maximum of 30 FPS was once considered very good.
Last edited by Bunkerotter; Mar 24, 2024 @ 11:45am
Dredd Mar 24, 2024 @ 11:47am 
Originally posted by Bunkerotter:
Originally posted by Eagle_of_Fire:
Or, basically... Any good strategy game or any simulation game ever made.
Fair point, things like Cities: Skylines 1/2 or the Anno series.
But yeah, he obviously has no clue what he's talking about, and probably just has an older quad-core Intel CPU that is 6+ years old.

Nothing wrong with that, those are still usable CPUs, but the world has advanced since then, and game developers are always pushing boundaries.

Remember, playing a fast-paced shooter like Quake on a Pentium III with a Voodoo 3 at a maximum of 30 FPS was once considered very good.
Obligatory "don't forget that 30fps looks a lot better on a CRT" reporting in
Eagle_of_Fire Mar 24, 2024 @ 11:49am 
Originally posted by thedreddnought:
Originally posted by Bunkerotter:
Fair point, things like Cities: Skylines 1/2 or the Anno series.
But yeah, he obviously has no clue what he's talking about, and probably just has an older quad-core Intel CPU that is 6+ years old.

Nothing wrong with that, those are still usable CPUs, but the world has advanced since then, and game developers are always pushing boundaries.

Remember, playing a fast-paced shooter like Quake on a Pentium III with a Voodoo 3 at a maximum of 30 FPS was once considered very good.
Obligatory "don't forget that 30fps looks a lot better on a CRT" reporting in
Strange, I play the aforementioned games all the time and I never dip into the 30 FPS range, because my computer was built for that and it is not a potato that people claim is a supercomputer.
Dredd Mar 24, 2024 @ 11:51am 
Originally posted by Eagle_of_Fire:
Originally posted by thedreddnought:
Obligatory "don't forget that 30fps looks a lot better on a CRT" reporting in
Strange, I play the aforementioned games all the time and I never dip into the 30 FPS range, because my computer was built for that and it is not a potato that people claim is a supercomputer.
Uh, good for you? I guess?
Did you think I was disagreeing or something?
Bunkerotter Mar 24, 2024 @ 11:53am 
Originally posted by thedreddnought:
Obligatory "don't forget that 30fps looks a lot better on a CRT" reporting in
You obviously didn't get the point I was trying to make.
20+ years ago, 30 FPS was considered smooth.
10 years ago, 60 FPS was considered smooth.
Nowadays people are spoiled and want 120+ FPS.

Which you can get, but it requires very good parts, just as it did 10 years ago to get 60 FPS at max settings, or 20+ years ago to get 30 FPS.
Eagle_of_Fire Mar 24, 2024 @ 11:53am 
Originally posted by thedreddnought:
Originally posted by Eagle_of_Fire:
Strange, I play the aforementioned games all the time and I never dip into the 30 FPS range, because my computer was built for that and it is not a potato that people claim is a supercomputer.
Uh, good for you? I guess?
Did you think I was disagreeing or something?
How is complaining about FPS on outdated hardware, at frame rates most people would never even consider using, not negative?
Last edited by Eagle_of_Fire; Mar 24, 2024 @ 11:54am
DodgeZero Mar 24, 2024 @ 11:54am 
Ryzen 5600X, Nvidia 3070 Ti

I had 90-100% CPU usage and only 50-60% GPU usage, and what helped were settings in the Nvidia driver.

Threaded optimisation allows the driver to offload certain GPU-related processing tasks as separate thread(s) on available CPU cores. That's fine in GPU-bound scenarios, but a nightmare when the CPU is already at 90-100% usage.


In the Nvidia Control Panel, turning off threaded optimisation (from Auto) and setting power management mode to Prefer Maximum Performance (from Normal) decreased CPU utilisation by approximately 20% and increased GPU utilisation by around 15%.

Considering I was hitting close to 100% on the CPU, the improvement is noticeable: less stutter and hitching, and generally more consistent and higher frame rates.

Give it a try.
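If you want to check whether you're in the same boat before changing driver settings, here's a rough sketch (my own illustration, not something from the game or the driver) that samples CPU and GPU utilisation while you play. It assumes the third-party psutil package is installed and that nvidia-smi is on your PATH.

import subprocess
import psutil

def gpu_utilisation_percent():
    # nvidia-smi prints one integer per GPU with these flags
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return max(int(line) for line in out.splitlines() if line.strip())

if __name__ == "__main__":
    for _ in range(30):  # sample for roughly 30 seconds while in a mission
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        busiest_core = max(per_core)   # one pegged thread can bottleneck a game
        gpu = gpu_utilisation_percent()
        hint = ""
        if busiest_core > 90 and gpu < 70:
            hint = "likely CPU-bound"
        elif gpu > 95:
            hint = "likely GPU-bound"
        print(f"busiest core {busiest_core:5.1f}%  GPU {gpu:3d}%  {hint}")

If the busiest core sits near 100% while the GPU idles, driver tweaks like the ones above (or lowering CPU-heavy settings) are where the gains are.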



Greetz
Last edited by DodgeZero; Mar 24, 2024 @ 11:55am
Dredd Mar 24, 2024 @ 11:55am 
Originally posted by Bunkerotter:
Originally posted by thedreddnought:
Obligatory "don't forget that 30fps looks a lot better on a CRT" reporting in
You obviously didn't get the point I was trying to make.
20+ years ago, 30 FPS was considered smooth.
10 years ago, 60 FPS was considered smooth.
Nowadays people are spoiled and want 120+ FPS.

Which you can get, but it requires very good parts, just as it did 10 years ago to get 60 FPS at max settings, or 20+ years ago to get 30 FPS.
JFC you people are insane. I was just adding to the conversation. Excuse me for raining on your little parade or whatever.

And yes, I understood your point. You're not that complicated lol
Dredd Mar 24, 2024 @ 11:55am 
Originally posted by Eagle_of_Fire:
Originally posted by thedreddnought:
Uh, good for you? I guess?
Did you think I was disagreeing or something?
How is complaining about FPS on outdated hardware, at frame rates most people would never even consider using, not negative?
Where did I complain?
Doko Mar 24, 2024 @ 11:58am 
Okay, so I've noticed people have suggested things like running the game in DX11 mode, and while that normally works for them, for me it did not work at all.

That was until last night, when I discovered something while testing between DX11 and DX12 and purging the shader caches for each test. Your user config file is located near where the shader cache is found (a few folders before it).

Open the user config file and scroll down, or use Ctrl+F and search for "Async compute", and you will find it is set to true.

Now, I own a GTX 1080 Ti, and ever since async compute became a thing, it has done nothing but hamper my overall performance. I have to root around in some games' user configs, because some games these days do not outright give you the option to toggle it off in-game.

So I've had it toggled off since last night, and so far I've been able to reach my FPS cap of 60 without getting stupidly heavy dips. I'll keep doing more testing as time goes on, but for now I do know async compute is a common issue for my card (it has always been an issue for my card in every game that has it enabled).

So for those of you running in DX11 mode, go to your user config file and make sure async compute is set to false, not true.
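If you'd rather not hand-edit the file, here's a small, hypothetical helper that flips the value for you. The path below is a placeholder (the exact location isn't given above, only that it sits a few folders before the shader cache), and the key name in your config may differ slightly from "async_compute", so check before running; it writes a .bak copy first.

import re
from pathlib import Path

# Placeholder path: point this at your actual user config file.
CONFIG = Path(r"C:\path\to\your\user_settings.config")

text = CONFIG.read_text(encoding="utf-8")
CONFIG.with_suffix(".bak").write_text(text, encoding="utf-8")  # backup first

# Flip an "async compute"-style key from true to false, case-insensitively.
new_text, count = re.subn(r'(?i)(async[ _]?compute"?\s*[:=]\s*)true',
                          r'\1false', text)
if count:
    CONFIG.write_text(new_text, encoding="utf-8")
    print(f"Disabled async compute ({count} occurrence(s) changed).")
else:
    print("No async compute setting found; nothing changed.")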

Apart from that hurdle, I'm really getting irked by the radio silence from the devs when it comes to overall performance optimisation. Most games tend to have perf fixes doled out within the first few weeks, but with HD2 it has now been a month and a bit, and we've had no word on perf improvements of any sort (excluding P2P server fixes).
Last edited by Doko; Mar 24, 2024 @ 12:00pm
Doko Mar 24, 2024 @ 12:06pm 
Originally posted by AKADodgeZero:
Ryzen 5600X, Nvidia 3070 Ti

I had 90-100% CPU usage and only 50-60% GPU usage, and what helped were settings in the Nvidia driver.

Threaded optimisation allows the driver to offload certain GPU-related processing tasks as separate thread(s) on available CPU cores. That's fine in GPU-bound scenarios, but a nightmare when the CPU is already at 90-100% usage.


In the Nvidia Control Panel, turning off threaded optimisation (from Auto) and setting power management mode to Prefer Maximum Performance (from Normal) decreased CPU utilisation by approximately 20% and increased GPU utilisation by around 15%.

Considering I was hitting close to 100% on the CPU, the improvement is noticeable: less stutter and hitching, and generally more consistent and higher frame rates.

Give it a try.



Greetz


Yeah, people who post those Nvidia Control Panel "best settings" videos tend to forget how threaded optimisation actually works and what it means for games where you're CPU-bound. Most flat out ignore mentioning what it can do, but they'll still say "leave it on". Same goes for gamma correction and people claiming you should force 16x AF, when you're basically asking for anisotropic filtering to be applied twice, which adds load you really didn't need (if you do force AF via the control panel, disable it in-game at the very least).

Also, as in my post above, people with older cards like mine really need to check whether async compute is enabled, because on my 1080 Ti it actually gives worse performance rather than an uplift like it does on the latest models. Disabling it helped me too.
Last edited by Doko; Mar 24, 2024 @ 12:08pm
