HELLDIVERS™ 2

BFG 9000 Apr 15, 2024 @ 6:53am
the game is way too heavy on the CPU
I've never in my life seen a game that uses so much of the CPU.
My CPU is at 90% while my GPU is around 60-80%...
Devs, please do something about this...
5800X3D / 7800XT
Patriot03 Apr 18, 2024 @ 12:50pm 
Originally posted by Deep Rock Galactic:

Lol, you think the server handles everything in BF? Almost all the demanding visual calculations are on the client. You guys are coping hard.

Also, AI is dirt cheap. The AI of the bugs is not real AI. It's not machine learning. It's basic scripting that makes them rush right at you...


Graphical effects are some of the slimmest, most optimized code out there, and much of it runs on the GPU. Where effects do use the CPU, it's for calculating really primitive data, maybe caching an average over the past few frames asynchronously to GPU rendering.


It's not at all comparable to the complex math that AI uses. I find the AI of THESE bugs far superior to your old-school horde bugs (which are still more expensive than a particle effect). You're just being obtuse, and you obviously don't understand hardware/rendering.
Patriot03 Apr 18, 2024 @ 12:53pm 
Originally posted by Q I | I X:
Whatever the clown factory in Sweden is coding, it's funny that you don't get anything past 80-110 FPS with a 5950X and a 4090. I tried setting everything to low, plus low res, plus the ultra performance upscaler. Guess what: same marginal low FPS. Get your freakin' game together

I get the same FPS at 1440p, 100% res scale, ultra settings on a 3080 Ti, but I'm on a 5800X3D. Set your game up to render 4K native, and if the lowest you drop to is 80 FPS, that isn't bad for a PvE game with loads of relatively complex AI.
Patriot03 Apr 18, 2024 @ 1:01pm 
Originally posted by Deep Rock Galactic:

No one here is arguing that nothing should be at 100%. The issue is that the CPU is disproportionately at 100%. A game that uses more GPU than CPU usually means terrible optimization. The GPU can literally do most of the calculations.


I'm sorry that you're this wrong, but you've literally got it completely backwards.


Preferably and ideally, your GPU should NEVER be waiting on the CPU, which means it should be free to churn out frames as fast as it possibly can; that results in 100% GPU usage.

Which means you WANT your GPU usage to be higher than your CPU usage. The wording is slippery here, though, since usage and bottlenecks aren't intrinsically the same thing: both could be at "100% usage" and the GPU still wouldn't be limited by the CPU. But generally speaking, a layperson could think of it that way and be correct 90% of the time.

Also, the GPU never does any game-logic calculations, like, ever. It's basically unheard of. The full reason is beyond the scope of this conversation, but in short: GPUs are good at calculating loads of relatively simple math (even on large numbers) that relies on a minimal number of variables that don't need to talk to each other across threads.

The CPU is significantly better at crunching complex, long-chain operations that require information from previous (or concurrent) calculations, such as physics or game logic. These kinds of calculations don't lend themselves well to multi-threading.


GPU - math that can be trivially multi-threaded (slower on long serial operations, but it can compute MANY, MANY smaller operations at once)

CPU - math that intrinsically cannot be multi-threaded, or that would be overly complex to multi-thread (plus it's really strong at extremely long operations)


Anyway, game logic is done on the CPU and that's always the case.
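
To make that split concrete, here is a minimal, hypothetical C++ sketch of the two workload shapes being described (function names invented for illustration; this is not the game's actual code). The first loop has no cross-element dependencies, so it maps naturally onto a GPU's thousands of threads; the second is a dependency chain where every step needs the previous result, which is exactly the kind of work that stays on the CPU:

```cpp
#include <cstdio>
#include <vector>

// GPU-shaped work: every element is independent, so the loop can be
// split across thousands of threads with no coordination at all.
void scale_particles(std::vector<float>& xs, float k) {
    for (float& x : xs) x *= k;  // no element depends on any other
}

// CPU-shaped work: each step consumes the result of the previous one,
// so the chain has to run serially on a single fast core.
float integrate_velocity(float v, float accel, float drag, int steps, float dt) {
    for (int i = 0; i < steps; ++i)
        v += (accel - drag * v) * dt;  // step i needs the v from step i-1
    return v;
}

int main() {
    std::vector<float> positions(1'000'000, 1.0f);
    scale_particles(positions, 1.5f);  // trivially parallelizable
    float v = integrate_velocity(0.0f, 9.8f, 0.3f, 1000, 0.016f);  // inherently serial
    std::printf("pos[0] = %.2f, v = %.2f\n", positions[0], v);
}
```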
Last edited by Patriot03; Apr 18, 2024 @ 1:05pm
Patriot03 Apr 18, 2024 @ 1:02pm 
Originally posted by Deep Rock Galactic:

Wrong. "AI" of this basic level is far less demanding than anything graphical.

Bro you know literally nothing about this subject. Please stop.
dyra55 Apr 18, 2024 @ 1:13pm 
To be fair, Helldivers is one of the few games that actually uses all 20 threads of my old-ass CPU, so at least it has that going for it
Last edited by dyra55; Apr 18, 2024 @ 1:13pm
Patriot03 Apr 18, 2024 @ 1:16pm 
Originally posted by Deep Rock Galactic:
Telling me things I already know in needless detail does not make you look smarter when, in the end, you form your opinion based on your emotional attachment to this game.

I know how scripts work. I know what rendering is. I know what this game has, and with all of that I am telling you, as fact, that none of its performance issues come from any complexity or advanced calculation that doesn't exist in other games. None.

The ONLY factors in this are human talent and deadlines.

I don't want to insult you, but it's obvious there are blatant misunderstandings at play here regarding the computing/rendering of games.



I enjoy the game, but I also enjoy well-optimized games and I despise unoptimized games. This game runs fine for me: 90-100+ FPS at 1440p, 100% res scale, max graphics, while playing as the host, on a 5800X3D (an admittedly strong CPU, to be sure) and a 3080 Ti (same deal).


I'm not left wanting for more FPS on my high-refresh-rate display with my configuration, but considering that this game clearly performs WAY more complex CPU calculations than something like Battlefield, I can understand someone being upset if their relatively weak CPU can't keep up.


There's a reason PvE shooters of this complexity (you say the AI isn't complex, but it really is WAY better than the AI in something like Killing Floor, which is your basic horde AI) haven't been a thing for so long: compared to today's GPU horsepower, CPU strength really hasn't kept up.

Managing everything a game like this requires, even if flawlessly optimized, is intrinsically demanding on the CPU, and critically on CPU cache, which is why games LOVE AMD's 3D stacked cache. It sucks to hear, but I believe the game IS optimized; it's just intrinsically demanding. The only way it will run better on your setup, unfortunately, is if you upgrade it.
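
The cache point is easy to demonstrate. Here is a generic micro-benchmark sketch (not from the game; just illustrating why a large L3 such as AMD's stacked cache helps scattered-access workloads): summing the same array in sequential versus shuffled order does identical arithmetic, but the shuffled pass defeats the prefetcher and misses cache on nearly every access once the working set exceeds L3:

```cpp
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t N = 1u << 24;          // 16M ints = 64 MB, bigger than most L3 caches
    std::vector<int> data(N, 1);
    std::vector<std::size_t> order(N);
    std::iota(order.begin(), order.end(), 0);

    auto timed_sum = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        long long sum = 0;
        for (std::size_t i : order) sum += data[i];
        long long ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                           std::chrono::steady_clock::now() - t0).count();
        std::printf("%-10s %lld ms (sum = %lld)\n", label, ms, sum);
    };

    timed_sum("in order:");                   // prefetcher-friendly, mostly cache hits
    std::shuffle(order.begin(), order.end(), std::mt19937{42});
    timed_sum("shuffled:");                   // same work, nearly every access misses cache
}
```

On typical hardware the shuffled pass runs several times slower, and a bigger cache narrows that gap, which is the whole pitch of the X3D parts.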


I'd recommend any of the 3D-cache CPUs. Being an AMD fanboy at heart, I'd recommend Ryzen: either the 5800X3D that I use or the newer 7000X3D variants. As games actually start to utilize the CPU to enhance gameplay, we're going to need better CPUs.


My 5800X3D has no problems; by today's standards it's hardly some space-age, uber-expensive part, and there are even better/newer options.
Last edited by Patriot03; Apr 18, 2024 @ 1:29pm
retroquark Apr 18, 2024 @ 1:16pm 
Originally posted by Patriot03:
Originally posted by Deep Rock Galactic:

Wrong. "AI" of this basic level is far less demanding than anything graphical.

Bro you know literally nothing about this subject. Please stop.
...I mean, it's not wrong. It depends on how things are programmed: you can make pathing that is very heavy, but most games don't do that; it comes down to creative choices. Pathfinding and node generation generally don't take a lot of resources. There are lots of games where (continuously) preparing graphics operations is massively heavier on the CPU than node generation and pathfinding.

In any case, even if the pathfinding in Helldivers 2 is, for the time being, a bit more complicated than in most games, it's not going to struggle on an 8-core with shared cache (never mind at 4.5 GHz).
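
For a sense of scale (nobody outside Arrowhead knows HD2's actual navigation code, so this is a toy stand-in): a breadth-first search across an open 256x256 grid touches at most ~65k nodes per query, which is on the order of a millisecond at worst on any modern core:

```cpp
#include <cstdio>
#include <queue>
#include <vector>

// BFS on an open grid: cost is bounded by the number of cells touched.
int bfs_path_length(int w, int h, int sx, int sy, int tx, int ty) {
    std::vector<int> dist(w * h, -1);
    std::queue<int> q;
    dist[sy * w + sx] = 0;
    q.push(sy * w + sx);
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!q.empty()) {
        int cur = q.front(); q.pop();
        int x = cur % w, y = cur / w;
        if (x == tx && y == ty) return dist[cur];
        for (int d = 0; d < 4; ++d) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            if (dist[ny * w + nx] != -1) continue;  // already visited
            dist[ny * w + nx] = dist[cur] + 1;
            q.push(ny * w + nx);
        }
    }
    return -1;  // unreachable
}

int main() {
    // One corner-to-corner query across a 256x256 field: tens of thousands
    // of node visits, microseconds-to-milliseconds of CPU work.
    std::printf("path length: %d\n", bfs_path_length(256, 256, 0, 0, 255, 255));
}
```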

But knowing Sony, they have already noted that "many players are complaining about high CPU usage" and sent a "request" to Arrowhead threatening to pull all support for the game if they don't axe another feature that "no one but you cares about"...

So thank you, all of you noisy guys who don't qualify anything you say and generally just blurt out the first thought that comes through, while trying to inflate its importance by making it sound generally true for All Gamers(tm). Really great.

Because every other person who plays the game but doesn't spend hours making noise on a forum about some issue that is probably extremely specific to your computer will of course thank you from the bottom of their hearts when 90% of the "AI" in Helldivers 2 is replaced with a homing beacon and all the patrols turn up right on top of you to avoid potential pathing issues -- so that the game finally turns into a mindless horde-shooter.

Clearly that's what all gamers want. And no one can tell the difference if it's not, because BLALRRRRTGHGHHGHGHG INTERNET NOISE BLAARGH
Patriot03 Apr 18, 2024 @ 1:25pm 
Originally posted by retroquark:
Originally posted by Patriot03:

Bro you know literally nothing about this subject. Please stop.
...I mean, it's not wrong. It depends on how things are programmed: you can make pathing that is very heavy, but most games don't do that; it comes down to creative choices. Pathfinding and node generation generally don't take a lot of resources. There are lots of games where (continuously) preparing graphics operations is massively heavier on the CPU than node generation and pathfinding.

Of course there are. Most games that prioritize optimization avoid complex CPU calculations like the plague; I've done the same on my own game projects. In such a scenario the CPU is left with only the most basic tasks, so of course the majority of its workload will be those simple calculations. And/or they could mess something up and make an unreasonably expensive draw thread, which would just be inefficient on top of that.

Originally posted by retroquark:
In any case, even if the pathfinding in Helldivers 2 is, for the time being, a bit more complicated than in most games, it's not going to struggle on an 8-core with shared cache (never mind at 4.5 GHz).

But knowing Sony, they have already noted that "many players are complaining about high CPU usage" and sent a "request" to Arrowhead threatening to pull all support for the game if they don't axe another feature that "no one but you cares about"...

So thank you, all of you noisy guys who don't qualify anything you say and generally just blurt out the first thought that comes through, while trying to inflate its importance by making it sound generally true for All Gamers(tm). Really great.

Because every other person who plays the game but doesn't spend hours making noise on a forum about some issue that is probably extremely specific to your computer will of course thank you from the bottom of their hearts when 90% of the "AI" in Helldivers 2 is replaced with a homing beacon and all the patrols turn up right on top of you to avoid potential pathing issues -- so that the game finally turns into a mindless horde-shooter.

Clearly that's what all gamers want. And no one can tell the difference if it's not, because BLALRRRRTGHGHHGHGHG INTERNET NOISE BLAARGH

I agree with you here; the game runs fine on my system and on others I've seen with modern CPUs. As for your concern about Sony making cuts, I'm not worried, because developers/studios RARELY care that much about performance.
^
I wish they would in most cases, even if in this case I think it's fine and I really wouldn't want the AI to be dumbed down.

But yeah, they DGAF about people complaining about performance. They never do. Sadly.
Last edited by Patriot03; Apr 18, 2024 @ 1:26pm
Patriot03 Apr 18, 2024 @ 1:30pm 
Originally posted by Deep Rock Galactic:
Stop the walls of text; no one will read that. I already answered you: the AI in this game is nothing special and does not warrant the performance cost. The performance drops come from bad optimization. Writing 1 paragraph or 20 won't change that, because your argument collapses the moment you compare this to ANY other game.

Other games barely touch the CPU; we already said this. If you'd bothered to read those walls of text, you'd know this.

I guess your brain works like a GPU: not the best when it comes to long-form operations...
(I hope you take that as a light-hearted joke)




PS: I want to stress this again: high CPU usage doesn't mean the game isn't optimized, and it doesn't even mean it will run badly on your system. In a perfect world, CPU and GPU usage would be 100% all the time, because that just means the game is leveraging your hardware. Usage is not indicative of low or high framerate.
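
A toy model of that point (simplified; it assumes the CPU and GPU stages of a frame overlap, as they do in any buffered pipeline): three scenarios with very different usage numbers, all delivering the identical framerate.

```cpp
#include <algorithm>
#include <cstdio>

// In a buffered frame pipeline, CPU work and GPU work overlap, so the
// frame time is set by whichever side is slower (the bottleneck), and
// the other side simply idles for the difference.
void report(const char* scenario, double cpu_ms, double gpu_ms) {
    double frame_ms = std::max(cpu_ms, gpu_ms);  // the bottleneck sets the pace
    std::printf("%-11s cpu %5.1f%%  gpu %5.1f%%  ->  %.0f FPS\n", scenario,
                100.0 * cpu_ms / frame_ms, 100.0 * gpu_ms / frame_ms,
                1000.0 / frame_ms);
}

int main() {
    report("gpu-bound:", 6.0, 10.0);   // CPU idles 40% of the frame -- 100 FPS
    report("cpu-bound:", 10.0, 6.0);   // GPU idles 40% of the frame -- 100 FPS
    report("balanced:", 10.0, 10.0);   // both pegged at 100% -- still 100 FPS
}
```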
Last edited by Patriot03; Apr 18, 2024 @ 1:34pm
BigBallinChester Apr 18, 2024 @ 1:37pm 
I love this game but ♥♥♥♥♥♥♥♥♥ it runs like a 2009 game with graphics to boot
Patriot03 Apr 18, 2024 @ 1:38pm 
The OP has complained about high CPU usage as if it were a bad thing, but having the same CPU as him, I know the game runs well on the 5800X3D. So it's someone complaining about something they don't understand, just because they see a number that they THINK is a problem.
^Probably because for over a decade games have usually failed to use more than 4 cores, resulting in maybe 50% CPU usage at max load even when the individual loaded cores are maxed out and can't keep up.

Even though it's actually just an indicator that a video game is finally leveraging his 8 cores/16 threads (in this case at a high framerate, which is great news). If your GPU is waiting around, why not raise the render resolution slider to at least native to give it something to do, provided you have the headroom and the FPS is already in high-refresh-rate territory?
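
The arithmetic behind that 50% figure, as a simplified sketch (it ignores the SMT accounting quirks real task managers have): total CPU usage is just an average over all hardware threads, so an engine that saturates only some of them reads low even while it is completely CPU-bound.

```cpp
#include <cstdio>

// Task managers report total CPU usage averaged over all hardware threads.
double total_usage(int saturated_threads, int total_threads) {
    return 100.0 * saturated_threads / total_threads;
}

int main() {
    // A classic 4-thread engine on an 8-core/16-thread CPU:
    std::printf("4 of 16 threads busy  -> %5.1f%% total\n", total_usage(4, 16));
    // The same engine with its workers spilling onto 8 hardware threads:
    std::printf("8 of 16 threads busy  -> %5.1f%% total\n", total_usage(8, 16));
    // An engine that actually spreads work across the whole part:
    std::printf("16 of 16 threads busy -> %5.1f%% total\n", total_usage(16, 16));
}
```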
Last edited by Patriot03; Apr 18, 2024 @ 1:42pm
retroquark Apr 18, 2024 @ 1:51pm 
Originally posted by Patriot03:
Of course there are. Most games that prioritize optimization avoid complex CPU calculations like the plague; I've done the same on my own game projects. In such a scenario the CPU is left with only the most basic tasks, so of course the majority of its workload will be those simple calculations. And/or they could mess something up and make an unreasonably expensive draw thread, which would just be inefficient on top of that.
Sure... but you're going back and forth between "optimizing existing code" and "removing features that make the game what it is" out of performance concerns, with no effort at all here.

I'm just saying that Arrowhead might have wanted node generation for mobs, pathing for patrols, and a more immediate type of client update than is typically chosen. Helldivers 1 had that. Magicka certainly had that. And I'd go so far as to say that the previous games, and Helldivers 2, would not be the same games if they used simpler node generation.

Because without that, nothing would happen in one session that didn't happen in the last one. There'd be no uncertainty when playing. You would never be surprised. You would know exactly what was going to happen, and the game would be extremely boring.

But you're still suggesting, completely without any qualifications, that the game should have features removed because certain things are simply "inefficient"?

Originally posted by Patriot03:
I agree with you here; the game runs fine on my system and on others I've seen with modern CPUs. As for your concern about Sony making cuts, I'm not worried, because developers/studios RARELY care that much about performance.
^
I wish they would in most cases, even if in this case I think it's fine and I really wouldn't want the AI to be dumbed down.

But yeah, they DGAF about people complaining about performance. They never do. Sadly.
Well, I've been in Sony productions (and EA productions) where feedback like this was picked up on (by Baskin's colleagues), not really evaluated technically, and handed to the developers as justification for removing large parts of a meticulously set up algorithm budget.

As you say, they don't care about the performance, or the game - but what they do care about is that one noisy guy on the internet is giving the game bad "buzz", or that there are potential customers out there who might be dissuaded from getting the game. They also argue, very specifically, that if there is just one player who can't play the game, then that's a good reason to change the game if it is at all possible.

And if the developer then says, "well, you know, that's going to be a lot of work, and it's going to make the game really bad and boring", they're going to say, "but here's a bunch of money, and a 2-year 'support' agreement. We just want to 'make your game better'." And the developer will do it. If they don't, which has happened more than once, the publisher starts threatening to remove support, because they will hear from people like Baskin and Musterbuster (who also work for publishers other than Sony) that the developer is making their job difficult.

You see the setup very clearly now, with a studio that is celebrated because they have a creative vision and just followed it. I've sat and talked with developers in and out of house at Sony before, and watched every publisher-spawn applaud them every step of the way - until issues start turning up that conflict with demands like the ones I described: they want the game to have no visible lag when playing with people from Mars; they want everyone to be able to connect to every game. They are basically asking for impossible things.

And when the developer says things like, "look, we knew what we were doing, and there is no good fix to remove internet lag - are you insane? You can't magick away internet lag," or, "the volume of effects and updates we push between a small number of clients was comparatively high because we expected the small lobby to be between somewhat local hosts, on the same continent at least, and slimming down the event update pool is not something we want to do" - when that happens, which it has, multiple times before, the publisher will suddenly not be happy about the independent vision of the studio any more.

And you need to understand that this problem comes to a massive head when the game becomes popular, because corporate HQ will genuinely believe that if the developer doesn't "fix their game", they are going to lose money. They just don't understand that they may only have had a hit because of the non-standard solutions that made the game stand out. They just don't get that.

Which is not strange, because argumentation like what you just presented is rampant, even from developers at large and successful studios: you don't know what solution they chose, but you still say, without reservation, that there are ways to make things more efficient. You even say you think they should be doing it, even though you don't know what that would actually make the game look like. It's just not part of the discussion.

But it should be. And you would agree with that if you had any interest at all in what games and games development could be like, in spite of what the industry in general has become.
Patriot03 Apr 18, 2024 @ 2:05pm 
^ I think you're confusing me with someone else. I don't want the game to cut any features, and I already consider it well optimized.
ZiggyDeath Apr 18, 2024 @ 2:06pm 
Originally posted by Tiramisu:
Originally posted by ZiggyDeath:
You are CPU-bound if your video card is effectively sleeping (read: anything other than max boost and 95% GPU utilization).
This is generally correct, but there are outliers, and HD2 happens to be one of them. Look at the clock on my GPU:
https://steamcommunity.com/sharedfiles/filedetails/?id=3225895230
It doesn't boost, at all! There is nothing wrong with the card; I've had it a year and a half and it behaves as expected in pretty much any game. I believe I had one other title that also did this, but a hundred others performed as expected.

And it's not like I'm going easy on the GPU: it's 4K, maxed out. I have it capped at 100 FPS, so it's understandable that the GPU is not pegged at 100%. Even so, it surprises me that it can hold this (yes, even in the most hectic scenes in the game) without boosting. But as long as it maintains 100 FPS at all times, IDC to investigate further.
The OP did not give specifics about his settings, so the default assumption is that he did not introduce artificial caps.

Yes, your video card is snoozing, but that's artificially induced by your frame cap. Given how powerful a 4090 is, your card likely still wouldn't be going full tilt even without that cap, as it would get bottlenecked by the CPU.

However, the 5800X3D should vastly outpace a 7800XT in HD2.

On the ship, with a mix of medium/high settings at 1440p, the 5800X3D has a main/render-thread time of ~5.5 ms, meaning it can drive well in excess of 150 FPS. There is no way his video card is snoozing unless his settings are in potato mode, he's using FSR, or both, and frame capping is unlikely given that his CPU utilization is so high...
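
That "150+" figure falls straight out of the frame time, since the CPU can deliver at most one frame per main-thread pass. A one-liner shows the arithmetic (the 5.5 ms is the number quoted above):

```cpp
#include <cstdio>

// Frame-rate ceiling implied by a measured main/render-thread time:
// the CPU can never deliver frames faster than one per main-thread pass.
int main() {
    const double main_thread_ms = 5.5;  // ~5.5 ms, as measured on the ship
    std::printf("%.1f ms -> %.0f FPS ceiling\n",
                main_thread_ms, 1000.0 / main_thread_ms);  // ~182 FPS
}
```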

Even if he's reading his single busiest thread as his whole CPU utilization, his video card should be struggling to keep pace with his CPU.

*Addition:* It's possible he is vsync'd and reading his single busiest thread, which would produce those kinds of utilization numbers.
Last edited by ZiggyDeath; Apr 18, 2024 @ 2:16pm
Patriot03 Apr 18, 2024 @ 2:12pm 
The OP didn't even tell us what his framerate was; he's literally just complaining that CPU usage is high because he thinks that's a bad thing.

It's not intrinsically bad, nor is it an indicator of poor or good framerate.

I have a 5800X3D, same as the OP, and the game runs great at 80-100 FPS as the host with a 3080 Ti at max settings, native res scale, 1440p; his GPU should be more or less comparable. I have to assume the OP has decent framerates...