Graphical effects are some of the slimmest, most optimized scripts out there, many running on the GPU; where they do use the CPU, it's for calculating really primitive data, maybe caching an average over the past few frames asynchronously to GPU rendering.
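To make that concrete, here's a toy sketch (my own illustration, nothing from this game's code) of the kind of cheap per-frame CPU work I mean: a rolling average over the last few frames' values, updated once per frame and handed off to the renderer as a single float.

[code]
// Illustrative sketch only: the kind of "primitive data" a CPU might
// compute for a GPU effect -- a rolling average of the last few frames'
// values (e.g. scene luminance), consumed by the renderer as one float.
#include <array>
#include <cstddef>
#include <cstdio>

class RollingAverage {
public:
    void push(float sample) {
        sum_ -= buffer_[head_];            // drop the oldest sample
        buffer_[head_] = sample;           // overwrite it with the newest
        sum_ += sample;
        head_ = (head_ + 1) % buffer_.size();
        if (count_ < buffer_.size()) ++count_;
    }
    float value() const { return count_ ? sum_ / count_ : 0.0f; }

private:
    std::array<float, 8> buffer_{};        // last 8 frames
    float sum_ = 0.0f;
    std::size_t head_ = 0, count_ = 0;
};

int main() {
    RollingAverage avg;
    for (float v : {0.2f, 0.4f, 0.6f}) avg.push(v);
    std::printf("smoothed: %.2f\n", avg.value());  // prints 0.40
}
[/code]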
It's not at all comparable to the complex math that AI uses. I find the AI of THESE bugs to be far superior to your old-school horde bugs (which are still more expensive than a particle effect). You're just being obtuse, and you obviously don't understand hardware/rendering.
I get the same FPS at 1440p, 100% res scale, ultra settings on a 3080 Ti, but I'm on a 5800X3D. Set your game up to render 4K native, and if the lowest you dip is 80 FPS, that isn't bad for a PvE game with loads of relatively complex AI.
I'm sorry that you're this wrong, but you've literally got it completely backwards.
Ideally your GPU should NEVER be waiting on the CPU, which means it's free to churn out frames as fast as it possibly can, and that results in 100% GPU usage.
In other words, you WANT your GPU usage to be higher than your CPU usage. The wording is a bit off here, though, since usage and bottlenecks aren't intrinsically the same thing: both could be at "100% usage" and the GPU still wouldn't be limited by the CPU. But generally speaking, a layperson could think that way and be correct 90% of the time.
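Here's a rough sketch of that usage-vs-bottleneck distinction (the numbers and the FrameTimings struct are made up; real code would pull these from a profiler or the graphics API's timestamp queries): whichever side takes longer per frame is the actual limit, regardless of what a usage percentage in a monitoring tool says.

[code]
// Hedged sketch: usage vs. bottleneck. The FrameTimings values below are
// placeholders -- in practice you'd read them from a profiler or from the
// graphics API's timestamp queries.
#include <cstdio>

struct FrameTimings {
    double cpu_ms;  // time the CPU spent preparing the frame
    double gpu_ms;  // time the GPU spent rendering it
};

const char* bottleneck(const FrameTimings& t) {
    // Whichever side takes longer per frame sets the framerate,
    // no matter what the "usage" percentages look like.
    return t.cpu_ms > t.gpu_ms ? "CPU-bound" : "GPU-bound";
}

int main() {
    FrameTimings t{12.0, 8.0};  // hypothetical frame
    double limit = t.cpu_ms > t.gpu_ms ? t.cpu_ms : t.gpu_ms;
    std::printf("%s, ~%.0f fps\n", bottleneck(t), 1000.0 / limit);  // CPU-bound, ~83 fps
}
[/code]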
Also, the GPU never does any game calculations, like, ever. It's completely unheard of. The reason is probably beyond the scope of this conversation, but basically GPUs are good at calculating loads of relatively simple math (even on large numbers) that relies on a minimal number of variables and doesn't need to talk across threads.
The CPU is significantly better at crunching complex, long-chain operations that require information from previous (or concurrent) calculations and variables, such as physics or game logic. These kinds of calculations don't lend themselves well to multi-threading.
GPU - math that can be trivially multi-threaded (slower on long operations, but it can compute MANY MANY smaller operations at once)
CPU - math that intrinsically cannot be multi-threaded, or that would be overly complex to multi-thread (plus it's really strong for extremely long operations)
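A toy illustration of that split (my own example, not anything from Helldivers 2): the first loop is embarrassingly parallel, every element is independent, which is exactly the kind of work a GPU eats for breakfast; the second is a dependent chain where each step needs the previous result, so parallel hardware doesn't help and a fast CPU core does.

[code]
// Toy example of the GPU/CPU split (mine, not from any real game).
#include <cstdio>
#include <vector>

int main() {
    // GPU-friendly: independent per-element math. Each iteration could be
    // its own shader thread, because no element depends on any other.
    std::vector<float> brightness(1'000'000, 0.5f);
    for (float& b : brightness)
        b = b * b;

    // CPU-friendly: a serial dependency chain, like stepping physics over
    // time. Step N needs step N-1's result, so it cannot be parallelized.
    float velocity = 0.0f, position = 0.0f;
    const float dt = 0.016f, accel = 9.8f;
    for (int step = 0; step < 1000; ++step) {
        velocity += accel * dt;     // depends on the previous velocity
        position += velocity * dt;  // depends on the velocity just computed
    }
    std::printf("position after 1000 steps: %.1f\n", position);
}
[/code]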
Anyway, game logic is done on the CPU and that's always the case.
Bro you know literally nothing about this subject. Please stop.
I don't want to insult you, but it's obvious there are blatant misunderstandings at play here regarding the computing/rendering of games.
I enjoy the game, but I also enjoy well-optimized games, and I despise unoptimized games. This game runs fine for me: 90-100+ FPS at 1440p, 100% res scale, max graphics, while playing as the host, on a 5800X3D (an admittedly strong CPU, to be sure) and a 3080 Ti (same deal).
I'm not left wanting for more FPS on my high-refresh-rate display with my configuration, but considering that this game clearly performs WAY more complex CPU calculations than something like Battlefield, I could understand someone being upset if their relatively weak CPU can't keep up.
There's a reason PvE shooters of this complexity (you say the AI isn't complex, but it really is WAY better than the AI in something like Killing Floor, which is your basic horde AI) haven't been a thing for so long: compared to the GPU horsepower of today, CPU strength really hasn't kept up.
Managing everything a game like this requires, even if flawlessly optimized, is intrinsically demanding on the CPU and, critically, on CPU cache, which is why games LOVE AMD's 3D stacked cache. It sucks to hear, but I believe the game IS optimized; it's just intrinsically demanding. The only way it will run better on your setup is, unfortunately, if you upgrade it.
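If you want a feel for why cache matters so much, here's a toy benchmark (mine, nothing to do with the game's actual code): both loops do identical arithmetic, but the scattered-access version can miss cache on nearly every element, which is roughly the situation game logic is in when it's chasing entities all over memory.

[code]
// Toy cache benchmark (assumption-laden; numbers will vary by machine):
// identical arithmetic, but the second pass accesses memory in a random
// order, so it can miss cache on nearly every element.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = 1 << 22;  // ~4M ints
    std::vector<int> data(n, 1), order(n);
    std::iota(order.begin(), order.end(), 0);
    std::shuffle(order.begin(), order.end(), std::mt19937{42});

    auto run = [](const char* label, auto&& body) {
        auto t0 = std::chrono::steady_clock::now();
        long sum = body();
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%s: sum=%ld, %lld us\n", label, sum, (long long)us);
    };

    // Sequential pass: prefetcher-friendly, mostly cache hits.
    run("sequential", [&] { long s = 0; for (std::size_t i = 0; i < n; ++i) s += data[i]; return s; });
    // Scattered pass: same work, but pointer-chasing-style access.
    run("scattered ", [&] { long s = 0; for (std::size_t i = 0; i < n; ++i) s += data[order[i]]; return s; });
}
[/code]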
I'd recommend any of the 3D cache CPUs. Being an AMD fanboy at heart, I'd recommend Ryzen: either the 5800X3D that I use or the new 7XXXx3D variants. As games actually start to utilize the CPU to enhance gameplay, we're going to need better CPUs.
My 5800X3D has no problems, and by today's standards it's hardly some space-age, uber-expensive part; there are even better/newer options.
In any case, even if the pathfinding in Helldivers 2 is - for the time being - a bit more complicated than in most games, it's not going to struggle on an 8-core with shared cache (never mind at 4.5 GHz).
But knowing Sony, they have already noted that "many players are complaining about high CPU usage" and sent a "request" to Arrowhead, threatening to pull all support for the game if they don't axe another feature that "no one but you cares about"...
So thank you, all of you noisy guys who don't qualify anything you say, and generally just blurt out the first and best thought that comes through, while trying to inflate its importance by making it sound generally true for All Gamers(tm). Really great.
Because every other person who plays the game but doesn't spend hours noising on a forum about some issue that is probably extremely specific to your computer will of course thank you from the bottom of their hearts when 90% of the "AI" in Helldivers 2 is replaced with a homing beacon, and all the patrols turn up right on top of you to avoid potential pathing issues -- so that the game finally turns into a mindless horde shooter.
Clearly that's what all gamers want. And no one can tell the difference if it's not, because BLALRRRRTGHGHHGHGHG INTERNET NOISE BLAARGH
Of course there are; most games that prioritize optimization avoid complex CPU calculations like the plague, and I do the same on my own game projects. In such a scenario the CPU is left with only the most basic tasks, so of course the majority of its workload will be those simple calculations. And/or they could mess something up and make an unreasonably expensive draw thread, which could just be inefficient on top of that.
I agree with you here; the game runs fine on my system and on others I've seen with modern CPUs. As for your concern about Sony making cuts, I'm not worried, because developers/studios RARELY care that much about performance.
^
I wish they would in most cases, even if in this case I think it's fine, and I really wouldn't want the AI to be dumbed down.
But yeah, they DGAF about people complaining about performance. They never do, sadly.
The other games barely touch the CPU; we already said this. If you bothered to read those walls of text, you'd know this.
I guess your brain works like a GPU: not the best when it comes to long-form operations...
(I hope you take that as a light hearted joke)
PS: I want to stress this again: high CPU usage doesn't mean the game isn't optimized, and it doesn't even mean it will run badly on your system. In a perfect world, CPU and GPU usage would be 100% all the time, because that just means the game is leveraging your hardware. It's not indicative of low or high framerate.
^Probably because for over a decade games have usually failed to use any more than 4 cores, resulting in maybe 50% CPU usage at max load even if the individual cores are maxed out and can't keep up.
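The arithmetic behind that ~50% figure, with assumed numbers:

[code]
// Assumed numbers: 8 logical CPUs, of which the game saturates 4 threads.
#include <cstdio>

int main() {
    const int logical_cpus = 8;
    const int saturated    = 4;  // threads the game actually maxes out
    // The OS averages over all CPUs, so the reported figure is ~50%
    // even though the game is completely CPU-limited.
    std::printf("reported usage: %d%%\n", 100 * saturated / logical_cpus);
}
[/code]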
Even though it's actually just an indicator that a video game is finally leveraging his 8 cores/16 threads to do something (in this case at a high framerate, which is great news). If your GPU is waiting around, why not increase the render resolution slider to at least native to give it something to do? If you have the headroom and the FPS is already in high-refresh-rate territory...
I'm just saying that Arrowhead might have wanted node generation for mobs, pathing for patrols, and a more immediate type of client update than what is typically chosen. Helldivers 1 had that. Magicka certainly had that. And I'd go so far as to say that the previous games, and Helldivers 2, would not be the same games if they used simpler node generation.
Because without that, nothing would ever happen in one game that didn't happen in the last game. There'd be no uncertainty when playing. You would never be surprised. You would know exactly what was going to happen, and the game would be extremely boring.
But you're still suggesting, completely without any qualifications, that the game should have features removed because certain things are simply "inefficient"?
Well, I've been in Sony productions (and EA productions) where feedback like this was picked up on (by Baskin's colleagues), not really evaluated technically, and handed to the developers as justification for removing large parts of a meticulously set-up algorithm budget.
As you say, they don't care about the performance, or the game - but what they do care about is that there is one noisy guy on the internet who is giving the game bad "buzz", or that there are potential customers out there that might be dissuaded from getting the game. They also argue, very specifically, that if there is just one player who can't play the game - then that's a good reason to change the game if it is at all possible.
And if the developer then says: well, you know, that's going to be a lot of work, and it's going to make the game really bad and boring -- they're going to say: but here's a bunch of money, and a 2-year "support" agreement. We just want to "make your game better". And the developer will do it. If they don't, which has happened more than once, the publisher starts threatening to remove support, because they will hear from people like Baskin and Musterbuster (who also work for other publishers than Sony) that the developer is making their job difficult.
You see the setup very clearly now, with a studio that is celebrated because they have a creative vision and just followed it. I've sat and talked to developers in and out of house at Sony before, and watched every publisher-spawn applaud them every step of the way. Until the issues start turning up that conflict with demands like what I described: they want the game to have no visible lag when playing with people from Mars; they want everyone to be able to connect to every game. They are basically asking for impossible things.
And when the developer says things like: look, we knew what we were doing, and there is no good fix to remove internet lag - are you insane? You can't magick away internet lag. Or: the amount of effects and updates we pushed between very few clients was comparatively high, because we were expecting the small lobby to be between somewhat local hosts on the same continent, at least, and slimming down the event update pool is not really something we want to do. When that happens - which it has, multiple times before - then the publisher will suddenly not be happy about the independent vision of the studio any more.
And you need to understand that this problem comes to a massive head when the game becomes popular. Because corporate HQ are going to genuinely believe that if the developer doesn't "fix their game", then they are going to lose money. They just don't understand that they may only have had a hit because of these non-standard solutions the developer used, which made the game stand out. They just don't get that.
Which is not strange, because argumentation like what you presented just now is rampant, even from developers at large and successful studios: you don't know what solution they chose. But you still say, without reservation, that there are ways to make things more efficient. And you even say that you think they should be doing it, even though you don't know what that would actually cause the game to look like. It's just not part of the discussion.
But it should be. And you would agree with that if you at all had any interest in what games and games-development could be like, in spite of what the industry in general has become.
Yes, your video card is snoozing, but that's artificially induced by your frame cap. Given how powerful a 4090 is, your video card probably still wouldn't be going full tilt even without that frame cap, as it would likely get bottlenecked by the CPU.
However the 5800X3D should vastly outperform a 7800XT in HD2.
On the ship, with a mix of med/high settings at 1440p, the 5800X3D has a main/render time of ~5.5 ms, meaning it can drive well in excess of 150 fps (1000 ms / 5.5 ms ≈ 180 fps). There is no way his video card is snoozing unless his settings are in potato mode, he's using FSR, or probably both - and it's not likely that he's frame capping, because his CPU utilization is so high...
Even if he's reading his 1-thread maximum usage as his whole CPU utilization, his video card should be struggling to keep pace with his CPU.
*addition* It's possible he is vsync'd and reading his 1-thread maximum to produce those kinds of utilization numbers.
It's not intrinsically bad, nor is it an indicator of poor or good framerate.
I have a 5800X3D, same as the OP, and the game runs great at 80-100 FPS as the host with a 3080 Ti at max settings, native res scale, 1440p; his GPU should be more or less comparable. I have to assume the OP has decent framerates...