The thing is, it can happen in any game if you are CPU-limited because you are still playing on a Ryzen 3600 or similar.
This is one of those "now you tell us" kinds of things. We want new, fresh stuff but don't want to hear about the downside.
This is exactly my situation now, but I actually don't play demanding games, maybe semi-demanding sometimes. In any case, I am seriously CPU-bound: i9 9900 / RTX 4070, CPU at 100%, GPU at ~60%.
I really doubt the i9-9900K gets maxed out by non-demanding games...
People should be warned. I almost made a similar mistake with my Ryzen 3950, upgrading to an NVIDIA 4090 or 4080.
So you either upgrade the whole PC with NVIDIA or go AMD if you want just the GPU.
I wish people wouldn't stress so much over utilization numbers like there's some balance they should be at. PCs don't work that way. Just enjoy the performance you have, and upgrade if you want more. Looking at utilization numbers is really only important if you're seeking an upgrade and need to find out which part will bring the performance improvement you seek.
And after all, I'm presuming you upgraded from the GTX 1080 because you wanted more performance and determined the GPU was indeed responsible for what was limiting your performance before.
You're always going to have a bottleneck. Who cares where it is? Is the performance you're getting at a level you're happy with? If so, there's no problem. If not, upgrade the CPU next. But if you're only doing it to strike some balance that doesn't exist, be prepared to stress over nothing as you constantly make hardware changes chasing something that doesn't exist.
TL;DR: Don't worry and create problems for yourself; be happy. If your new hardware is providing you with a level of performance you find acceptable, then just enjoy it. There are always going to be shortcomings with almost everything. If it's not affecting you, then don't let it rain on your own parade.
But in isolation, when you get that bottleneck info, it's kind of a jolt. But in this case, not a big deal. I mean: the most "demanding" game in my library right now is "Stray" for heaven's sake. But yes, you always want that room to expand and that's my intention here.
I don't really have any problems, but I always have MSI Afterburner running. Not because of overclocking, but just to watch my CPU and GPU temps, so if anything is wrong I can intervene before something bad happens. I've been using this system for about two months now.
About two weeks ago I saw my CPU somehow hit 600 W momentarily while I was playing Division 2. I haven't really seen any performance issue. I mean, yeah, I constantly crash in Division 2 just like everybody else, which is how I noticed it when I got back to Windows. I checked thermals and my pump, and everything seems OK. I cleaned it up and moved on. It never happened again, but I still don't know if that was an error from MSI Afterburner or if it was real. If it was real, and it does that for more than 10 seconds, my whole system could get fried. It's kind of worrying.
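For anyone who wants to sanity-check readings like that outside of Afterburner, here's a minimal polling sketch (assuming a Linux box with psutil installed and nvidia-smi on the PATH; the 350 W limit and 10-sample threshold are made-up example values). It logs CPU temperature and GPU power draw once a second and only flags power draw that stays over the limit for several consecutive samples, so a one-off 600 W blip gets treated as a probable sensor glitch rather than a real fault.
```python
# Sketch only: assumes Linux, psutil installed, and nvidia-smi available on PATH.
import subprocess
import time

import psutil

GPU_POWER_LIMIT_W = 350.0   # example value; adjust to your card's actual limit
SUSTAINED_SAMPLES = 10      # consecutive over-limit samples before we worry

def gpu_power_watts() -> float:
    # Query the GPU's reported power draw in watts via nvidia-smi.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

def cpu_temp_c() -> float:
    # Grab the first CPU temperature sensor psutil exposes (Linux only).
    for entries in psutil.sensors_temperatures().values():
        if entries:
            return entries[0].current
    return float("nan")

bad_streak = 0
while True:
    power = gpu_power_watts()
    temp = cpu_temp_c()
    print(f"CPU {temp:.0f} C  GPU {power:.0f} W")
    if power > GPU_POWER_LIMIT_W:
        bad_streak += 1
        if bad_streak >= SUSTAINED_SAMPLES:
            print("Sustained over-limit power draw -- worth investigating.")
    else:
        bad_streak = 0   # a single spike was probably a sensor glitch
    time.sleep(1)
```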
Stray really impressed me (both in gameplay and performance). I can't remember offhand the exact settings I was playing at, but my GTX 1060 handled it very well (in my case, that just means staying near or at 60 FPS at 1920 x 1200, which it did). In particular, I was shocked at the low VRAM use for how pretty it is. I remember worrying before it came out whether I'd be able to play at higher settings, based on the pictures alone; it looks like it would be a bit more demanding. I wish every game ran as well as it does.
If that's actually the most demanding thing you do, I would not at all be surprised to see something that's, what, three and a half to four and a half times my GPU not being fully utilized in that game, let alone in less demanding ones.
So I wouldn't get hung up on utilization numbers if it's giving you performance you're happy with. With the way the PC gaming market is going, games are going to need it eventually. I'm worried about Silent Hill 2; that's going to be the "big" game I'll need to have upgraded for by the time it comes out.
I would take that as an error/false reading. I'd imagine that is near impossible.
AMD used to be behind when using a weaker CPU in DX11 games due to a similar overhead issue; likewise, that's now the case for NVIDIA drivers in DX12. But this shouldn't matter as long as you have a powerful enough CPU....
I never really like those kinds of benchmarks, as they are not exactly realistic. If you have a top-end CPU and a 4090, you are going to be playing at 1440p minimum, if not 4K. I see these as more academic than real-world.
Also, another point: RAM speed on the 9000 vs. the 13000. When you are pushing higher FPS, RAM is actually a bottleneck again. Years ago, RAM was basically just a matter of having enough capacity, but we've reached a point where RAM can change FPS by quite a bit.
If I remember correctly, for a long time AMD's primitive stage, essentially the first line in the GPU pipeline, could only handle 4 while NVIDIA could handle 10-12, making it less of a bottleneck area. It's why AMD has had stupidly powerful GPUs on paper that never translated into real performance, and it caused quite a lot of driver overhead for them; granted, I think this issue has been solved with RDNA.
Mantle is AMD's API for low-level hardware access, and it's what Vulkan/DX12 are based on, in part or entirely. This put NVIDIA on the back foot while AMD was already ready.
You then had (not sure about currently) AMD using a hardware scheduler, which was VERY good when utilized, while NVIDIA used a software scheduler, which was flexible. This let NVIDIA put out 'game ready drivers' with ease, but it also meant NVIDIA used more CPU power as a result. Given that games were single- or dual-core for so damn long, this was never an issue; however, now that games are actually utilizing more threads, it is a bit of an issue.
R7 3700X
32 GB 3600 MHz CL16
RTX 3080 12GB
3440x1440 144 Hz display
The CPU can be a limiting factor, but I'm rarely under-utilising the GPU. I am considering the 5800X3D (as prices fall), but only to reduce the impact in VR, which is more CPU-bound (extra layers of software).
Obviously if I was pushing a lower resolution then my system would be very unbalanced.
However, with the trash that is being released as games today, I'm rapidly losing interest in the platform. Go back a few years and I would have bought that 5800X3D by now. Instead, I'm spending my money on better things.
If the CPU can't keep up preparing frames for the GPU to draw, raise GPU settings or resolution to give the GPU more work, or get a faster CPU that can handle a higher draw rate.
Most games are limited by CPU single-core performance; a CPU that isn't at 100% overall can still be the limit if only a few cores are maxed out (rough illustration below).
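As a rough illustration of that point, here is a small sketch (assuming Python with psutil installed; the 90%/60% thresholds are arbitrary example values) that compares the average utilization across all cores with the busiest single core:
```python
# Sketch only: total CPU utilization can look modest while one core -- often the
# game's main/render thread -- is pegged, which is the "CPU not at 100% but still
# CPU-limited" case described above.
import psutil

def check_cpu_limit(sample_seconds: float = 5.0) -> None:
    per_core = psutil.cpu_percent(interval=sample_seconds, percpu=True)
    total = sum(per_core) / len(per_core)
    hottest = max(per_core)
    print(f"Average across all cores: {total:.0f}%")
    print(f"Busiest single core:      {hottest:.0f}%")
    if hottest > 90 and total < 60:   # example thresholds, not a hard rule
        print("Likely single-thread CPU limit even though overall usage looks low.")

if __name__ == "__main__":
    check_cpu_limit()
```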
Why does that matter if you want to show CPU utilisation, which is best shown by removing the GPU bottleneck?
Because it's purely academic; I'm more interested in how the game is actually going to perform.
Let me put it this way: the difference between 60 and 75 FPS is a massive jump, the same again at 75-90 and 90-120, then a lesser but still notable jump from 120 to 144. The difference between 144 and 165/185 is negligible, same with 240. In all honesty, once you hit 144, you gain more from getting a monitor that ACTUALLY shifts the pixels at any colour or brightness in under 7 ms than you do by chasing a higher refresh number (rough frame-time numbers below).
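To put rough numbers on that, frame time is 1000 / FPS in milliseconds, so each step up the refresh ladder saves fewer milliseconds than the one before. A quick illustrative calculation:
```python
# Frame time in ms is 1000 / FPS, so the savings shrink as refresh rate climbs.
for low, high in [(60, 75), (75, 90), (90, 120), (120, 144), (144, 165), (144, 240)]:
    saved = 1000 / low - 1000 / high
    print(f"{low:>3} -> {high:>3} FPS: {saved:.1f} ms less per frame")
# 60 -> 75 saves ~3.3 ms per frame, while 144 -> 165 saves only ~0.9 ms,
# which is why the jump from 144 to 165/185 feels negligible next to 60 to 75.
```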
My interest in this pretty much ends after 120/144 FPS, and the fact is that people with $2,000 tied up in a CPU and GPU alone are not going to be using a 1080p monitor. This paints a picture that makes the older CPU look worthless compared to the newer one, rather than showing anyone whether the extra performance is even worth it.
NOW, that said, the number one bottleneck you want to avoid is a CPU one, because that is where inconsistent frame times jump in. That said, I'm not sure if modern CPUs are as affected by this, given that on older CPUs a CPU bottleneck meant total system CPU at 100% rather than just a few cores at 100%.