The Last of Us™ Part II Remastered

Way too CPU-heavy core-wise, not overall-utilization-wise
This game hogs all of my cores, and my overlay reports over 88% total CPU usage across all cores.
I have a 13400F and an RX 7600.
This is by far the most CPU-demanding game I have ever seen.
Utilization-wise it's better than many games, but core-count-wise this crap even manages to use my E-cores.
Praesi Apr 3 @ 3:14pm 
My 13900K goes up to 40%... which is unusually high. It uses almost all cores, too.
Intel is just inefficient. It's bouncing the load from the P-cores to the E-cores and all over the place. Get an AMD CPU. You only need 8 cores in one CCD to run this at native 1440p, maybe a lil more for 4K 60fps, I dunno about that. Runs great on my lil moped CPU. lol
Praesi Apr 3 @ 3:35pm 
Originally posted by episoder:
Intel is just inefficient.

Yea...no.
MancSoulja (Banned) Apr 3 @ 3:39pm 
Originally posted by Praesi:
Originally posted by episoder:
Intel is just inefficient.

Yea...no.

Intel is on something like its fifth generation on the same node; there is literally nothing efficient about their CPUs.
Originally posted by episoder:
Intel is just inefficient. It's bouncing the load from the P-cores to the E-cores and all over the place. Get an AMD CPU. You only need 8 cores in one CCD to run this at native 1440p, maybe a lil more for 4K 60fps, I dunno about that. Runs great on my lil moped CPU. lol
Are you aware that you can select which cores you want any process to use??? It's called Task Manager, and you can set the affinity for each process. Also, just saying, AMD has the exact same problem with dual-CCD processors, which makes the Linux gaming experience hell on the 7950X3D. Because believe it or not, AMD's dual-CCD core selection is done through Xbox Game Bar, an app that doesn't exist on Linux.
Edit: I realized I made a slight mistake. I meant to say select which threads, not cores. Selecting cores alone would be imprecise, because some processors have more than one thread per core. So yeah, set the affinity to all the threads you want the process to use.
Last edited by Dexter Morgan; Apr 3 @ 3:42pm
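For anyone who would rather script the affinity change than click through Task Manager every launch, here is a minimal sketch using the third-party psutil library. The executable name and the logical-CPU list are assumptions; check Task Manager for the real process name and your CPU's thread layout before using it.

```python
# Minimal sketch: pin a running game to chosen logical CPUs (threads) on Windows.
# Requires the third-party psutil package (pip install psutil).
import psutil

TARGET = "tlou2.exe"              # hypothetical process name; check Task Manager
P_CORE_THREADS = list(range(12))  # assumption: 6 P-cores x 2 threads enumerate first on a 13400F

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(P_CORE_THREADS)  # equivalent to Task Manager's "Set affinity"
        print(f"Pinned PID {proc.pid} to logical CPUs {P_CORE_THREADS}")
```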
Praesi Apr 3 @ 3:42pm 
Originally posted by Dexter Morgan:
Originally posted by episoder:
Intel is just inefficient. It's bouncing the load from the P-cores to the E-cores and all over the place. Get an AMD CPU. You only need 8 cores in one CCD to run this at native 1440p, maybe a lil more for 4K 60fps, I dunno about that. Runs great on my lil moped CPU. lol
Are you aware that you can select which cores you want any process to use??? It's called Task Manager, and you can set the affinity for each process. Also, just saying, AMD has the exact same problem with dual-CCD processors, which makes the Linux gaming experience hell on the 7950X3D. Because believe it or not, AMD's dual-CCD core selection is done through Xbox Game Bar, an app that doesn't exist on Linux.

I use Process Lasso. Good tool.
Samtaro Apr 3 @ 4:04pm 
Another L for the Intel users
Originally posted by MancSoulja:
Originally posted by Praesi:

Yea...no.

Intel is on something like its fifth generation on the same node; there is literally nothing efficient about their CPUs.
True, but to be fair, they're finally onto "Intel 4", which is a 7nm-class node. IDK how the yields are, but it's progress :)
Originally posted by Dexter Morgan:
This game hogs all of my cores, and my overlay reports over 88% total CPU usage across all cores.
I have a 13400F and an RX 7600.
This is by far the most CPU-demanding game I have ever seen.
Utilization-wise it's better than many games, but core-count-wise this crap even manages to use my E-cores.
It's "crap" because it's using resources available to it? Would you rather see 40% free doing nothing?
Originally posted by sinephase:
Originally posted by Dexter Morgan:
This game hogs all of my cores, and my overlay reports over 88% total CPU usage across all cores.
I have a 13400F and an RX 7600.
This is by far the most CPU-demanding game I have ever seen.
Utilization-wise it's better than many games, but core-count-wise this crap even manages to use my E-cores.
It's "crap" because it's using resources available to it? Would you rather see 40% free doing nothing?
You're not taking 1% lows into consideration. The better your CPU is (the lower your overall thread utilization), the better your 1% lows will be. Because my 13400F averages up to 88% utilization across all threads, I get terrible 1% lows in areas where usage jumps from 15% to 88%. It literally drops 12 fps when that happens, which shows up as stuttering in the frametime graph.
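Side note for anyone unfamiliar with the metric: the "1% low" is roughly the FPS equivalent of the average of your slowest 1% of frames, which is why spikes like that show up there long before they move the average. A minimal sketch of the computation, assuming a frametime capture in milliseconds (e.g. exported from CapFrameX or RTSS):

```python
# Minimal sketch: average FPS vs. 1% low FPS from a frametime capture (ms).
def one_percent_low_fps(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # the slowest 1% of frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                  # ms per frame -> frames per second

# Example: a mostly-60fps capture with a handful of 30fps stutter spikes.
frametimes = [16.7] * 990 + [33.3] * 10
avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
print(f"average: {avg_fps:.1f} fps")                          # ~59.3 fps, barely moved
print(f"1% low:  {one_percent_low_fps(frametimes):.1f} fps")  # ~30 fps, the stutter
```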
Originally posted by Dexter Morgan:
Originally posted by sinephase:
It's "crap" because it's using resources available to it? Would you rather see 40% free doing nothing?
You're not taking 1% lows into consideration. The better your CPU is (the lower your overall thread utilization), the better your 1% lows will be. Because my 13400F averages up to 88% utilization across all threads, I get terrible 1% lows in areas where usage jumps from 15% to 88%. It literally drops 12 fps when that happens, which shows up as stuttering in the frametime graph.

The devs need to patch it then.
Jarrod Apr 3 @ 4:13pm 
Holy... Do people still not understand that games using more cores is a GOOD thing...?

For years we dealt with single-threaded video games limited by single-core performance. Now we have games actually utilizing entire CPUs, and people... complain?? Huh??

I'm willing to bet you're talking about shader compilation as well, 'cause that's when people love to complain about core usage and temps lmao
Originally posted by Dexter Morgan:
Originally posted by sinephase:
It's "crap" because it's using resources available to it? Would you rather see 40% free doing nothing?
You're not taking 1% lows into consideration. The better your CPU is (the lower your overall thread utilization), the better your 1% lows will be. Because my 13400F averages up to 88% utilization across all threads, I get terrible 1% lows in areas where usage jumps from 15% to 88%. It literally drops 12 fps when that happens, which shows up as stuttering in the frametime graph.
I see about that much usage on my CPU and don't get these huge spikes or low FPS. I'm assuming it's asset loading, not just usage percentage. New games take better hardware to load all those gigs of textures smoothly.
Originally posted by Grubbs008:
Originally posted by Dexter Morgan:
You're not taking 1% lows into consideration. The better your CPU is (the lower your overall thread utilization), the better your 1% lows will be. Because my 13400F averages up to 88% utilization across all threads, I get terrible 1% lows in areas where usage jumps from 15% to 88%. It literally drops 12 fps when that happens, which shows up as stuttering in the frametime graph.

The devs need to patch it then.
You can't just patch this. CPU optimization is done by overlapping work and running CPU processes in parallel, and that takes between six months and a year for a studio 800 people strong, because you have to account for everything in the rendering pipeline. So we'll see small increments here and there, but it won't get significantly better for another 4-8 months.
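To make "overlapping work and running CPU processes in parallel" concrete: instead of updating each subsystem one after another on a single thread, an engine fans independent per-frame jobs out across cores and only presents the frame once they all finish. A toy sketch of the idea, not how this game's engine actually does it; real engines use fine-grained native job graphs, and the subsystem names here are placeholders:

```python
# Toy sketch of a job-system-style frame update: independent subsystem work
# fanned out across cores instead of run serially on one thread.
from concurrent.futures import ThreadPoolExecutor

def update_animation(dt): pass  # placeholder subsystem updates;
def update_physics(dt): pass    # in a real engine these are native,
def update_audio(dt): pass      # CPU-bound tasks on OS threads
def update_ai(dt): pass

def frame_update(dt):
    jobs = [update_animation, update_physics, update_audio, update_ai]
    with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
        futures = [pool.submit(job, dt) for job in jobs]  # kick all jobs at once
        for f in futures:
            f.result()  # wait for every job to finish before presenting the frame

frame_update(1 / 60)
```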
Originally posted by sinephase:
Originally posted by Dexter Morgan:
You're not taking 1% lows into consideration. The better your CPU is (the lower your overall thread utilization), the better your 1% lows will be. Because my 13400F averages up to 88% utilization across all threads, I get terrible 1% lows in areas where usage jumps from 15% to 88%. It literally drops 12 fps when that happens, which shows up as stuttering in the frametime graph.
I see about that much usage on my CPU and don't get these huge spikes or low FPS. I'm assuming it's asset loading, not just usage percentage. New games take better hardware to load all those gigs of textures smoothly.
I should mention that I skip the CPU decompression phase by using DirectStorage. The game loads data from the drive straight to the GPU and uses the GPU for decompression before placing everything in VRAM for later rendering. This shouldn't make a big difference, since the CPU does a lot more than just data decompression, but it certainly helps a bit with CPU usage.