The Last of Us™ Part I

Polie Mar 30, 2023 @ 9:38pm
Can anyone explain this difference in CPU usage between 1080p and 4K?
My rig is not that strong anymore:

GTX 1080
i7 6700k stock
16 GB RAM DDR4
Win 10 up to date
Latest Nvidia drivers

At 1080p my CPU is always working at 80-100% (most of the time 95-100%) nonstop in any situation (my shaders were built long ago, this is just how it works for me). This makes the game run pretty badly once I reach 95%+ CPU usage. If I manage to keep usage at 80% or less, the game runs at a locked 60 fps with no problem all the time.

I tried running the game at 4K and my CPU usage went down to 50%. Of course I had barely 28 to 30 fps, but my CPU was doing much better than before.

Why is 1080p more demanding on the CPU when my GPU at that resolution is only working at 70-80%?

At 4K my GPU is at 99% all the time, but my CPU is only at 50-60% at any spot in the first level of the game.
CJM Mar 30, 2023 @ 9:45pm 
The higher your frame rate, the more the CPU has to work on things like animations, pathing, AI and whatnot. The graphics settings should show which options affect CPU performance; those are the ones whose cost gets more pronounced at higher frame rates.
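
A rough way to picture it (the numbers below are made up to illustrate the idea, not measured from this game):

# Hypothetical sketch: the CPU does roughly the same work per frame at any
# resolution (AI, pathing, animation, draw-call submission), so its total
# load scales with how many frames per second actually get rendered.
CPU_MS_PER_FRAME = 15.0  # assumed per-frame CPU cost in milliseconds

def cpu_busy_fraction(fps, cpu_ms_per_frame=CPU_MS_PER_FRAME):
    # Fraction of each second the CPU spends on game logic at a given fps.
    return min(fps * cpu_ms_per_frame / 1000.0, 1.0)

print(cpu_busy_fraction(60))  # 0.9  -> ~90% busy at 60 fps (the 1080p case)
print(cpu_busy_fraction(30))  # 0.45 -> ~45% busy at 30 fps (the 4K case)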

You've only got a quad core, so you are heavily CPU-bottlenecked in a lot of areas of this game.

Let me know if you can run 4k at Bill's Town. After I left the church I barely got 20 FPS, and had to drop down to 480p via FSR just to get playable frame rates.
RefuseRefuse Mar 30, 2023 @ 9:51pm 
It's called bottlenecking. At 1080p you're CPU-bound, because your CPU only has four cores that aren't particularly powerful by today's standards, so it can't keep up with your graphics card.
Not in modern games, anyway; I'm pretty sure you can still run 5-7 year old games just fine with a quad core. At 4K you're entirely GPU-bound, which is why CPU utilisation drops and GPU usage goes to a hundred.
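
To put the same idea in numbers, whichever side takes longer per frame sets the final frame rate (the frame times below are invented for illustration, not benchmarks of this game):

# CPU frame time barely changes with resolution; GPU frame time grows with it.
def frame_rate(cpu_ms, gpu_ms):
    fps = 1000.0 / max(cpu_ms, gpu_ms)
    limiter = "CPU-bound" if cpu_ms >= gpu_ms else "GPU-bound"
    return round(fps), limiter

print(frame_rate(cpu_ms=16.5, gpu_ms=12.0))  # (61, 'CPU-bound')  roughly the 1080p case
print(frame_rate(cpu_ms=16.5, gpu_ms=34.0))  # (29, 'GPU-bound')  roughly the 4K case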
Last edited by RefuseRefuse; Mar 30, 2023 @ 9:53pm
hawkeye Mar 30, 2023 @ 10:03pm 
CPU usage is almost identical across resolutions, but it varies with fps and with any effects that are CPU-generated.

The CPU generates a list of instructions for the rendering process, and that list is the same at any resolution. That CPU-side work is usually lightweight compared to the main game pipeline.

The most likely reason 4K uses less CPU than 1080p is that the fps is lower. Fewer fps means fewer times per second that the CPU has to produce a frame.

I haven't seen anything in console playthroughs that would make TLOU on PC so CPU-demanding. 90-100% usage on a quad core with 8 CPU threads is unusual in my opinion, especially with a GTX 1080. Another factor is that games at some point become dependent on single-thread speed, which limits overall CPU usage.
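
As a back-of-the-envelope check on what those usage numbers imply (not a profile of this game):

# On a 4-core/8-thread CPU, one fully pegged thread only shows up as ~12.5%
# overall usage, so a game limited purely by single-thread speed rarely reports
# 90-100% total. Sustained 90%+ means work is spread across all eight threads.
logical_threads = 8
one_thread_maxed = 100.0 / logical_threads
print(one_thread_maxed)      # 12.5 -> overall % for one saturated thread
print(7 * one_thread_maxed)  # 87.5 -> overall % with seven of eight threads busy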

To reduce CPU usage, try dropping some graphics effects: anything particle-based like god rays or mist, or AA.
Last edited by hawkeye; Mar 30, 2023 @ 10:06pm
Polie Mar 31, 2023 @ 5:28am 
Originally posted by RefuseRefuse:
It's called bottlenecking. At 1080p you're CPU-bound, because your CPU only has four cores that aren't particularly powerful by today's standards, so it can't keep up with your graphics card.
Not in modern games, anyway; I'm pretty sure you can still run 5-7 year old games just fine with a quad core. At 4K you're entirely GPU-bound, which is why CPU utilisation drops and GPU usage goes to a hundred.
Well... my experience with recent AAA PC games is quite different.

I can run pretty much any game on PC with no problems.

Locked 60 fps at 1080p on a 120 Hz monitor: Horizon Zero Dawn, Cyberpunk 2077, the Spider-Man games, FF7 Remake (120 fps most of the time), RDR2, Returnal, Hogwarts Legacy (of course some stutter in a few zones like the castle, but that's pretty common in this game). CPU usage is never above 40-70%, and I only depend on my GPU to get good framerates.

This is the first game that makes my old gaming CPU go to 100%.
Polie Mar 31, 2023 @ 5:30am 
Originally posted by hawkeye:
CPU usage is almost identical across resolutions, but it varies with fps and with any effects that are CPU-generated.

The CPU generates a list of instructions for the rendering process, and that list is the same at any resolution. That CPU-side work is usually lightweight compared to the main game pipeline.

The most likely reason 4K uses less CPU than 1080p is that the fps is lower. Fewer fps means fewer times per second that the CPU has to produce a frame.

I haven't seen anything in console playthroughs that would make TLOU on PC so CPU-demanding. 90-100% usage on a quad core with 8 CPU threads is unusual in my opinion, especially with a GTX 1080. Another factor is that games at some point become dependent on single-thread speed, which limits overall CPU usage.

To reduce CPU usage, try dropping some graphics effects: anything particle-based like god rays or mist, or AA.
I tried with everything on Low and it's the same CPU usage, no difference at all.

I think they just made this game for 6+ core CPUs. And that's weird, because I can run any other AAA game on PC without problems at 60 fps or more.
CJM Mar 31, 2023 @ 5:38am 
Originally posted by Polie!:
This is the first game that makes my old gaming CPU go to 100%.
Horizon: Zero Dawn was the first game that made my CPU go to 100%. A 3rd Gen Core i7-3770, Quad Core.

I couldn't get past 40 FPS and it baffled me at first. I could scale the resolution up noticeably without losing frame rate, but lowering resolution and graphics barely put a dent in it either. It was as if it were capped at 40 FPS, except that in lighter areas it demonstrably wasn't.

Fortunately, Horizon: Zero Dawn had a benchmark utility which had both "GPU frame rate" results, and "CPU frame rate" results.

The Last of Us Part 1 has separate frame rate counters for CPU and GPU available in the game's settings menu. My 8-Core CPU is largely bottlenecking at around 60 FPS, with some settings pushing it down to 50 FPS.
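
Those two counters make it easy to tell which side is holding you back; read them something like this (the sample numbers are placeholders, not measurements):

# The lower of the two counters is roughly your real frame rate, and the side
# reporting it is the bottleneck.
def read_counters(cpu_fps, gpu_fps):
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    return min(cpu_fps, gpu_fps), limiter

print(read_counters(cpu_fps=58, gpu_fps=90))  # (58, 'CPU') -> CPU-limited
print(read_counters(cpu_fps=95, gpu_fps=29))  # (29, 'GPU') -> GPU-limited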

I watched a Linus Tech Tips video about the issue, which highlighted that the consoles don't have separate VRAM, just one pool of RAM. I suspect the PC port may be trying to "emulate" a console's unified memory, resulting in high demands on CPU throughput.
Tiny Trees Mar 31, 2023 @ 5:49am 
Your CPU is badly out of date for a game like this.

I have an 8700K, which has two more cores than the 6700K, plus higher IPC and faster cores.

The 8700K maxes out all cores in some games for me, like RDR2, Horizon: Zero Dawn, Dead Space, and occasionally AC: Valhalla.

I have not bought The Last of Us. However, I have seen people with far more powerful CPUs than mine sitting at around 60% usage running TLoU.

I have another PC with a 13700KF that I am building right now, and I won't be playing TLoU until I move to that machine. By the looks of it, it will be some time before I buy TLoU; I will wait for patching.
Last edited by Tiny Trees; Mar 31, 2023 @ 5:51am
GlitchQuittersAreSoftasSht Mar 31, 2023
i9-11900K and a 3090 here. This game uses a lot. The GPU is always working because of my ultrawide resolution, and I've seen CPU usage sit in the 90s no problem. I'm still at the beginning, just meeting Tess. Right after the explosion at the FEDRA gate I did get an almost 40% drop, down to 50%. Overall I just see everything working hard and tell myself: people with lesser specs... I can definitely see why they are having issues. Then again, you shouldn't need a 24 GB GPU and an 11th-gen i9 to enjoy this game (unless you want ultrawide resolutions above 1440p).

I usually stay quiet in forums because my resolution is what hampers my fps. If this had been purely a gaming PC, I'd have upgraded to a 4090.

Every PlayStation port released so far has eventually been made playable, and from what I've experienced they do update them often.
Last edited by GlitchQuittersAreSoftasSht; Mar 31, 2023 @ 6:02am
blade5654 Apr 17, 2024 @ 10:05pm 
It probably has less to do with them only having 4 cores, which is plenty for gaming. From what I understand, most games until recently used a small number of heavily loaded threads rather than the combined multicore power of all cores and threads, with dual core being the previous king and quad/hex core the current one; honestly you would want at least 6 cores if your CPU only has single threading (no hyper-threading). It probably has a lot to do with that CPU having severely slow cores and low IPC for modern-day gaming.