The Last of Us™ Part I

Game is dropping frames despite not maxing out any of my hardware
Hey all, as the title suggests I'm playing with the hardware monitoring options turned on so I can see my CPU usage, GPU usage, VRAM usage and FPS counter. Usually I'm sitting at 70-80% CPU usage, 50-60% GPU usage, and around 75% VRAM usage, and while the FPS counter CLAIMS the game is running at 58-60 fps, it feels super jittery rather than smooth whenever I turn the camera, which makes it hard to aim and also just ruins the immersion (see the frame-time sketch below).

Any tips? I know the game is poorly optimised and there are lots of issues but there might be something obvious that I'm missing? Thanks for any help you can offer!

My hardware: i7 9700KF, RTX 4080 w/16 GB VRAM, 32 GB of RAM, playing on an SSD
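
For anyone wondering how a 58-60 fps reading can still feel jittery: an on-screen FPS counter is an average, and an average hides uneven frame pacing. Here's a minimal Python sketch with made-up frame times (not measured from the game) showing two captures with nearly the same average fps but very different 1% lows:

```python
# Two made-up frame-time traces (milliseconds per frame), same ~60 fps average.
smooth = [16.7] * 100                      # every frame takes ~16.7 ms
jittery = [12.0] * 90 + [60.0] * 10        # mostly fast frames plus 60 ms hitches

def report(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    low_1pct_fps = 1000 / (sum(worst) / len(worst))
    print(f"avg {avg_fps:5.1f} fps | 1% low {low_1pct_fps:5.1f} fps")

report(smooth)    # avg  59.9 fps | 1% low  59.9 fps
report(jittery)   # avg  59.5 fps | 1% low  16.7 fps  <- the "jitter"
```

The counter reports the first number; turning the camera exposes the second.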
Showing 31-45 of 74 comments
Jeff4u 3 Apr 2023 @ 6:31am 
It's the CPU side of the game that isn't optimized. In addition, when you enter a new area you'll find the CPU gets stressed by the VRAM/asset streaming, which also makes your frame rate drop. It recovers 10-20 seconds after the game has loaded the area, and the frame rate stabilizes a bit (in my case it can go from 40 back to 60).

The game uses about 100 GB of installed data, and I'm not sure why it needs so much CPU power to decompress that data during loading.
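
A rough illustration of why loading/streaming leans on the CPU: asset decompression is pure CPU work. The sketch below uses Python's zlib on synthetic data purely as a ballpark (the game uses its own codecs, so the exact numbers don't carry over):

```python
import time
import zlib

# Synthetic, highly compressible data (~6.8 MB raw).
raw = (b"the last of us asset data " * 64) * 4096
blob = zlib.compress(raw, 6)

start = time.perf_counter()
for _ in range(20):            # decompress ~136 MB of raw output in total
    zlib.decompress(blob)
elapsed = time.perf_counter() - start

mb = 20 * len(raw) / 1e6
print(f"decompressed {mb:.0f} MB in {elapsed:.2f} s ({mb / elapsed:.0f} MB/s on one core)")
# Scaling that up to a multi-GB area swap keeps one or more cores busy for a
# while, which lines up with the 10-20 second dip after entering a new area.
```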
Last edited by Jeff4u; 3 Apr 2023 @ 6:42am
CJM 3 Apr 2023 @ 6:32am 
Originally posted by KingGorillaKong:
There's a memory leak, and it will impact performance over time. There are likely a few, and they're not all triggering equally across all players.

Definitely possible.

Originally posted by KingGorillaKong:
And the Hotel and Bill's Town are both rather CPU-intensive scenes in the game. I wouldn't expect your CPU to get anywhere near the performance of mine (I have the desktop version of your CPU), but I never had these issues.
I'm having more GPU issues than CPU issues in those areas. I'm not claiming it's a super GPU... I keep the CPU Frame Rate and GPU Frame Rate counters active, so when it hits 30 FPS I can see what to tune. That's why I wanted to lower my settings.

Lowering graphics settings just seems to exacerbate the issue. Like VRAM isn't getting flushed.

Could be asset streaming too. I wonder if asset streaming is saturating the GPU pipeline and causing the GPU frame rate to drop?

Originally posted by KingGorillaKong:
The 1% lows it caused for me were microstutter. Again, having extra VRAM, system memory and just a more capable CPU goes a long way here.
VRAM microstutter... that's not what I'm experiencing. This isn't stutter; this is the frame rate averaging 30 FPS essentially the whole time.

A memory leak is a suspect. I don't remember ever seeing the game max out 32 GB, but maybe it has an internal DRAM cap? (A simple way to watch for a leak is sketched at the end of this post.)

Originally posted by KingGorillaKong:
But you may need to ramp up the page file on that system of yours to help compensate for these memory leaks and the overall larger file and asset sizes.
Do you need glasses? I've got 32GB of RAM. I don't need a bigger page file, I just need to exit to the desktop and relaunch the game after each chapter.
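
On the memory-leak suspicion: a minimal way to check it is to log the game process's working set over time and see whether it keeps climbing across a chapter. This sketch assumes the third-party psutil package and a guessed executable name (check Task Manager for the real one):

```python
import time
import psutil  # third-party: pip install psutil

GAME_EXE = "tlou-i.exe"   # assumed process name, purely illustrative

game = next(
    (p for p in psutil.process_iter(["name"])
     if (p.info["name"] or "").lower() == GAME_EXE),
    None,
)
if game is None:
    print("game process not found")
else:
    # Sample once a minute; RAM usage that only ever climbs and never settles
    # back down is consistent with a leak.
    while game.is_running():
        rss_gb = game.memory_info().rss / 1024**3
        print(f"{time.strftime('%H:%M:%S')}  working set {rss_gb:.2f} GiB")
        time.sleep(60)
```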
kgkong 3 Apr 2023 @ 6:42am 
Since lowering the actual graphics settings makes the issue worse, that shows the GPU isn't actually the limiting factor. Even if VRAM is a limiting factor, memory spillover is managed by the CPU once the GPU hands it off to system memory.
Also, the game is just CPU-heavy to begin with. Lowering graphics settings raises the frame rate, which actually puts more load on the CPU, which is why the issue gets worse at lower graphical settings. You want to take as much of the workload off the CPU as possible. Max out whatever graphics settings your GPU can handle within spec, then turn down the settings that specifically hit the CPU. Higher resolutions can help a lot with CPU performance, because they force the GPU to work harder and produce fewer frames, leaving more headroom for the CPU to handle CPU-bound work.
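
The trade-off described above can be shown with a toy model: each frame costs some fixed CPU time (simulation, draw-call submission, streaming) and some GPU time that scales with settings and resolution, and the slower of the two caps the frame rate. The millisecond numbers below are invented for illustration:

```python
# Toy frame-pacing model: the slower of CPU work and GPU work sets the frame rate.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

CPU_MS = 16.0  # per-frame CPU cost, roughly unchanged by graphics settings

print(fps(CPU_MS, gpu_ms=25.0))  # high settings / high res: GPU-bound, ~40 fps
print(fps(CPU_MS, gpu_ms=10.0))  # low settings: CPU-bound at ~62 fps, GPU sits half idle
# Dropping settings further cannot push past ~62 fps here; raising resolution
# moves the bottleneck back to the GPU and gives the CPU per-frame headroom.
```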
CJM 3 Apr 2023 @ 6:48am 
Originally posted by KingGorillaKong:
Since lowering the actual graphics settings makes the issue worse, that shows the GPU isn't actually the limiting factor.
Makes sense. The first troubleshooting indicator pointed at the GPU, since the CPU wasn't saturated.

Originally posted by KingGorillaKong:
Even if VRAM is a limiting factor, memory spillover is managed by the CPU once the GPU hands it off to system memory.
Also, the game is just CPU-heavy to begin with. Lowering graphics settings raises the frame rate, which actually puts more load on the CPU, which is why the issue gets worse at lower graphical settings. You want to take as much of the workload off the CPU as possible.
Agreed. The CPU just wasn't at 100%, so I'm investigating other possible causes. The OP had the same issue, so I'm trying to compare notes.

Originally posted by KingGorillaKong:
Max out whatever graphics settings your GPU can handle within spec, then turn down the settings that specifically hit the CPU. Higher resolutions can help a lot with CPU performance, because they force the GPU to work harder and produce fewer frames, leaving more headroom for the CPU to handle CPU-bound work.

Basic The Last of Us Optimization. I'm past that.

The OP and I are both on PCI-Express 3.0 bus speeds. I'm presuming my RX 6600 discrete GPU is limited to a PCI-Express 3.0 x8 link.

Desktop-class PCI-Express 3.0 x16 links, and PCI-Express 4.0 x8, may not typically be seeing this issue.

The OP has a 4080 on a CPU with a PCI-Express 3.0 x16 link, which could mean Ultra assets saturating that bus... ???
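
For scale, a rough back-of-envelope on what each link can move (theoretical peaks after encoding overhead; real-world throughput is noticeably lower):

```python
# Approximate PCIe throughput per lane after 128b/130b encoding overhead.
GBPS_PER_LANE = {3: 0.985, 4: 1.969}   # GB/s

def link_gbps(gen, lanes):
    return GBPS_PER_LANE[gen] * lanes

# Suppose an area transition has to push ~4 GB of assets to the GPU
# (made-up figure, just for comparing link widths).
for gen, lanes in [(3, 8), (3, 16), (4, 8), (4, 16)]:
    gbps = link_gbps(gen, lanes)
    print(f"PCIe {gen}.0 x{lanes}: ~{gbps:4.1f} GB/s -> ~{4 / gbps:.2f} s for 4 GB")
```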
kgkong 3 Apr 2023 @ 6:55am 
It's possible but the performance drop to PCIe gen 3 x16 shouldn't be that brutal on a 40 series GPU.
PCIe gen 3 x8 is also a potential issue here; the CPU may not be able to load everything up to the GPU in a timely manner with how the game engine is designed. Keep in mind that, as a port, some things will be heavily limited in terms of optimization unless the devs rebuild the entire PC client to change how the game engine pipeline works. So some otherwise reasonable optimization efforts won't be easy to introduce to fix the performance issues at lower game settings on these particular hardware configurations. All the work a PC does preparing everything the GPU needs takes time, whereas the PS5 just doesn't have those latency bottlenecks because of the architectural design of the console hardware as a whole. Without shrinking the actual asset and file sizes, I'm not sure what can really be done to reduce the impact of these particular bottlenecks.

Realistically, it's suboptimal to put a PCIe Gen 4 device in a PC that ends up running it at a lower PCIe version. That was a pretty common situation with a lot of 30-series GPUs when people first upgraded, and until quite recently it wasn't showing any real performance difference because of the way game workloads were. So from a technical standpoint, if these are the challenges you have, you'll either have to figure out which settings to sacrifice in favor of performance, or eventually upgrade your parts or build a new PC that pairs the hardware optimally. Unfortunately, most prebuilt systems don't actually take this into consideration with hardware from around the end of PCIe Gen 3 and the beginning of PCIe Gen 4.
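
If you want to confirm what link the card is actually negotiating, nvidia-smi can report it on GeForce cards (Radeon owners can see the same thing in a tool like GPU-Z). A small sketch, assuming nvidia-smi is on the PATH:

```python
import subprocess

# Standard nvidia-smi --query-gpu properties for the current and maximum PCIe link.
fields = "pcie.link.gen.current,pcie.link.width.current,pcie.link.gen.max,pcie.link.width.max"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
gen_cur, width_cur, gen_max, width_max = [v.strip() for v in out.split(",")]

# Note: many GPUs drop to a lower link gen at idle to save power,
# so run this while the game is actually rendering.
print(f"link now: PCIe gen {gen_cur} x{width_cur} (card max: gen {gen_max} x{width_max})")
```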
Elon's Musk 3 Apr 2023 @ 7:00am 
Originally posted by KingGorillaKong:
It's possible but the performance drop to PCIe gen 3 x16 shouldn't be that brutal on a 40 series GPU.
PCIe gen 3 x8 is also a potential issue here; the CPU may not be able to load everything up to the GPU in a timely manner with how the game engine is designed. Keep in mind that, as a port, some things will be heavily limited in terms of optimization unless the devs rebuild the entire PC client to change how the game engine pipeline works. So some otherwise reasonable optimization efforts won't be easy to introduce to fix the performance issues at lower game settings on these particular hardware configurations. All the work a PC does preparing everything the GPU needs takes time, whereas the PS5 just doesn't have those latency bottlenecks because of the architectural design of the console hardware as a whole. Without shrinking the actual asset and file sizes, I'm not sure what can really be done to reduce the impact of these particular bottlenecks.

Realistically, it's suboptimal to put a PCIe Gen 4 device in a PC that ends up running it at a lower PCIe version. That was a pretty common situation with a lot of 30-series GPUs when people first upgraded, and until quite recently it wasn't showing any real performance difference because of the way game workloads were. So from a technical standpoint, if these are the challenges you have, you'll either have to figure out which settings to sacrifice in favor of performance, or eventually upgrade your parts or build a new PC that pairs the hardware optimally. Unfortunately, most prebuilt systems don't actually take this into consideration with hardware from around the end of PCIe Gen 3 and the beginning of PCIe Gen 4.
♥♥♥♥♥. The issue is not PC hardware. This port is hot garbage.
My fps is fine at 1440p @ ultra, but it pushes my 5950X and 3090 way too hard for the level of visual quality. The game lacks a crisp look, and on top of that the way it handles textures is really bad; I notice some odd texture flickering going on. It's on a 980 Pro NVMe and I have 64 GB of RAM.
On top of that, the way the camera moves with the mouse makes the game feel laggier than it actually is. Really strange. Glad I didn't pay for it ♥♥♥♥♥
CJM 3 Apr 2023 @ 7:05am 
Originally posted by KingGorillaKong:
It's possible but the performance drop to PCIe gen 3 x16 shouldn't be that brutal on a 40 series GPU.
I was thinking the same thing about PCIe gen 3 x8 for the RX 6600 line, given that all of them are only x8.

Originally posted by KingGorillaKong:
PCIe gen 3 x8 is also a potential issue here; the CPU may not be able to load everything up to the GPU in a timely manner with how the game engine is designed. ...whereas the PS5 just doesn't have those latency bottlenecks because of the architectural design of the console hardware as a whole.
Yeah, that is where my gut is sitting at the moment.
kgkong 3 Apr 2023 @ 7:10am 
Originally posted by Mr. Pink:
Originally posted by KingGorillaKong:
It's possible but the performance drop to PCIe gen 3 x16 shouldn't be that brutal on a 40 series GPU.
PCIe gen 3 x8 is also a potential issue here; the CPU may not be able to load everything up to the GPU in a timely manner with how the game engine is designed. Keep in mind that, as a port, some things will be heavily limited in terms of optimization unless the devs rebuild the entire PC client to change how the game engine pipeline works. So some otherwise reasonable optimization efforts won't be easy to introduce to fix the performance issues at lower game settings on these particular hardware configurations. All the work a PC does preparing everything the GPU needs takes time, whereas the PS5 just doesn't have those latency bottlenecks because of the architectural design of the console hardware as a whole. Without shrinking the actual asset and file sizes, I'm not sure what can really be done to reduce the impact of these particular bottlenecks.

Realistically, it's suboptimal to put a PCIe Gen 4 device in a PC that ends up running it at a lower PCIe version. That was a pretty common situation with a lot of 30-series GPUs when people first upgraded, and until quite recently it wasn't showing any real performance difference because of the way game workloads were. So from a technical standpoint, if these are the challenges you have, you'll either have to figure out which settings to sacrifice in favor of performance, or eventually upgrade your parts or build a new PC that pairs the hardware optimally. Unfortunately, most prebuilt systems don't actually take this into consideration with hardware from around the end of PCIe Gen 3 and the beginning of PCIe Gen 4.
♥♥♥♥♥. The issue is not PC hardware. This port is hot garbage.
My fps is fine at 1440p @ ultra, but it pushes my 5950X and 3090 way too hard for the level of visual quality. The game lacks a crisp look, and on top of that the way it handles textures is really bad; I notice some odd texture flickering going on. It's on a 980 Pro NVMe and I have 64 GB of RAM.
On top of that, the way the camera moves with the mouse makes the game feel laggier than it actually is. Really strange. Glad I didn't pay for it ♥♥♥♥♥
Did you update the firmware on your 980 Pro? There's a known issue where these Samsung SSDs will slowly kill themselves, degrading their health and performance over time, due to the faulty firmware they shipped with.
You technically have better hardware than me, yet you're not getting as good performance as me (I'm playing at 4K myself with really good frame rates and frame times: 5800X, 48 GB DDR4-3200 RAM, 12 GB 3060, WD Blue NVMe SSD). It sounds like something is causing a significant slowdown on your end, and it might actually be the SSD.

I'm also not denying that the game could use more optimization to improve performance for many users. It's just that, in your specific situation, you might have the firmware bug with your SSD, or you might have the NVIDIA driver bug that is messing with how the GPU renders the game (which can get bad enough to cause random crashes).
Dr. Peppermill 3 Apr 2023 @ 11:09am 
Originally posted by KingGorillaKong:
Well... The game engine is designed around multi-threaded processors. A lot of people trying to play TLOU still have single threaded CPUs.
WTF are you talking about? There's no such thing as a “single threaded CPU” manufactured in the last 20 years. Are you suggesting people play this game on Pentium CPUs? 🤡
kgkong 3 Apr 2023 @ 11:15am 
Originally posted by Dr. Peppermill:
Originally posted by KingGorillaKong:
Well... The game engine is designed around multi-threaded processors. A lot of people trying to play TLOU still have single threaded CPUs.
WTF are you talking about? There's no such thing as a “single threaded CPU” manufactured in the last 20 years. Are you suggesting people play this game on Pentium CPUs? 🤡
If you have an 8 core CPU and it has 8 threads, it's single thread per core. No Hyper-Threading or multithreading.
There are a lot of people still using older Intel CPUs like that, with a single thread per core, and those CPUs have a dramatic enough drop-off in performance compared to multithreaded CPUs, meaning CPUs with 2 threads per core.
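
A quick way to check which camp a CPU is in (one thread per core vs. two) is to compare logical processors against physical cores; the sketch below assumes the third-party psutil package:

```python
import os
import psutil  # third-party: pip install psutil

logical = os.cpu_count()                    # hardware threads the OS sees
physical = psutil.cpu_count(logical=False)  # physical cores

# An i7-9700KF reports 8/8 (no Hyper-Threading); a 5800X reports 8/16 (SMT).
if logical == physical:
    print(f"{physical} cores / {logical} threads: one thread per core")
else:
    print(f"{physical} cores / {logical} threads: SMT/Hyper-Threading enabled")
```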
Dr. Peppermill 3 Apr 2023 @ 11:25am 
Originally posted by Mr. Pink:
My fps is fine at 1440p @ ultra, but it pushes my 5950X and 3090 way too hard for the level of visual quality. The game lacks a crisp look, and on top of that the way it handles textures is really bad; I notice some odd texture flickering going on. It's on a 980 Pro NVMe and I have 64 GB of RAM.

There must be something else going on. I have a weaker CPU, a 5900X, and its usage stays under 5%. The thing is, I'm playing at 4K native, which pushes the GPU to its limits, so if there is a bottleneck it's the GPU, not the CPU.
About the “crisp look”: did you remember to disable DOF in the settings?
BTW I haven't seen any texture flickering so far (I'm in the middle of Bill's Town).

I'm not defending the game just because I don't have major issues; I guess it's a crappy port if so many people are complaining.
CJM 3 Apr 2023 @ 11:26am 
Originally posted by KingGorillaKong:
If you have an 8 core CPU and it has 8 threads, it's single thread per core. No Hyper-Threading or multithreading.
Nice recovery.

You can understand the confusion we both had. "single threaded CPU" sounds like a Pentium III.
kgkong 3 Apr 2023 @ 11:27am 
Originally posted by CJM:
Originally posted by KingGorillaKong:
If you have an 8 core CPU and it has 8 threads, it's single thread per core. No Hyper-Threading or multithreading.
Nice recovery.

You can understand the confusion we both had. "single threaded CPU" sounds like a Pentium III.
My bad, didn't have that archaic of hardware in mind while typing out that comment. XD Gosh darn those Pentiums though!
PRAET0R1AN™ 3 Apr 2023 @ 11:39am 
To those saying it's his CPU: it's not. His CPU meets the performance spec listed on the infographic image that Naughty Dog posted.

They have confirmed potential memory leaks; the game is running on a DX12 wrapper rather than natively on DX12 (it's effectively emulated DX12) and has huge CPU overhead.

Hotfix coming tomorrow with a larger patch dropping afterwards.

This is something the OP cannot fix himself no matter what graphics options he changes. It can only be fixed via patching to make the game run natively on DX12, which means rebuilding the entire render pipeline, along with better memory management.
Dr. Peppermill 3 Apr 2023 @ 11:45am 
Originally posted by KingGorillaKong:
If you have an 8 core CPU and it has 8 threads, it's single thread per core. No Hyper-Threading or multithreading.
There are a lot of people still using older Intel CPUs like that, with a single thread per core, and those CPUs have a dramatic enough drop-off in performance compared to multithreaded CPUs, meaning CPUs with 2 threads per core.

Stop making a fool of yourself, this is pathetic. 8 threads from 8 cores is much better than 8 threads from 4 cores (your understanding of multi-threading).

A few posts up you'll see a guy with a 16-core / 32-thread CPU (a 5950X). And he is complaining... Do you think the PS4 and PS5 have better CPUs?

Posted: 2 Apr 2023 @ 9:41pm
Posts: 74