The Last of Us™ Part I

Razin_Shah Apr 6, 2023 @ 2:16am
Digital Foundry vs Hardware Unboxed
Hardware Unboxed: "game runs buttery smooth as long as you have enough vram"

Digital Foundry: "the port has serious issues with cpu performance and memory management "

Which side are you on, and more importantly, which of the two is correct in their assessment?
Blazing Storm Apr 7, 2023 @ 6:29am 
Originally posted by MancSoulja:
Originally posted by Blazing Storm:
You are seeing 69% CPU usage in a scene where there is literally nothing going on. There is no AI, no large draw distances, no fancy physics, no ray tracing. It's an empty jungle. You don't see that something is wrong here? Why is it using so much of the CPU? And I can guarantee there are times when the 5800X will bottleneck even the 3080, which is absurd, as the 3080 isn't so fast a GPU that it should be outpacing a 5800X.

You need to go back to school and learn how computers work. A CPU isn't responsible for any of those things, well, maybe Ray Tracing but the game doesn't have it.
It does.

Plague Tale Requiem and Cyberpunk - visit the market areas. It will destroy your CPU because AI is constantly spawning and despawning.

Crysis Remastered - look at the overlook at the beginning of the game. It destroys even modern CPUs to this day because it's heavily dependent on single-threaded performance, which no CPU has enough of.

Physics: Look at Rockstar's RDR2 and GTA 5. CPU usage spikes every time your character falls or does something complex.
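A rough way to picture the AI point above: if every active agent eats a slice of main-thread time each frame, a crowded market area raises per-frame CPU cost (and lowers the FPS ceiling) even when nothing graphically heavy is on screen. A minimal sketch, with invented numbers rather than measurements from any real game:

```python
# Toy model: each active AI agent costs some main-thread time per frame
# (pathfinding, perception, animation state). All numbers are made up
# purely for illustration.
COST_PER_AGENT_MS = 0.05   # hypothetical per-agent update cost
BASE_FRAME_MS = 4.0        # hypothetical cost of everything else on the CPU

def frame_cpu_ms(active_agents: int) -> float:
    """Estimated main-thread CPU time for one frame."""
    return BASE_FRAME_MS + active_agents * COST_PER_AGENT_MS

for agents in (10, 100, 400):   # quiet jungle vs. crowded market district
    ms = frame_cpu_ms(agents)
    print(f"{agents:4d} agents -> {ms:5.1f} ms/frame (~{1000 / ms:.0f} fps CPU cap)")
```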
CJM Apr 7, 2023 @ 6:33am 
Originally posted by Blazing Storm:
Physics: Look at Rockstar's RDR2 and GTA 5. CPU usage spikes every time your character falls or does something complex.
NVIDIA's PhysX might hold up better over time if NVIDIA supports it long term. I'm assuming PhysX is programmed similarly to how SQL is programmed.

I run Space Engineers, and underneath the VRAGE engine is the Havok physics engine. A lot of the physics interactions are single-threaded from what I can tell, which makes the game very stuttery.

However, since physics is an extremely important part of a physics sandbox game, it is just something to be accepted on that game.
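The single-threaded-physics point above is essentially Amdahl's law: if the physics step lives on one thread, extra cores only shrink the rest of the frame, and a spike in that one step reads as a stutter. A toy sketch with made-up timings (not Havok or VRAGE measurements):

```python
# Amdahl-style toy model: single-threaded physics step plus a perfectly
# parallel remainder. Timings are invented for illustration only.
def frame_ms(physics_ms: float, parallel_ms: float, cores: int) -> float:
    return physics_ms + parallel_ms / cores

PHYSICS_MS, PARALLEL_MS = 8.0, 16.0   # hypothetical per-frame split
for cores in (2, 4, 8, 16):
    t = frame_ms(PHYSICS_MS, PARALLEL_MS, cores)
    print(f"{cores:2d} cores -> {t:5.1f} ms/frame ({1000 / t:5.1f} fps)")
# The fps ceiling converges toward 1000 / PHYSICS_MS (125 fps here), no
# matter how many cores you add; a long physics spike appears as a stutter.
```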
Blazing Storm Apr 7, 2023 @ 6:39am 
Originally posted by C1REX-PL:
Originally posted by Blazing Storm:

1. Maybe he is an AMD shill? Who cares? He could also be an NVIDIA shill trying to sell people more expensive models with more VRAM, because nobody cares about AMD anyway. In any case, why should we care if he shills for NVIDIA or AMD?

Why are you in this thread then? This thread is a discussion of which video was more objective. The HUB video was superficial, as I pointed out, so by all objective metrics the DF video was better. DF did not bother with the 8GB nonsense. They got to the heart of the issue, which is the crappy optimisation of the game, while Steve made it a battle between AMD and Nvidia.

2. It's not utilizing 100% of my CPU. I'm GPU bottlenecked at 1440p. DF was also GPU bottlenecked even with a 4090. They had to reduce the resolution to test the CPU bottleneck. However, the 6-core CPU was clearly causing problems.

Neither of the channels played the game long enough. If you get into Pittsburgh, the 4090 runs at around 70% usage across the board on a 12900K.

3. HUB didn't perform game analysis, they made a normal benchmark test for GPU bottlenecking. DF created a completely different video with a different goal in mind. Poorly optimized and difficult-to-run games often make for the best benchmarks.

Digital Foundry's purpose was to understand what was wrong with the port because it had so many issues; benchmarking the game was pointless, as it would only produce outlier results.

Steve should not have gone ahead with the benchmark, because higher-end GPUs are being held back by the awful CPU optimisation and there is terrible stuttering when entering new areas. Lower-end GPUs can barely run the game properly. The game is allocating 5GB of VRAM for Windows on a 4090.

Benchmarking a shoddy game in this manner produces irrelevant results which do not reflect the actual performance of the GPUs, since the limiting factor is the game and not the GPU.

Regarding your answer to my quote: we will see in a year or two who is right. None of us can predict the future, and we simply interpret trends differently.
None of us can predict the future, but we can draw on objective facts to conclude certain things. The vast majority of the PC market uses Nvidia and is on 8GB cards. It's on the developers to optimise for these constraints, and if they do not, their games will be tossed into Steam's hall of shame like this game was.
KillingArts Apr 7, 2023 @ 6:50am 
Originally posted by KingGorillaKong:
Originally posted by KillingArts:

It's not meant to be 100% accurate. But it's roughly on the same level. That's all that counts. The 3700X clocks quite a bit higher, so it's debatable whether the 3600X or the 3700X is more appropriate (also debatable how much the 2 missing cores actually matter vs. clock speed).

The 2070 Super was chosen because it offered comparable performance in many games before. Makes sense, if you ask me. Showing the experience with 8GB VRAM also makes a lot of sense, since this is what most people have. It's also what Naughty Dog recommends, don't forget that. If the game cannot deliver acceptable performance with the recommended specs, then it's clearly a bad port. No doubt about it. It hasn't been properly optimized for the target platform.
So it's safe to err on the side of caution and underpower a rig, without taking into consideration the hardware advantage the PS5 has to begin with, despite the two being so similar?
No, that doesn't make sense. You have to account for the fact that the PS5 has GDDR6 memory. Digital Foundry completely ignores this. You have to account for the fact that the PS5 has a direct SSD-to-GPU streaming layer, so it has less system latency. DF did not consider either of these two factors, and instead of giving the low-end PC an 8-core CPU that more accurately represents the PS5, they gave it a worse 6-core processor that alone can't run the same CPU-side settings as the PS5.
And the GPU is nerfed, because you can't match the PS5's VRAM consumption when the 2070 only has 8GB.

I mean, DF is normally quality content, but this comparison they did is butchered and improperly explained. They explain it like they tried to make the low-end rig a PS5-equivalent PC, but then they use the rig purely as a low-end representation of the game, and they do very little to actually explain this or their testing conditions. That's a no-no move. How does a giant channel like DF so easily overlook factors as significant as these?

You really should stop thinking too much about the specifics of the PS5 architecture. It honestly doesn't matter too much. Because this game is not a PS5 game, it's a PC game. That's the whole point of a port. It's a PC game now. So it should perform accordingly. You guys defend the VRAM hunger by stating the VRAM of the PS5. Instead you should think about what quality of textures other games offer for a certain amount of VRAM. TLOU does not have better textures than other games. Yet on medium (which seemingly already is the maximum you should go for with 8GB) it often looks like a PS3 game (which had 256MB VRAM). Fact is, the game is unnecessarily hungry compared to other PC games. Which means they did not properly optimize the game for PC. Which means it's a bad port.
kgkong Apr 7, 2023 @ 6:51am 
Originally posted by CJM:
Originally posted by KingGorillaKong:
The 5800H isn't a directly comparable CPU to the 5800X though due to the laptop design and limitations. The 5800H also doesn't maintain boost clock for as long nor boost as high.
I've got the HX80G that MinisForum configured as a Mini PC. It has a good Power Supply, and Liquid Metal. It seems to be able to sustain 4.1GHz for an unlimited duration. Extremely quiet too.

Originally posted by KingGorillaKong:
Maybe I should also add that I have a weirdly binned 5800X where my out-of-box max boost is 4.9 to 4.95GHz (depending on how consistent a workload is; if it's consistent I'm constantly reporting 4.9-4.95GHz core clocks, otherwise I'll watch it jump around between 4.8 and 4.9GHz).
Sustained 4.9GHz does increase the likelihood of being GPU bottlenecked.

Originally posted by KingGorillaKong:
@CJM
No frame cap. I'm on 4k

You'd definitely notice a 100% CPU usage bottleneck at 720p Low, with the significantly higher frame rate potential of the GPU. You'd probably notice it at pretty much anything below 4K.
I haven't tried 1080p with TLOU to be fair, but I imagine I'd definitely be running into some kind of issue there. 720p easily.
Just for giggles I'm gonna go and take a peek at these performance levels at 720p and 1080p respectively. Obviously I'll turn DLSS off at 720p and 1080p. But at 1440p and 4K I have DLSS Quality on.

But the main reason I play at 4K, high preset with geometry on ultra and all the high preset's half-resolution options changed to full resolution, is that it gives me the most stable frametime experience without sacrificing visual fidelity. The mouse stutter is really the only thing that causes me any stutter whatsoever when I play. Even loading into new scenes, the performance doesn't drop, unless I go from a low-demand scene (50-60fps) to a high-demand scene (40-50fps). Even after all the background loading is done, performance doesn't climb back up for me, since it never decreased in the first place; I'm not actually losing real-time gameplay performance on my CPU or GPU to whatever it has to finish loading during this process.
Last edited by kgkong; Apr 7, 2023 @ 6:53am
Blazing Storm Apr 7, 2023 @ 6:55am 
Originally posted by KillingArts:
Originally posted by KingGorillaKong:
So it's safe to err on the side of caution and underpower a rig, without taking into consideration the hardware advantage the PS5 has to begin with, despite the two being so similar?
No, that doesn't make sense. You have to account for the fact that the PS5 has GDDR6 memory. Digital Foundry completely ignores this. You have to account for the fact that the PS5 has a direct SSD-to-GPU streaming layer, so it has less system latency. DF did not consider either of these two factors, and instead of giving the low-end PC an 8-core CPU that more accurately represents the PS5, they gave it a worse 6-core processor that alone can't run the same CPU-side settings as the PS5.
And the GPU is nerfed, because you can't match the PS5's VRAM consumption when the 2070 only has 8GB.

I mean, DF is normally quality content, but this comparison they did is butchered and improperly explained. They explain it like they tried to make the low-end rig a PS5-equivalent PC, but then they use the rig purely as a low-end representation of the game, and they do very little to actually explain this or their testing conditions. That's a no-no move. How does a giant channel like DF so easily overlook factors as significant as these?

You really should stop thinking too much about the specifics of the PS5 architecture. It honestly doesn't matter too much. Because this game is not a PS5 game, it's a PC game. That's the whole point of a port. It's a PC game now. So it should perform accordingly. You guys defend the VRAM hunger by stating the VRAM of the PS5. Instead you should think about what quality of textures other games offer for a certain amount of VRAM. TLOU does not have better textures than other games. Yet on medium (which seemingly already is the maximum you should go for with 8GB) it often looks like a PS3 game (which had 256MB VRAM). Fact is, the game is unnecessarily hungry compared to other PC games. Which means they did not properly optimize the game for PC. Which means it's a bad port.
Exactly what I was trying to point out to C1Rex.

It is irrelevant that the game was developed with the PS5 as a baseline. When you are going to port it to PC, it needs to meet the standards and requirements of the PC market, the majority of which is on 8GB cards. The PC market is not going to sell off their PCs and buy higher-end GPUs to make the lives of the devs easier. What's likely to happen is they will destroy the game's review score and toss it into Steam's hall of shame, with only higher-end GPU owners or AMD users (of which there aren't many) buying the game.

If you are going to develop for PC, do it right or don't port at all.

The medium textures look like a game from the early 2000s, which is unacceptable when you look at beautiful games like Plague Tale Requiem which do run on 8GB cards.
Last edited by Blazing Storm; Apr 7, 2023 @ 6:56am
kgkong Apr 7, 2023 @ 7:05am 
Originally posted by KingGorillaKong:
Originally posted by CJM:
I've got the HX80G that MinisForum configured as a Mini PC. It has a good Power Supply, and Liquid Metal. It seems to be able to sustain 4.1GHz for an unlimited duration. Extremely quiet too.


Sustained 4.9GHz does increase the likelihood of being GPU bottlenecked.



You'd definitely notice a 100% CPU usage bottleneck at 720p Low, with the significantly higher frame rate potential of the GPU. You'd probably notice it at pretty much anything below 4K.
I haven't tried 1080p with TLOU to be fair, but I imagine I'd definitely be running into some kind of issue there. 720p easily.
Just for giggles I'm gonna go and take a peek at these performance levels at 720p and 1080p respectively. Obviously I'll turn DLSS off at 720p and 1080p. But at 1440p and 4K I have DLSS Quality on.

But the main reason I play at 4K, high preset with geometry on ultra and all the high preset's half-resolution options changed to full resolution, is that it gives me the most stable frametime experience without sacrificing visual fidelity. The mouse stutter is really the only thing that causes me any stutter whatsoever when I play. Even loading into new scenes, the performance doesn't drop, unless I go from a low-demand scene (50-60fps) to a high-demand scene (40-50fps). Even after all the background loading is done, performance doesn't climb back up for me, since it never decreased in the first place; I'm not actually losing real-time gameplay performance on my CPU or GPU to whatever it has to finish loading during this process.
I don't hit a CPU bottleneck until I go down to 1080p, and even then it's 50-55% CPU usage, which isn't enough to properly bottleneck the game.
Adjusting all settings to the high preset gives the same performance and the same stutter. Though with geometry turned to high, the CPU bottleneck actually lasts long enough to visibly notice it, with peak CPU usage 60% and lowest GPU usage 90%.

The FPS cap I get without turning graphic settings down to force high FPS is 90, and it averages 78 with 1% lows hitting 25. Why would I play at 1080p with performance that varies that badly because the CPU starts to hit a bottleneck there, when 4K targeting a near-60fps experience gives me fluid frametimes and (buttery) smooth performance?
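For anyone wondering where figures like "averages 78 with 1% lows hitting 25" come from: overlays derive them from a frametime log, with the 1% low usually taken as the average of the slowest 1% of frames (exact methods vary by tool). A small sketch with invented sample data:

```python
# Sketch: average FPS and "1% low" FPS from a frametime capture, using the
# common "average of the slowest 1% of frames" definition. Data is made up.
frametimes_ms = [12.8] * 990 + [40.0] * 10   # hypothetical 1000-frame capture

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

slowest = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps")
# -> average: ~76 fps, 1% low: 25 fps
```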

I can't replicate these 100% CPU usage situations people complain about unless I go into my BIOS and actually reduce the power limits so it won't ever use more than 95W. But I have it unlocked because my motherboard supports the peak total watts this CPU can draw without an overclock.

A lot of these 100% CPU usage issues are likely a combination of game performance problems from lack of optimization and potentially bad motherboard/CPU pairings or poor BIOS tuning for the installed CPU. Very common when a low-priced motherboard is paired with a higher-end CPU.
KillingArts Apr 7, 2023 @ 7:09am 
Originally posted by C1REX-PL:
I watched a Digital Foundry video about Returnal, and I own the game, so I have my own conclusions about it. I've also seen how other people perceive its performance. The game has received tons of bad reviews and negative posts on forums. Even though my 5700XT should have similar power to the PS5 GPU, I didn't feel like it performed as well as the PS5.

It got way, way more positive reviews than negative ones, which is not something we can say about TLOU, which started out "mostly negative" at just over 30% positive and is still very low (47%). Returnal is at over 80%.

The difference between TLOU and Returnal is that the latter can scale really well and still deliver nice textures even for cards with significantly less VRAM than the PS5.

Originally posted by C1REX-PL:
On the other hand, Digital Foundry claimed that Elden Ring is unplayable on PC, but it runs perfectly fine for me.

Maybe after some months of patching, sure. At release it was quite a mess. For everyone. And it's still lacking essential PC features, to be honest. I wouldn't quite call that initial state unplayable (I mean, I played it after all). And I'm not sure DF did really call it that. But it was pretty bad nonetheless.

Originally posted by C1REX-PL:
I don't defend The Last of Us as a greatly optimized port. I have no idea if any other studio could have done a better job. However, I do defend it as the new normal and a trend, rather than an exception.

Seeing that we have countless other games that offer similar or better visuals with less VRAM and a lot less CPU load, I'm certain we could have gotten a better port.
C1REX Apr 7, 2023 @ 7:17am 
Originally posted by Blazing Storm:
Why are you in this thread then? This thread is a discussion of which video was more objective. The HUB video was superficial, as I pointed out, so by all objective metrics the DF video was better. DF did not bother with the 8GB nonsense. They got to the heart of the issue, which is the crappy optimisation of the game, while Steve made it a battle between AMD and Nvidia.
HUB made a simple GPU test for a specific, arguably broken game, and he did it by removing the CPU bottleneck, as it should be done. He tested which GPUs are able to run this allegedly broken game. Nvidia won the test. Even if he is the biggest shill, the numbers do not lie.

DF played on three systems and compared differences in look and performance. More people were likely looking for that information, which is why more people liked it. What I dislike about the video is that it doesn't show what kind of PC is required to run the game like on the PS5. What's even worse is that the way they conducted the analysis made people believe that the PC version looks worse or is broken.

Originally posted by Blazing Storm:
Neither of the channels played the game long enough. If you get into Pittsburgh, the 4090 runs at around 70% usage across the board on a 12900K.
Interesting fact. Flight Simulator is also heavily CPU bottlenecked even at 4K native when using a 4090. So this game should be a good benchmark for both CPU and GPU.

Originally posted by Blazing Storm:
Digital Foundry's purpose was to understand what was wrong with the port because it had so many issues; benchmarking the game was pointless, as it would only produce outlier results.

Steve should not have gone ahead with the benchmark, because higher-end GPUs are being held back by the awful CPU optimisation and there is terrible stuttering when entering new areas. Lower-end GPUs can barely run the game properly. The game is allocating 5GB of VRAM for Windows on a 4090.

Benchmarking a shoddy game in this manner produces irrelevant results which do not reflect the actual performance of the GPUs, since the limiting factor is the game and not the GPU.
In my opinion, it's quite the opposite. Cyberpunk is likely the most popular GPU benchmark. HUB made a dry benchmark, while DF performed game analysis. These are two completely different videos that do not contradict each other.

Originally posted by Blazing Storm:
None of us can predict the future, but we can draw on objective facts to conclude certain things. The vast majority of the PC market uses Nvidia and is on 8GB cards. It's on the developers to optimise for these constraints, and if they do not, their games will be tossed into Steam's hall of shame like this game was.
I see your point, and maybe you are right. I think, however, we will see a repeat of the PS4 ports era, when almost overnight 4GB wasn't enough for many games.
Last edited by C1REX; Apr 7, 2023 @ 7:24am
Marty McFly Apr 7, 2023 @ 7:25am 
We should be wary of trusting "shills" who are paid to do damage control for buggy and badly optimized games on behalf of companies. These individuals may prioritize protecting the company's reputation over addressing valid criticisms and feedback from the gaming community, which can be harmful to players who rely on accurate information to make informed decisions. It's crucial to seek out multiple sources of information, including independent reviews and feedback from other players, when deciding whether to purchase a game. Companies should focus on creating high-quality games that don't require the use of "shills" to defend their reputation, rather than relying on deceptive marketing tactics.
C1REX Apr 7, 2023 @ 7:27am 
Originally posted by Marty McFly:
We should be wary of trusting "shills" who are paid to do damage control for buggy and badly optimized games on behalf of companies. These individuals may prioritize protecting the company's reputation over addressing valid criticisms and feedback from the gaming community, which can be harmful to players who rely on accurate information to make informed decisions. It's crucial to seek out multiple sources of information, including independent reviews and feedback from other players, when deciding whether to purchase a game. Companies should focus on creating high-quality games that don't require the use of "shills" to defend their reputation, rather than relying on deceptive marketing tactics.
Are you talking about HUB, who shows in numbers that NVIDIA is the best at running this game?
CJM Apr 7, 2023 @ 7:29am 
Originally posted by KingGorillaKong:
with peak CPU usage 60% and lowest GPU usage 90%.

Originally posted by KingGorillaKong:
without turning graphic settings down to force high FPS

That is not the objective of the exercise.

The objective is to intentionally turn graphics settings down to force high FPS. Run at about 30% to 50% GPU usage, and 100% CPU usage. This is not applicable to all areas.

AI is reportedly key to hitting the 100% CPU Usage.
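One way to sanity-check which side is the limit while playing is to log CPU and GPU utilization side by side. A rough sketch, assuming an NVIDIA card (for nvidia-smi) and the psutil package; the thresholds are arbitrary, and a game can still be CPU-limited well below 100% total CPU usage if one thread is saturated:

```python
import subprocess

import psutil  # pip install psutil; nvidia-smi requires an NVIDIA GPU

def gpu_util_percent() -> int:
    """Ask nvidia-smi for the current GPU utilization (NVIDIA only)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

# Sample both sides once a second while the game is running.
for _ in range(10):
    cpu = psutil.cpu_percent(interval=1.0)   # averaged over the interval
    gpu = gpu_util_percent()
    hint = "GPU-bound?" if gpu > 95 else ("CPU-bound?" if cpu > 85 else "")
    print(f"CPU {cpu:5.1f}%   GPU {gpu:3d}%   {hint}")
```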

Originally posted by KingGorillaKong:
Why would I play at 1080p with performance that varies that badly because the CPU starts to hit a bottleneck there, when 4K targeting a near-60fps experience gives me fluid frametimes and (buttery) smooth performance?
You wouldn't.

The question is why others would.

You can achieve 50-60 FPS with 8 cores at 4.9GHz sustained. That is roughly 1.5 extra cores' worth of compute compared to my 8 cores at 4.1GHz sustained, about a 20% increase.

4.1GHz × 8 cores = 32.8GHz of available compute capacity → ~40 FPS.

4.9GHz × 8 cores = 39.2GHz of available compute capacity → ~48 FPS (about 1.2 × 40).

Intel was tossing out quad cores for a decade; we're only now entering the 8-core era.

3.6GHz to 3.8GHz per core is the baseline. "Turbo" is not how one advertises a supported frequency. Requiring a sustained 4.9GHz on an 8-core CPU, roughly 40GHz of aggregate compute, just to reach 60 FPS is very heavy on the system requirements.
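That back-of-the-envelope scaling, worked through under the same naive assumption that FPS tracks aggregate clock (cores × GHz). It ignores IPC, memory, and single-thread limits, so treat it as a rough upper bound rather than a prediction:

```python
# Naive scaling model: FPS proportional to aggregate clock (cores * GHz).
# Ignores IPC and single-thread limits -- illustration only.
def aggregate_ghz(cores: int, ghz: float) -> float:
    return cores * ghz

base = aggregate_ghz(8, 4.1)      # 32.8 "GHz", observed ~40 fps
fast = aggregate_ghz(8, 4.9)      # 39.2 "GHz"
predicted = 40 * fast / base      # ~47.8 fps, i.e. roughly a 20% gain
print(f"{base:.1f} -> {fast:.1f} aggregate GHz, predicted ~{predicted:.0f} fps")
```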

Originally posted by KingGorillaKong:
I can't replicate these 100% CPU usage
...
kgkong Apr 7, 2023 @ 7:34am 
Originally posted by CJM:
Originally posted by KingGorillaKong:
with peak CPU usage 60% and lowest GPU usage 90%.

Originally posted by KingGorillaKong:
without turning graphic settings down to force high FPS

That is not the objective of the exercise.

The objective is to intentionally turn graphics settings down to force high FPS. Run at about 30% to 50% GPU usage, and 100% CPU usage. This is not applicable to all areas.

AI is reportedly key to hitting the 100% CPU Usage.

Originally posted by KingGorillaKong:
Why would I play at 1080p with performance that varies that badly because the CPU starts to hit a bottleneck there, when 4K targeting a near-60fps experience gives me fluid frametimes and (buttery) smooth performance?
You wouldn't.

The question is why others would.

You can achieve 50-60 FPS with 8 cores at 4.9GHz sustained. That is roughly 1.5 extra cores' worth of compute compared to my 8 cores at 4.1GHz sustained, about a 20% increase.

4.1GHz × 8 cores = 32.8GHz of available compute capacity → ~40 FPS.

4.9GHz × 8 cores = 39.2GHz of available compute capacity → ~48 FPS (about 1.2 × 40).

Intel was tossing out quad cores for a decade; we're only now entering the 8-core era.

3.6GHz to 3.8GHz per core is the baseline. "Turbo" is not how one advertises a supported frequency. Requiring a sustained 4.9GHz on an 8-core CPU, roughly 40GHz of aggregate compute, just to reach 60 FPS is very heavy on the system requirements.

Originally posted by KingGorillaKong:
I can't replicate these 100% CPU usage
...
The nature of my test wasn't to prove that the CPU bottleneck exists at that extreme. I already know that, and there are ample examples (looking at any CPU benchmark does just that for you). Mine was to show that I cannot get the CPU usage issue with playable, optimized settings without setting power limits and other limits on the CPU through the motherboard BIOS.
Last edited by kgkong; Apr 7, 2023 @ 7:35am
CJM Apr 7, 2023 @ 7:48am 
Originally posted by KingGorillaKong:
Mine was to show that I cannot get the CPU usage issue with playable, optimized settings without setting power limits and other limits on the CPU through the motherboard BIOS.
Well, I definitely can, and it made for a rough experience with the game.

It was exacerbated when adjusting settings: not having a "Please Wait" on apply means the change itself causes an additional performance impact.

I bought the game with the expectation that 720p/60/Medium would be an enjoyable experience. It was not worth a pre-order at full price.
kgkong Apr 7, 2023 @ 7:55am 
Originally posted by CJM:
Originally posted by KingGorillaKong:
Mine was to show that I cannot get the CPU usage issue with playable, optimized settings without setting power limits and other limits on the CPU through the motherboard BIOS.
Well, I definitely can, and it made for a rough experience with the game.

It was exacerbated when adjusting settings: not having a "Please Wait" on apply means the change itself causes an additional performance impact.

I bought the game with the expectation that 720p/60/Medium would be an enjoyable experience. It was not worth a pre-order at full price.
So this bears asking, what CPU and motherboard do you have?
