The Last of Us™ Part II Remastered

♥♥♥♥♥♥♥♥♥, the optimization.....
I have an RX 7900 XTX, and I know some triple-A games ARE NOT optimized in this day & age, but ♥♥♥♥♥♥♥♥. I don't even need to enable frame generation to get 100 FPS with literally everything maxed out.

I just got the game and I'm about 20 minutes in so far. Great work, dev team!
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.
Zephyr May 8 @ 8:57am 
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.
Well, since most AAA games coming out now on UE5 run absolutely terribly and don't really look better than this game (most of them, anyway), I also have to say THIS is good :). We all take what we can get.

Really, I would be absolutely happy if all new games looked more or less as good as TLOU2 AND had its performance.

TLOU2 can be run at native 4K (!) at 60+ FPS (actually often more like 70 FPS) with just an RX 7900 XT class GPU (in the GPU limit; the OP should also have told us the resolution, otherwise you cannot say anything about scaling). None of the new UE5 games come even close, and again, they usually do not look any better, even natively...

So yeah, it FEELS good for a change, even if it might scale badly from the PS5 (no idea there, never had a console).
Last edited by Zephyr; May 8 @ 9:37am
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.
And you think the PS5 runs games well, do you? lmfao
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.

That is not true. To get that kind of FPS you need 2x the CPU power of the PS5 to get there; that is the first bottleneck. The PS5 has 8 cores and 16 threads at 3.5 GHz and gets an average of 90 FPS in performance mode in moderately graphically intensive scenes with moderate NPC AI. When more of that AI hits, you are looking at lower performance, because the AI and character movement have to be computed. Once you get that part scaling to the FPS you want, you can think about the rest of the graphics processing.

And well... GPUs have limitations too: not just teraflops of compute, but also fillrate, memory speed and raster output. There are a lot of variables that differ from the PS5 GPU. High-end PC GPUs are generally more powerful, but you also need the CPU to run the game code and the graphics setup. The PS5 does that very quickly because everything happens on one chip. You seriously underestimate the architecture difference.
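
For what it's worth, here is a tiny back-of-the-envelope Python sketch of the frame-time arithmetic behind that "2x the CPU" claim. It assumes the CPU-bound part of a frame scales linearly with CPU throughput, and the 70 FPS heavy-AI figure is my own assumption, not a measurement:

def required_cpu_speedup(baseline_fps: float, target_fps: float) -> float:
    """How much faster the CPU must finish its per-frame work."""
    baseline_frame_ms = 1000.0 / baseline_fps   # e.g. 90 FPS -> ~11.1 ms per frame
    target_frame_ms = 1000.0 / target_fps       # e.g. 150 FPS -> ~6.7 ms per frame
    return baseline_frame_ms / target_frame_ms

print(round(required_cpu_speedup(90, 150), 2))  # 1.67x in moderate scenes
print(round(required_cpu_speedup(70, 150), 2))  # 2.14x if heavy AI pulls the PS5 down to ~70 FPS

So even before any GPU differences, hitting 150+ FPS asks for roughly 1.7-2x the PS5's per-frame CPU throughput, which is where the rough "2x" figure comes from.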
Last edited by episoder; May 8 @ 9:32am
Admiral May 8 @ 12:07pm 
Originally posted by episoder:
That is not true. To get that kind of FPS you need 2x the CPU power of the PS5 to get there.

The PS5 is a closed system and games can be optimized really well for it. You can't compare it to PC ports and PCs. If PC were 1:1 with the PS5 in this regard, you'd get PS5 performance out of crappy old parts like a 3600X and a 2070. People always forget this.
Last edited by Admiral; May 8 @ 12:10pm
Zephyr May 8 @ 12:56pm 
Originally posted by episoder:
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.

That is not true. To get that kind of FPS you need 2x the CPU power of the PS5 to get there; that is the first bottleneck. The PS5 has 8 cores and 16 threads at 3.5 GHz and gets an average of 90 FPS in performance mode in moderately graphically intensive scenes with moderate NPC AI. When more of that AI hits, you are looking at lower performance, because the AI and character movement have to be computed. Once you get that part scaling to the FPS you want, you can think about the rest of the graphics processing.

And well... GPUs have limitations too: not just teraflops of compute, but also fillrate, memory speed and raster output. There are a lot of variables that differ from the PS5 GPU. High-end PC GPUs are generally more powerful, but you also need the CPU to run the game code and the graphics setup. The PS5 does that very quickly because everything happens on one chip. You seriously underestimate the architecture difference.

Maybe I am getting something wrong somewhere, but whenever I look into it, the PS5 (the normal one) has a 30 FPS 4K mode or a 60 FPS 1440p performance mode as standard. Then there is an unlocked-FPS mode that seems to give you maybe up to 90 FPS in the 1440p performance mode on TVs that support higher refresh rates (at what quality settings?). The PS5 Pro gives you the PSSR upscaling option on top, plus better RT, and otherwise not that much extra in terms of performance. The PS5 Pro GPU is often compared to an RX 7700 XT, maybe an RX 6800 XT, and in the best case an RX 7800 XT (not really likely, though) as for what it should be capable of in a direct hardware-to-hardware comparison without other considerations.

https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html

From this source you can compare what type of GPU the PS5 (Pro) performance actually equals for TLOU2 on PC.

For 1440p native:
Here we land at an RX 7900 GRE / RTX 4070 Super type GPU when using 90 FPS at 1440p as the benchmark. Yet these GPUs manage 90 FPS at 1440p with full settings; I am not sure if the PS5 (Pro) does the same in its 90 FPS performance mode. No idea.

For 4K native:
If we assume that a PS5 (Pro) can manage 45 FPS at 4K, then we again get an RX 6800 XT as the equivalent GPU for TLOU2 at full settings on PC.

If we sum this up, the scaling, just for the GPU, is maybe not fully there, but it is still not that bad. Certainly not off by enough to call it badly optimized, actually. The game gets to higher refresh rates of 85-90 FPS at 1440p, full settings, on GPUs that are either higher-end but 5 years old, or $500-600 midrange GPUs from the last 2-3 years. I mean, that is not bad by any means.
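
As a rough sanity check on those two comparison points, here is a small Python sketch of the raw pixel throughput each one implies. It is a crude model (it assumes per-pixel cost stays constant across resolutions and ignores CPU limits), so treat it as orientation, not a benchmark:

def pixels_per_second(width: int, height: int, fps: float) -> float:
    """Raw pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

qhd_90 = pixels_per_second(2560, 1440, 90)   # the 1440p / 90 FPS comparison point
uhd_45 = pixels_per_second(3840, 2160, 45)   # the 4K / 45 FPS comparison point

print(f"1440p @ 90 FPS: {qhd_90 / 1e6:.0f} Mpix/s")          # ~332 Mpix/s
print(f"4K    @ 45 FPS: {uhd_45 / 1e6:.0f} Mpix/s")          # ~373 Mpix/s
print(f"4K/45 needs {uhd_45 / qhd_90:.3f}x the throughput")  # 1.125x

So the two points ask for roughly the same raw throughput (the 4K case about 12-13% more), which is why both map to broadly the same class of GPU in the chart.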
Last edited by Zephyr; May 8 @ 2:09pm
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.
I get above 150 FPS with frame generation, but I've been getting a minimum of 100 FPS without it, and I've been running at 4K. The PS5 is nothing compared to the raw performance of a PC.
Zephyr May 9 @ 4:42am 
Originally posted by CyberWolf3001:
Originally posted by Bootstrap:
The performance scaling compared to PS5 is terrible. It's not well optimised at all lol. You should be getting 150+ FPS.
I get above 150 FPS with frame generation, but I've been getting a minimum of 100 FPS without it, and I've been running at 4K. The PS5 is nothing compared to the raw performance of a PC.
Well, there I have to disagree a bit :). There is no way you get 100 FPS at truly native 4K, with 100% render scale, resolution scaling disabled, no upscaling, no framegen, plus the highest possible settings :). Not with an RX 7900 XTX, that is. Actually, unless you are accidentally mistaking your RTX 5090 for an RX 7900 XTX, there is NO GPU you could own which can reliably do this throughout the game :).

I have an RX 7900 XT, and in many scenes 60-70 FPS is all that is possible under such conditions, even with an OC :). An RX 7900 XTX is 15% faster in the optimal case, which brings us to about 80 FPS in heavier scenes at best (a huge stretch there already) :). Interiors and otherwise much lighter scenes might make 100 FPS at native 4K, full settings, possible, but I doubt it until I see it. In any case, A LOT of the game plays out in much heavier scenes.

The enthusiast-class RX 7000 GPUs are great GPUs, but we need to stay on planet Earth :). Not even an RTX 4090 can sustain 100 FPS at native 4K in all conditions (maybe on average, with a last gasp, before melting) lol.
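
Spelled out as a tiny Python sketch (same numbers as above, nothing new; the 15% uplift is the optimistic figure, not a measurement of the XTX itself):

xt_heavy_scene_fps = (60, 70)   # what my RX 7900 XT manages in heavier scenes
xtx_uplift = 1.15               # best-case RX 7900 XTX advantage over the XT

xtx_estimate = tuple(round(fps * xtx_uplift) for fps in xt_heavy_scene_fps)
print(xtx_estimate)             # (69, 80) -> nowhere near a 100 FPS minimum at native 4K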

Please compare here
https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html

Second source:

https://www.dsogaming.com/pc-performance-analyses/the-last-of-us-part-2-remastered-pc-performance-analysis/#gid=1&pid=7




And here, from my system, is a performance test video with clearly visible settings, comparing native 4K with light XeSS upscaling; the system is an RX 7900 XT, Ryzen 7 7800X3D, 32 GB of CL30 6000 DDR5:

https://youtu.be/qKISNRphZb8

And a screenshot with such settings, even with an OCed GPU.

https://steamcommunity.com/sharedfiles/filedetails/?id=3478209736
Exaggeration is not good on such a topic, especially when easily disproven :).
Last edited by Zephyr; May 9 @ 5:15am
Really, for a PS4 game, no one should need upscaling in this day and age. That would just be absurd if it were needed.
Originally posted by animal_PLANET:
Really, for a PS4 game, no one should need upscaling in this day and age. That would just be absurd if it were needed.
The game becomes smoother as the FPS approaches the screen refresh rate. We need this on mid-range cards.
Originally posted by ViktorReznovTR:
Originally posted by animal_PLANET:
Really, for a PS4 game, no one should need upscaling in this day and age. That would just be absurd if it were needed.
The game becomes smoother as the FPS approaches the screen refresh rate. We need this on mid-range cards.

Huh? All games are smoother hitting the native refresh rate. I'm not understanding your request.

If someone has trouble playing a game on PC that was built for a console released almost 12 years ago, they're doing it wrong. :steammocking:
Last edited by animal_PLANET; May 9 @ 5:28am
Badman May 9 @ 5:26am 
Originally posted by Zephyr:
Originally posted by CyberWolf3001:
I get above 150 FPS with frame generation, but I've been getting a minimum of 100 FPS without it, and I've been running at 4K. The PS5 is nothing compared to the raw performance of a PC.
Well, there I have to disagree a bit :). There is no way you get 100 FPS at truly native 4K, with 100% render scale, resolution scaling disabled, no upscaling, no framegen, plus the highest possible settings :). Not with an RX 7900 XTX, that is. Actually, unless you are accidentally mistaking your RTX 5090 for an RX 7900 XTX, there is NO GPU you could own which can reliably do this throughout the game :).

I have an RX 7900 XT, and in many scenes 60-70 FPS is all that is possible under such conditions, even with an OC :). An RX 7900 XTX is 15% faster in the optimal case, which brings us to about 80 FPS in heavier scenes at best (a huge stretch there already) :). Interiors and otherwise much lighter scenes might make 100 FPS at native 4K, full settings, possible, but I doubt it until I see it. In any case, A LOT of the game plays out in much heavier scenes.

The enthusiast-class RX 7000 GPUs are great GPUs, but we need to stay on planet Earth :). Not even an RTX 4090 can sustain 100 FPS at native 4K in all conditions (maybe on average, with a last gasp, before melting) lol.

Please compare here
https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html

Second source:

https://www.dsogaming.com/pc-performance-analyses/the-last-of-us-part-2-remastered-pc-performance-analysis/#gid=1&pid=7




And here, from my system, is a performance test video with clearly visible settings, comparing native 4K with light XeSS upscaling; the system is an RX 7900 XT, Ryzen 7 7800X3D, 32 GB of CL30 6000 DDR5:

https://youtu.be/qKISNRphZb8

And a screenshot with such settings, even with an OCed GPU.

https://steamcommunity.com/sharedfiles/filedetails/?id=3478209736
Exaggeration is not good on such a topic, especially when easily disproven :).

4K (native) + no framegen + near-max settings:

I get 50-90 FPS on a 4080 Super. In Seraphite fights the framerate drops even with DLSS at 4K. For native 4K you need a 5090-level card if you want a normal framerate.
Last edited by Badman; May 9 @ 5:31am
Originally posted by animal_PLANET:
Really, for a PS4 game, no one should need upscaling in this day and age. That would just be absurd if it were needed.

Well... you can do some math and multiply the PS4 GPU specs by 8: quadruple 1080p plus double the framerate. And you have to bump the teraflops further to get a stable 30 FPS, since the PS4 didn't manage a stable framerate. So you would theoretically need a guesstimated 25 TFLOPS, 200 GPixels/s of fillrate and 460 GTexels/s of texture rate to compute native 4K at 60 FPS. The memory bandwidth is hard to math, though; 1.4 TB/s seems excessive, and I'm pretty sure the improved cache design doesn't need all of that. I dunno, though. The core gameplay still has to be computed, so you have to double up the PS4 CPU for all the AI, physics and graphics setup. And on PC you have Windows API overhead as well. It's not that easy.
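
The x8 arithmetic above, worked through in a short Python sketch. The PS4 baseline numbers are an addition from public spec listings (TechPowerUp-style spec-sheet values, not from this thread), so treat them as approximate:

# Commonly cited PS4 GPU spec-sheet values (approximate).
ps4_gpu = {
    "tflops_fp32":    1.84,   # compute
    "gpixels_per_s":  25.6,   # pixel fillrate
    "gtexels_per_s":  57.6,   # texture rate
    "bandwidth_gb_s": 176.0,  # GDDR5 memory bandwidth
}

# 1080p -> 4K is 4x the pixels, 30 -> 60 FPS is 2x the frames: 8x total.
scale = 4 * 2

for name, value in ps4_gpu.items():
    print(f"{name}: {value * scale:,.1f}")

# tflops_fp32:    14.7    (bumped to ~25 above, since the PS4 never held a stable 30)
# gpixels_per_s:  204.8   (~the 200 GPixels/s figure)
# gtexels_per_s:  460.8   (~the 460 GTexels/s figure)
# bandwidth_gb_s: 1,408.0 (= the 1.4 TB/s that looks excessive)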
Last edited by episoder; May 9 @ 6:15am
Originally posted by episoder:
Originally posted by animal_PLANET:
Really, for a PS4 game, no one should need upscaling in this day and age. That would just be absurd if it were needed.

Well... you can do some math and multiply the PS4 GPU specs by 8: quadruple 1080p plus double the framerate. And you have to bump the teraflops further to get a stable 30 FPS, since the PS4 didn't manage a stable framerate. So you would theoretically need a guesstimated 25 TFLOPS, 200 GPixels/s of fillrate and 460 GTexels/s of texture rate to compute native 4K at 60 FPS. The memory bandwidth is hard to math, though; 1.4 TB/s seems excessive, and I'm pretty sure the improved cache design doesn't need all of that. I dunno, though. The core gameplay still has to be computed, so you have to double up the PS4 CPU for all the AI, physics and graphics setup. And on PC you have Windows API overhead as well. It's not that easy.
If these are not quotes but your own words, my God, what are you? :D
Originally posted by ViktorReznovTR:
If these are not quotes but your own words, my God, what are you? :D

Just an average tech nerd and hobby coder. You can read all of those numbers on TechPowerUp and do the math yourself; they have very detailed specs for GPUs. I did the same to guesstimate the game's performance on my rig as well, and Nixxes delivered. Yep, I'm pretty much CPU limited altogether. Hmm...
Last edited by episoder; May 9 @ 6:22am