Really, I would be absolutely happy if all new games looked more or less as good as TLOU 2 AND had its performance.
TLOU2 can run at native 4K (!) at 60+ FPS (actually often more like 70) with just an RX 7900 XT class GPU (GPU limited; the OP should have told us the resolution too, otherwise you can't say anything about scaling). None of the new UE5 games come close, and again, they usually don't look any better, even natively.
So yeah, it FEELS good for a change, even if it might scale badly from the PS5 (no idea there, I've never had a console).
That is not true. To get that kind of FPS you need 2x the CPU power of the PS5. That is the first bottleneck. The PS5 has 8 cores and 16 threads at 3.5 GHz to average 90 FPS in performance mode in moderately graphically intensive scenes with moderate NPC AI. When more of that AI hits, you're looking at lower performance, because the AI and character movement have to be computed. Once you get that scaling to deliver the FPS you want, you can think about the rest of the graphics processing. And well, GPUs have limitations too: not just teraflops of compute, but also fillrate, memory speed and raster output. There are a lot of variables that differ from the PS5 GPU. High-end PC GPUs are generally more powerful, but you also need the CPU to run the game code and graphics setup. The PS5 does that very quickly because everything happens on one chip. You seriously underestimate the architecture difference.
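To make the frame-budget part of that concrete: the CPU gets 1000 ms divided by the FPS target per frame for AI, physics and draw-call setup, so chasing double the frame rate means roughly double the per-core throughput. A throwaway sketch (the helper and the FPS targets are illustrative, not from the post):

```python
# Back-of-the-envelope: how the CPU frame-time budget shrinks as the FPS
# target rises. With a fixed amount of AI/physics/game-logic work per frame,
# doubling the FPS target roughly requires doubling CPU throughput.

def cpu_budget_ms(target_fps: float) -> float:
    """Milliseconds the CPU has to simulate one frame at a given FPS target."""
    return 1000.0 / target_fps

for fps in (30, 60, 90, 120, 180):
    print(f"{fps:>4} fps -> {cpu_budget_ms(fps):5.2f} ms per frame for AI, physics, draw-call setup")
```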
The PS5 is a closed system and games can be optimized really well for it. You can't compare it to PC ports and PCs. If PC were 1:1 with the PS5 in this regard, you'd get PS5 performance out of crappy old parts like a 3600X and a 2070. People always forget this.
Maybe I'm getting something wrong somewhere, but whenever I look into it, the PS5 (base model) has a 30 FPS 4K mode or a 60 FPS 1440p performance mode as standard. Then there is an unlocked FPS mode that seems to give you maybe up to 90 FPS in the 1440p performance mode on TVs that support higher refresh rates (at what quality settings?). The PS5 Pro gives you the PSSR upscaling option on top, plus better RT, and otherwise not that much extra in performance. The PS5 Pro GPU is often compared to an RX 7700 XT, maybe an RX 6800 XT, and in the best case an RX 7800 XT (not really likely, though) in terms of what it should be capable of in a direct hardware-to-hardware comparison without other considerations.
https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html
From this source you can compare what type of GPU the PS5 (Pro) actually equals for TLOU2 on PC.
For 1440p native:
Here we land at an RX 7900 GRE / RTX 4070 Super class GPU when using 90 FPS at 1440p as the benchmark. Yet these GPUs manage 90 FPS at 1440p with full settings. I am not sure if the PS5 (Pro) does that as well in its 90 FPS performance mode. No idea.
For 4K native:
If we assume that a PS5 (Pro) can manage 45 FPS at 4K, then we again get an RX 6800 XT as the equivalent GPU for TLOU2 at full settings on PC.
If we sum this up, the GPU scaling alone is maybe not fully there, but still not that bad. Certainly not off by enough to call it badly optimized, actually. The game reaches higher refresh rates of 85-90 FPS at 1440p, full settings, on GPUs that are either higher end but 5 years old, or midrange GPUs for $500-600 from the last 2-3 years. That is not bad by any means.
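As a sanity check on the 45 FPS assumption: naive pixel-count scaling from the 90 FPS 1440p figure lands in the same ballpark. A minimal sketch (the linear-with-pixels assumption is mine; real scaling also depends on fillrate, bandwidth and so on):

```python
# Rough GPU-bound scaling estimate, assuming frame rate falls linearly with
# pixel count. The 90 FPS 1440p figure is the benchmark used above.

PIXELS_1440P = 2560 * 1440
PIXELS_4K = 3840 * 2160

def scale_fps(fps_at_1440p: float) -> float:
    """Estimate 4K FPS from 1440p FPS under naive pixel-count scaling."""
    return fps_at_1440p * PIXELS_1440P / PIXELS_4K

print(f"90 FPS at 1440p -> ~{scale_fps(90):.0f} FPS at 4K")
# Prints ~40 FPS, close to the 45 FPS 4K assumption above.
```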
I have an RX 7900 XT, and in many scenes 60-70 FPS is all that is possible under such conditions, even with an OC :). An RX 7900 XTX is 15% faster in the optimal case, which brings us to about 80 FPS (already a huge stretch) in heavier scenes at best :). Interiors and otherwise much lighter scenes might make 100 FPS at native 4K, full settings, possible, but I doubt it until I see it. In any case, a LOT of the game plays out in much heavier scenes.
The enthusiast-class RX 7000 GPUs are great GPUs, but we need to stay on planet Earth :). Not even an RTX 4090 can sustain 100 FPS at native 4K in all conditions (maybe on average, with a last gasp, before melting) lol.
Please compare here
https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html
Second source:
https://www.dsogaming.com/pc-performance-analyses/the-last-of-us-part-2-remastered-pc-performance-analysis/#gid=1&pid=7
And here, from my system, a performance test video with clearly visible settings, comparing native 4K with light XeSS upscaling; the system has an RX 7900 XT, a Ryzen 7 7800X3D and 32 GB of DDR5-6000 CL30:
https://youtu.be/qKISNRphZb8
And a screenshot with the same settings, even with the GPU overclocked.
https://steamcommunity.com/sharedfiles/filedetails/?id=3478209736
Exaggeration is not good on such a topic, especially when easily disproven :).
Huh? All games are smoother when hitting the native refresh rate. I'm not understanding your request.
If someone has trouble playing a game on PC that was built for a console released almost 12 years ago, they're doing it wrong.
4K (native) + no frame gen + near-max settings:
I get 50-90 FPS on a 4080 Super. In Seraphite fights the framerate drops even with DLSS at 4K. For native 4K you need a 5090-level card if you want a normal framerate.
Well, you can do some math and multiply the PS4 GPU specs by 8: quadruple 1080p, plus double the frame rate. And you have to bump the teraflops further to get a stable 30 FPS, since the PS4 didn't manage a stable frame rate. So you would theoretically need a guesstimated 25 TFLOPS, 200 GPixel/s of fillrate and 460 GTexel/s of texture rate to compute native 4K at 60 FPS. The memory bandwidth is hard to math, though; 1.4 TB/s seems excessive. Pretty sure the improved cache design doesn't need all of that. I dunno, though. The core gameplay still has to be computed, so you have to double up the PS4 CPU for all the AI, physics and graphics setup. And on PC you have Windows API overhead as well. It's not that easy.
Just an average tech nerd and hobby coder. You can read all of those numbers on TechPowerUp and do the math yourself; they have very detailed specs for GPUs. I did the same to guesstimate the game's performance on my rig as well, and Nixxes delivered. Yep, I'm pretty much CPU limited altogether. Hmm...
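For anyone who wants to redo that guesstimate, here is the arithmetic as a minimal sketch. The PS4 base specs below are the commonly cited TechPowerUp figures (my addition, not stated in the posts), and the 1.7x headroom factor is my guess at how the ~25 TFLOPS figure was reached from 8 x 1.84 ≈ 14.7:

```python
# Reproducing the "x8 the PS4 GPU" guesstimate: 4x the pixels (1080p -> 4K)
# times 2x the frame rate (30 -> 60) = 8x the throughput, plus compute
# headroom because the PS4 never held a locked 30 FPS.

PS4 = {
    "tflops": 1.84,          # FP32 compute
    "gpixels_s": 25.6,       # pixel fillrate
    "gtexels_s": 57.6,       # texture rate
    "bandwidth_gb_s": 176.0, # GDDR5 memory bandwidth
}

SCALE = 8          # 4x resolution * 2x frame rate
HEADROOM = 1.7     # assumed fudge factor for the unstable PS4 frame rate (compute only)

print(f"compute:   {PS4['tflops'] * SCALE * HEADROOM:6.1f} TFLOPS (post guessed ~25)")
print(f"fillrate:  {PS4['gpixels_s'] * SCALE:6.1f} GPixel/s (post guessed ~200)")
print(f"textures:  {PS4['gtexels_s'] * SCALE:6.1f} GTexel/s (post guessed ~460)")
print(f"bandwidth: {PS4['bandwidth_gb_s'] * SCALE:6.0f} GB/s (the '1.4 TB/s' the post calls excessive)")
```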