Also, you can try without the overclock. The game really hammers the system now, so if you're on the border of stability you could be dropping frames badly.
144 fps average in the benchmark.
During normal gameplay, 100+ the majority of the time.
What should the "right" requirement really be? I'd understand complaints like this if the game had "World of Warcraft level graphics" while still demanding a good machine.
Personally, I think what this game delivers is in line with what it requires.
If UE5 were delivering the same thing, it would require more to run on average, unless the devs did a really good job of optimizing their use of the engine.
I run the game on an Asus ROG gaming laptop and on my desktop. One barely meets the recommended spec, the other is way over the recommended spec for Ultra. Both seem to run it nicely at their best graphics settings.
But FPS won't drop only because your machine is "technically" incapable of keeping up. The quality of your storage, even the health of your fans: many things can interfere with achieving high FPS.
How much of a bottleneck there is depends on the area of the map and the NPC/script count. For example, if I was in the "15%" area and a big shootout broke out with lots of enemies and explosions on screen, it often drops into the "10%" tier or below.
75% of the time: no bottleneck, 100-140 fps avg., 95%+ GPU usage
15% of the time: light bottleneck, 80-100 fps avg., 80-90% GPU usage
10% of the time: big bottleneck, 60-80 fps avg., 70-79% GPU usage (without FG this dips below 60 fps)
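(A rough way to read the GPU-usage numbers in that breakdown, as a minimal illustrative sketch and not anything the game or a monitoring tool exposes: a GPU pinned near 100% means the CPU is keeping up, while falling GPU usage alongside falling fps points to a CPU limit. The thresholds below are just the tiers from the post above.)

```python
# Illustrative only: classify the CPU-bottleneck tier from GPU utilization,
# mirroring the rough 75% / 15% / 10% breakdown above. The thresholds are
# assumptions taken from that post, not anything measured by the game.
def bottleneck_tier(gpu_usage_percent: float) -> str:
    if gpu_usage_percent >= 95:
        return "no bottleneck (GPU-limited)"
    if gpu_usage_percent >= 80:
        return "light CPU bottleneck"
    return "big CPU bottleneck"

for usage in (98, 85, 72):
    print(f"{usage}% GPU usage -> {bottleneck_tier(usage)}")
```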
You'd need a 7800X3D or 13900K to get high fps all the time.
Basic research will go a long way. As I've noticed recently, most PC gamers have no idea how to optimize their hardware and are too quick to blame the developers for their own faults.
FFS, pre-2.0 I could get a solid 60 fps at medium settings throughout the game on an RX 580 and 3600X combo with only very basic BIOS tuning.
To be clear: because of frame generation, my 5800X is only rendering between 30 and 36 fps. (My monitor is 75 Hz max, and I impose a 72 fps frame limit for adaptive-sync smoothness.) But even with all of those "tricks" turned off (no v-sync, no frame generation, no frame caps), the CPU almost always keeps up. In fact, it only bottlenecks for a second or two in a few instances of loading new areas.
Even when fighting with 10 cops at once, my CPU isn't a bottleneck... I tested all of this with Intel's PresentMon.
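(A quick sanity check of that arithmetic, assuming 2x frame generation, i.e. one generated frame per rendered frame as in DLSS 3; the numbers are the ones from the post above.)

```python
# Back-of-the-envelope check: with 2x frame generation, the displayed frame
# rate is roughly twice the rendered frame rate, so a 72 fps display cap
# implies the CPU/GPU only render about 36 fps.
display_cap_fps = 72   # frame limit chosen for the 75 Hz monitor
fg_factor = 2          # assumption: 2x frame generation (DLSS 3 style)
rendered_fps = display_cap_fps / fg_factor
print(rendered_fps)    # 36.0
```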
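(For anyone who wants to repeat that check, a minimal sketch of summarizing a PresentMon capture CSV. The column names msBetweenPresents and msGPUActive are assumptions that vary by PresentMon version, so verify them against the header of your own capture.)

```python
# Minimal sketch: average frame time / fps and average GPU-busy time from a
# PresentMon CSV capture. Column names are assumptions -- verify them against
# your capture's header, as they differ between PresentMon versions.
import csv
import statistics

def summarize(path: str) -> None:
    frame_times, gpu_busy = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frame_times.append(float(row["msBetweenPresents"]))
            if "msGPUActive" in row:  # only present in newer PresentMon builds
                gpu_busy.append(float(row["msGPUActive"]))
    avg_ms = statistics.mean(frame_times)
    print(f"avg frame time: {avg_ms:.2f} ms (~{1000 / avg_ms:.0f} fps)")
    if gpu_busy:
        # GPU busy for most of each frame => GPU-limited; a large gap between
        # frame time and GPU-busy time suggests a CPU limit or a frame cap.
        print(f"avg GPU busy:  {statistics.mean(gpu_busy):.2f} ms")

summarize("presentmon_capture.csv")  # hypothetical capture file name
```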
With Path Tracing and Frame Generation on, at 1440p, a 5800X is a good match. I do have the AMD SMT setting set to Auto... which should enable SMT on this 8-core processor.
A HUGE performance factor is that the 2.0 patch implements Opacity Micromaps, which GREATLY increases ray-tracing performance with transparencies... windows, foliage, smoke. BUT this feature only works on RTX 4000 series cards. Go to the memorial park... all of the foliage used to tank RT performance; now it hardly has any impact.
Although I haven't tried the Phantom Liberty areas.
I no longer have any desire to play this game without Path Tracing, so yes, there could be a sub-60 CPU bottleneck, but until the RTX 5090 comes out, I won't care.
Yeah, it's playable. I could lock the fps too, but then I'd be trading 100 fps for 60 most of the time when there is no bottleneck.
What I meant is you need a 7800X3D or 13900K for "very high" fps: no bottleneck with a 4090, for a happy 100 fps.