Ratchet & Clank was simply pushed out unfinished like so many other games these days. Console code quickly ported with minimal effort and sloppy PC engineering.
PC gaming is becoming a luxury and it's only going to get worse with UE5 becoming the standard.
I just read your opinion, looked at your system, and watched your video...
MY opinion:
Yes, maybe this game has some bugs, but I've completed it 100% without any bugs, glitches, or stutter problems.
Maybe I was able to brute-force past some problems, whatever...
I see your 4060 Ti 8GB with a 128-bit memory bus, which is roughly comparable (performance-wise) to a 2080 Ti 11GB 352-bit from 2018, and your 6-core CPU.
So in fact you're playing at the high-end performance level of 2018. That's 5 years ago...
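To put rough numbers on that memory-bus difference - using the commonly listed specs, ignoring the 4060 Ti's larger L2 cache, so treat this as a back-of-the-envelope sketch, not a benchmark:

```python
# Rough memory-bandwidth comparison (specs as commonly listed; approximate):
# bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)

def bandwidth_gbps(bus_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gbps

rtx_4060_ti = bandwidth_gbps(128, 18.0)   # 128-bit GDDR6 @ 18 Gbps -> 288 GB/s
rtx_2080_ti = bandwidth_gbps(352, 14.0)   # 352-bit GDDR6 @ 14 Gbps -> 616 GB/s

print(f"RTX 4060 Ti: {rtx_4060_ti:.0f} GB/s")
print(f"RTX 2080 Ti: {rtx_2080_ti:.0f} GB/s")
```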
Then you try to play with DLAA enabled at 2560x1080 and claim that the performance is bad?
I don't want to sound rude, but no new game built around the PS5 performance level will give you good performance on that.
A last-gen 6900 XT would have given you 50% more frames and 16GB of VRAM.
But YEAH, I know, it's the game...
...everyone wants next-gen games, and when current-gen games are released, they realize that their own system is pretty much last-gen...
Did you watch the full video? I showed the game with no upscaling and with upscaling+FG. Both ran poorly.
The memory isn't the issue, I guarantee you - when I hit the 8GB limit and spill into system memory it causes spikes, not FPS plummets.
I'm actually starting to think it's a Global Illumination setting that isn't exposed as a toggle anywhere in the options. Something CPU-intensive that's not being spread across the cores on PC but is on the PS5. Or it's baked lighting on the PS5, and they took it out on PC thinking that PC would perform better. And when the game gets a "performance patch" they'll throw baked lighting into the first few levels and tune down the GI. That's my only theory as to how they could fix only the first few levels.
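To illustrate why one CPU-heavy job that isn't split across threads could behave like this, here's a toy model - the 24 ms job cost and the fixed 8 ms of "everything else" are made-up numbers for the example, not measurements from the game:

```python
# Hypothetical numbers only: a crude model of one heavy per-frame CPU job
# (e.g. a GI update) that is either run on one thread or divided across workers.

def frame_time_ms(job_ms: float, workers: int, other_work_ms: float = 8.0) -> float:
    """Heavy job split evenly across workers; the rest of the frame's CPU work is fixed."""
    return job_ms / workers + other_work_ms

gi_job_ms = 24.0  # assumed single-core cost of the heavy job

for workers in (1, 6, 8):
    ft = frame_time_ms(gi_job_ms, workers)
    print(f"{workers} worker(s): ~{ft:.1f} ms/frame (~{1000 / ft:.0f} fps)")

# 1 worker : ~32 ms (~31 fps)  -> the kind of sustained dive described above
# 6 workers: ~12 ms (~83 fps)
# 8 workers: ~11 ms (~91 fps)
```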
And as for those next-gen/current-gen comments...PS5 isn't "next-gen"
It's an 8-core Zen 2 CPU with the equivalent of an RX 5700 XT, sharing a 16GB pool of RAM that tops out at about 10GB usable by the GPU. According to PassMark, the 4060 Ti is 47% faster than the 5700 XT, and the R5 5600X is only a modest 12% faster than the equivalent R7 3700X.
Oh, and another game - The Last of Us remaster - released in bad shape too with the same issues, and it now looks better and runs better than this game on my system.
Again, they need to either raise the minimum CPU requirement or change how they release their games.
You should never have gotten a 4060 8 GB anyway, it's trash.
8 GB is only sufficient for 1080p in most newer titles.
Also, Frame Generation adds to the memory footprint.
My dude, VRAM issues cause stutters, not framerate dives when you pan the camera.
I'd even consider this post spam: acting like you understand something while not having the slightest clue what you're on about.
Just lower the texture quality and texture filtering, then test it again.
The CPU has nothing to do with the VRAM bottleneck.
VRAM causes frame spikes, not frame dives where you turn the camera one way and sustain a massive FPS dip.
I run TLOU fine - there's VRAM spill in that game too and the card handles it fine. RE4 with RT, no issues. It's not the VRAM - I get the same result on LOW textures.
So all you know-it-alls can quit it already with the VRAM nonsense.
When it happens across multiple games, it's a problem with development.
-Dead Space Remake
-The Callisto Protocol
-The Outer Worlds SCE
-Jedi Survivor
-The Last of Us
-Forspoken
-Hogwarts Legacy
All of these games were garbage on launch; most of them got patched and now run great, with the exception of Jedi Survivor. If you had a high-end PC you just kind of "out-leveled" the bad performance, but you still weren't getting the performance you should have.
Everyone cries VRAM! Go watch any YouTube guide on VRAM and you'll see it causes stutters and dips on PCs with 16GB of system memory or slow memory. It doesn't cause a major slowdown that can be fixed by rotating the camera.
Maybe it's the 2 extra CPU cores on the consoles? Maybe it's a different GI implementation?
Maybe PC ports now need official Early Access to do a hardware assessment and patching before release?
The problem's not the VRAM; go back a couple of years and you'll see cards were hitting VRAM limits back then too, and we didn't have bad releases all year.
Very Low to Low while standing still is about a 20fps difference. From there it's about 5fps per step. Usually (but not always) LOD is interchangeable with the "Meshes" setting; it used to mean the pop-in distance, but now it refers to the polygon detail in the visible scene.
Now 2 things:
This setting doesn't just affect poly level; it also scales the baked-in AO and lighting quality, foliage density, and animation quality.
I think it's a percentage-based scale, with Ultra being 100% and Very Low being, at a guess, 20%.
So when the camera's facing one way on that setting you get 20% of the detail of Ultra, which could be some number of rendered polygons and other details. Pan the camera and the framerate plummets, because you're still at 20% of Ultra but you might have double or triple the detail loading in compared to what you just had.
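Here's a rough sketch of what I mean by that percentage scale - the 20% figure, the poly counts, and the function are all invented to illustrate the idea, not pulled from the game:

```python
# Illustrative only: a flat percentage LOD scale applied to whatever is visible.
# All names and numbers below are assumptions made up for this sketch.

LOD_SCALE = {"very_low": 0.20, "low": 0.35, "medium": 0.55, "high": 0.80, "ultra": 1.00}

def polys_to_render(visible_polys_at_ultra: int, setting: str) -> int:
    """A fixed fraction of whatever the current view would contain at Ultra."""
    return int(visible_polys_at_ultra * LOD_SCALE[setting])

# Facing a sparse part of the level, then panning toward a dense one:
sparse_view = 2_000_000
dense_view = 6_000_000   # triple the scene complexity after the pan

for setting in ("very_low", "ultra"):
    before = polys_to_render(sparse_view, setting)
    after = polys_to_render(dense_view, setting)
    print(f"{setting}: {before:,} -> {after:,} polys ({after / before:.1f}x more work)")

# Even at 20%, the absolute workload still triples when you pan, which is why
# the dip shows up on every preset instead of scaling away on low settings.
```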
That is an awful way of doing things, just hoping people's systems can brute-force through it. It either needs adjusting or the levels all need tending to.
There's no way to performance-tune this game to a particular framerate target until that happens.
With Jedi Survivor I always saw CPU...