Ratchet & Clank: Rift Apart
Dysphunc Aug 18, 2023 @ 7:34pm
Same GPU and CPU issues as with Jedi Survivor and OW:SCE - and same lame patching
When JS launched, people with mid and upper-end systems had issues with the first level. You could look one way and have 100+fps, then pan the camera the other way and have 30fps. They patched the game with performance improvements and did some serious work on that level, and it now runs far better with consistent performance...on the first level. So you get beyond that and it runs like garbage again.

Interestingly, it causes fake GPU bottlenecks. It hammers one or two CPU cores while total CPU usage is around 50%, yet the GPU somehow says it's at 99% while drawing less than half of its maximum power. My GPU usually sits around 150W and 65C at 99%, but when it's hamstrung like this it uses 70W and 50C at 99%.

Any other game, no matter the frame rate (e.g. 4K RT running at 30-40fps), will use 150W. It's like the GPU is "maxed out" just holding onto frames while it waits for the CPU.
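
If you want to log this yourself instead of eyeballing an overlay, here's a minimal sketch using NVIDIA's NVML bindings (pip install nvidia-ml-py; assumes an NVIDIA card, and the 0.6 cutoff is just my arbitrary guess). Sustained ~99% utilization with power draw well under the board limit is the "fake bottleneck" signature I'm describing:

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(gpu) / 1000  # mW -> W

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu  # percent
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # mW -> W
        # Pegged utilization + low power draw = GPU waiting, not working
        flag = "  <-- 'fake' 99%" if util >= 95 and power_w < 0.6 * limit_w else ""
        print(f"util {util:3d}%  power {power_w:5.1f}W / {limit_w:.0f}W{flag}")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()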

Ratchet and Clank has suffered the same fate. On launch it ran like garbage with the same issues - plus a bunch of others. They patched it, tidied up and optimized the first few levels, and now that I'm mid-game it's absolute garbage again: 40fps in some parts with FG on, no matter what settings.

https://www.youtube.com/watch?v=qAMR5Ghjkmo

Surprisingly, OW Spacer's Choice Edition released in the same manner, but that game has now been patched to the point that it (for me at least) runs BETTER than the original release.

So these games can be fixed. They recommend CPUs that can't handle these games in the unoptimized state they're in. People with Ryzen X3D chips don't seem to have these issues, and some of the newer Intel chips seem fine too. They could either put the recommended CPU specs up or spend more time optimizing the levels.

Pick one devs.


CPU: R5 5600x
RAM: 32GB DDR4 3200
GPU: RTX 4060 Ti 8GB
SSD: 1TB SATA
OS: Windows 11
Res: 2560x1080
Last edited by Dysphunc; Aug 19, 2023 @ 3:14am
ko Aug 19, 2023 @ 3:00am 
Nixxes just makes crap ports, they're no better than other port factories. They're completely overrated.

Ratchet & Clank was simply pushed out unfinished like so many other games these days. Console code quickly ported with minimal effort and ♥♥♥♥♥♥ PC engineering.
Phineapple Aug 19, 2023 @ 6:31am 
Same phenomenon here. I refunded the game because I don't expect any patches addressing this in the near future. They haven't even acknowledged the issue...

PC gaming is becoming a luxury and it's only going to get worse with UE5 becoming the standard.
Last edited by Phineapple; Aug 19, 2023 @ 6:33am
Blacksmith77K Aug 19, 2023 @ 10:04am 
Originally posted by Dysphunc:
On launch it ran like garbage with the same issues - plus a bunch of others. They patched it, tidied up and optimized the first few levels, and now that I'm mid-game it's absolute garbage again: 40fps in some parts with FG on, no matter what settings.

https://www.youtube.com/watch?v=qAMR5Ghjkmo



CPU: R5 5600x
RAM: 32GB DDR4 3200
GPU: RTX 4060 Ti 8GB
SSD: 1TB SATA
OS: Windows 11
Res: 2560x1080


Just read your opinion, your system and watch your video...

MY opinion:

Yes, this game maybe has some bugs. I've completed this game 100% without any bugs, glitches or stutter problems.

Maybe I was able to brute-force past some problems, whatever...

I see your 4060 Ti 8GB with its 128-bit bus, which is possibly identical (performance-wise) to a 2080 Ti 11GB 352-bit from 2018, and your 6-core CPU.

So in fact you play at the high-end performance level of 2018. That's 5 years ago...

Then you try to play with DLAA activated at 2560x1080 and you claim that the performance is bad?

I don't want to sound rude, but any new games that are based on the PS5 performance level will not give you good performance.

A last-gen 6900 XT would have given you 50% more frames and 16GB of VRAM.


But YEAH I know, it's the game...

...everyone wants next-gen games, and when current-gen games are released, they realize that their own systems are pretty much last-gen...
Zazzone Aug 19, 2023 @ 12:55pm 
use dlss quality and frame generator
Dysphunc Aug 19, 2023 @ 2:32pm 
Originally posted by Blacksmith77K:
Then you try to play with DLAA activated at 2560x1080 and you claim that the performance is bad?

I don't want to sound rude, but any new games that are based on the PS5 performance level will not give you good performance.

A last-gen 6900 XT would have given you 50% more frames and 16GB of VRAM.


But YEAH I know, it's the game...

...everyone wants next-gen games, and when current-gen games are released, they realize that their own systems are pretty much last-gen...

Did you watch the full video? I showed the game with no upscaling and with upscaling+FG. Both ran poorly.

The memory isn't the issue, I guarantee you - when I hit the 8GB limit and spill into system memory it causes spikes, not fps plummets.
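
If anyone wants to check the spill claim themselves, here's a minimal sketch (same nvidia-ml-py bindings as in my first post, NVIDIA cards only) that logs dedicated VRAM while you play. If the fps dive happens with VRAM sitting well under 8GB, the spill theory is dead:

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        used_gb = mem.used / 2**30
        total_gb = mem.total / 2**30
        # Near-full dedicated VRAM = spill into system memory is likely
        note = "  <-- near capacity" if used_gb > 0.95 * total_gb else ""
        print(f"VRAM {used_gb:4.2f} / {total_gb:.1f} GB{note}")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()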

I'm actually starting to think it's a Global Illumination setting that isn't on a toggle anywhere in the options. Something CPU-intensive that isn't being spread across the cores on PC but is on the PS5. Or it's baked lighting on the PS5 and they took it out on PC thinking PC would perform better. And when the game gets a "performance patch" they throw baked lighting into the first few levels and tune down the GI. That's my only theory as to how they can fix only the first few levels.


And as for those next-gen/current-gen comments...the PS5 isn't "next-gen".
It's an 8-core Zen 2 with the equivalent of an RX 5700 XT, sharing a 16GB pool of RAM that tops out at around 10GB usable by the GPU. According to Passmark the 4060 Ti is 47% faster than the 5700 XT, and the R5 5600X is only a modest 12% faster than the roughly equivalent R7 3700X.

Oh, and another game - The Last of Us remaster - released in bad shape too with the same issues, and it now looks better and runs better than this game on my system.


Again they need to either up the minimum CPU requirement or change how they release their games.
Medusa Aug 20, 2023 @ 4:18am 
Mate, you are just running out of VRAM. I'd even consider this post spam; acting like you understand something while not having the slightest clue what you are on about.
You should never have gotten a 4060 8 GB anyway, it's trash.
8 GB is only sufficient for 1080p in most newer titles.
Also, Frame Generation adds to the memory footprint as well.
Dysphunc Aug 20, 2023 @ 4:35am 
Originally posted by Medusa:
Mate, you are just running out of VRAM
You should never have gotten a 4060 8 GB anyway, it's trash.
8 GB is only sufficient for 1080p in most newer titles.
Also, Frame Generation adds to the memory footprint as well.

My dude, VRAM issues cause stutters, not framerate dives when you pan the camera.

I'd even consider this post spam; acting like you understand something while not having the slightest clue what you are on about.
RevFirst Aug 20, 2023 @ 4:42am 
Originally posted by Medusa:
Mate, you are just running out of VRAM. I'd even consider this post spam; acting like you understand something while not having the slightest clue what you are on about.
You should never have gotten a 4060 8 GB anyway, it's trash.
8 GB is only sufficient for 1080p in most newer titles.
Also, Frame Generation adds to the memory footprint as well.
Then Nixxes shouldn't have recommended 8GB cards at all like they did. It's on them to sort this out.
Last edited by RevFirst; Aug 20, 2023 @ 4:42am
BSF7772 Aug 20, 2023 @ 4:53am 
The problem is the VRAM, not the game.
Just lower the texture quality and texture filtering, then test it again.
The CPU has nothing to do with the VRAM bottleneck.
Dysphunc Aug 20, 2023 @ 5:20am 
Originally posted by BSF7772:
The problem is the VRAM, not the game.
Just lower the texture quality and texture filtering, then test it again.
The CPU has nothing to do with the VRAM bottleneck.

VRAM causes frame spikes, not frame dives where you turn the camera one way and sustain a massive fps dip.

I run TLOU fine - VRAM spills in that game too and the card handles it fine. RE4 with RT, no issues. It's not the VRAM - I get the same result on LOW textures.
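
You can actually tell the two failure modes apart from a frame-time log. Here's a rough sketch (assumes a PresentMon-style CSV with an MsBetweenPresents column; the column name and the 3x/1.5x thresholds are my guesses, tune to taste). Lots of isolated spikes means stutter (VRAM/streaming); a long run of consecutive slow frames means a sustained dive like the camera-pan one:

import csv
import statistics

PATH = "capture.csv"            # hypothetical capture file
COLUMN = "MsBetweenPresents"    # column name varies by capture tool

with open(PATH, newline="") as f:
    frames = [float(row[COLUMN]) for row in csv.DictReader(f)]

median = statistics.median(frames)
# Isolated spike: one frame far above the median (classic VRAM/streaming hitch)
spikes = sum(1 for ms in frames if ms > 3 * median)
# Sustained dive: a long run of consecutive slow frames
run = longest = 0
for ms in frames:
    run = run + 1 if ms > 1.5 * median else 0
    longest = max(longest, run)

print(f"median {median:.1f} ms, isolated spikes: {spikes}, "
      f"longest slow run: {longest} frames")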
Phineapple Aug 20, 2023 @ 1:15pm 
Whether I'm running at 800x600 with DLSS Performance and everything on low, or at 1440p with optimized settings - I get around the same fps, and no, it's not capped.

So all you know-it-alls can stfu already about that VRAM nonsense.
Dysphunc Aug 20, 2023 @ 2:32pm 
Seriously, when a game's first level goes from 40fps to 80fps with a patch it makes you wonder.

When it happens across multiple games it's a problem with development.

-Dead Space Remake
-The Callisto Protocol
-The Outer Worlds SCE
-Jedi Survivor
-The Last of Us
-Forspoken
-Hogwarts Legacy

All of these games were garbage on launch; most of them got patched and now run great, with the exception of Jedi Survivor. If you had a high-end PC you just kinda "out-leveled" the bad performance, but you still weren't getting the performance you should have.

Everyone cries VRAM!!!! Go watch any YouTube guide on VRAM and you'll see it causes stutters and dips on PCs with 16GB of system memory or slow memory. It doesn't cause a major slowdown that can be fixed by rotating the camera.

Maybe it's the 2 extra CPU cores on the consoles? Maybe it's a different GI implementation?
Maybe PC ports now need official Early Access to do a hardware assessment and patching before release?

The problem's not the VRAM; go back a couple of years and you'll see cards were hitting VRAM limits back then too, and we didn't have bad releases all year.
Dysphunc Aug 21, 2023 @ 5:39am 
So I was totally right - I experimented around and landed on all settings on high except "Level of Detail".

Very Low to Low while standing still is about a 20fps difference. From there it's about 5fps per step. Usually (but not always) LOD is interchangeable with a "Meshes" setting; it used to mean the pop-in distance but now refers to the polygon detail in the visible scene.

Now 2 things:
This setting doesn't just affect poly level, it also scales the baked-in AO and lighting quality, foliage density and animation quality.

I think it's a percentage-based scale, with 100% being Ultra and Very Low being, at a guess, around 20%.

So when the camera's facing one way on that setting you get 20% of the detail of Ultra, which could be X amount of rendered polygons and other details. Pan the camera and the numbers plummet: you're still at 20% of Ultra, but you might be getting double or triple the detail loading in compared to what you just had - see the rough sketch below.
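
Back-of-the-envelope version of that theory (every number here is a guess, not a measurement):

# Rough sketch of the percentage-LOD theory above; all figures are hypothetical.
VIEW_A_TRIS = 2_000_000  # polygons in view facing one way at Ultra
VIEW_B_TRIS = 6_000_000  # pan the camera: 3x the base detail at Ultra
LOD_SCALE = {"Very Low": 0.20, "Low": 0.35, "Medium": 0.55, "High": 0.80, "Ultra": 1.00}

for setting, scale in LOD_SCALE.items():
    a, b = VIEW_A_TRIS * scale, VIEW_B_TRIS * scale
    # Even at 20%, view B is still 3x the work of view A,
    # which is why the fps dives on every setting when you pan.
    print(f"{setting:>8}: view A {a / 1e6:.2f}M tris, view B {b / 1e6:.2f}M tris")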

That is an awful way of doing things, just hoping people's systems can brute-force through it. Either that scale needs adjusting or the levels all need tending to.

There's no way to performance-tune this game to a particular number until that happens.
thomasthegamer123 Aug 23, 2023 @ 4:20pm 
Originally posted by Dysphunc:
Seriously, when a game's first level goes from 40fps to 80fps with a patch it makes you wonder.

When it happens across multiple games it's a problem with development.

-Dead Space Remake
-The Callisto Protocol
-The Outer Worlds SCE
-Jedi Survivor
-The Last of Us
-Forspoken
-Hogwarts Legacy

All of these games were garbage on launch; most of them got patched and now run great, with the exception of Jedi Survivor. If you had a high-end PC you just kinda "out-leveled" the bad performance, but you still weren't getting the performance you should have.

Everyone cries VRAM!!!! Go watch any YouTube guide on VRAM and you'll see it causes stutters and dips on PCs with 16GB of system memory or slow memory. It doesn't cause a major slowdown that can be fixed by rotating the camera.

Maybe it's the 2 extra CPU cores on the consoles? Maybe it's a different GI implementation?
Maybe PC ports now need official Early Access to do a hardware assessment and patching before release?

The problem's not the VRAM; go back a couple of years and you'll see cards were hitting VRAM limits back then too, and we didn't have bad releases all year.



With Jedi Survivor I always saw CPU Survivor
xxpantherrrxx Aug 23, 2023 @ 7:36pm 
Runs fine for me on my 7900X 4090 system.