Ratchet & Clank: Rift Apart

Very bad performance on 3070
I play the game on an Asus ROG Zephyrus G15 laptop, which has an RTX 3070, a Ryzen 9 5900HS CPU, and 16 GB of RAM, on Windows 10.

I updated everything to the latest versions, including the graphics card drivers.

I play on high settings, 1440p, no RT, DLSS on. I get around 20-50 FPS. It drops into the 20s, sometimes even 18-19 FPS, during fights, and climbs into the 50s when there aren't many characters on screen. It becomes unplayable when it dips into the 20s.

I tried everything, including restarting the game and PC several times, changing v-sync options, setting the texture option to low, running as administrator, changing DPI settings, etc. Nothing works.

I had no problem playing The Last of Us Part I on high settings.

So it seems that, for the vast majority of people here, the game works just fine. Is there anyone else getting abnormally low performance like me?
ELMatadario Jul 27, 2023 @ 3:31am 
Originally posted by amethystium1984:
I play on high settings, 1440p, no RT, DLSS on. I get around 20-50 FPS. [...] Is there anyone else getting abnormally low performance like me?
Same here: RTX 2080 and Ryzen 5 5600X, 16 GB of dual-channel RAM. Can't hold 60 FPS with a mix of high and ultra settings at 1440p on any DLSS mode. Something's wrong.
AskaLangly Jul 27, 2023 @ 3:44am 
How is a 9th-gen Intel with a 3060 running the game better than these AMD setups?
Can't be my 64 GB of RAM... or Windows 11...
Blackguard Jul 27, 2023 @ 3:54am 
Supposedly, some people had to update their BIOS for some reason to get this game to perform well. I don't know why, but maybe look into trying that?
Matt0040 Jul 27, 2023 @ 4:05am 
If your computer managed to render the parade but stutters in combat, it's got to be something other than your specs. My game has never dipped below 60 and I have very similar specs. I will mention that Steam's integration with the game isn't very good and did screw up my controller. Maybe disable the Steam overlay for the game.
Willpower Jul 27, 2023 @ 4:05am 
With a 3070, a 3700X, 32 GB of 3600 MHz RAM, and a 970 Pro NVMe drive (3.5 GB/s):

I can't even hit 60 FPS at low settings. Sure, I'm at 4K, which is pushing it, but I'm running with DLSS targeting 60 and even manually setting it to Ultra Performance. Still, it just doesn't change a thing; it can't hit 60 FPS. I realize my CPU can be a bottleneck, but checking all my usage, my 3070 is constantly pegged at 100% no matter what, and I really mean at all times: very low, medium, max, RT, DLSS Ultra Performance, no matter the scene, no matter anything really. Meanwhile my CPU is sitting at 20%, and even the most saturated cores and threads never go past 50%. Big difference from Spider-Man, which would basically max out my CPU at all times (and also ran terribly).

Funnily enough, low can't even hit 60 FPS, but running at high with ray tracing can just about hold 30 FPS. The scaling doesn't make much sense in my head; in fact, high settings with RT on versus off only gives a 6 FPS difference, which is the smallest gap I've ever seen for RT. Most games will happily take away a third of your performance when you toggle any RT on, and of course give it back when you toggle it off. Either Ratchet's RT is the most optimized thing in the world or something just isn't right.

It's just a shame, because this exact thing happened with Spider-Man. In both cases I almost match the high-settings requirements; adjusting for the lowest common denominator, my CPU, I'd expect medium to work as advertised here. But no, I can't even hit what they advertise as the low spec; it's basically busted. Meanwhile, my Steam Deck runs it great, relatively speaking, and honestly isn't far off what I'm doing on my PC beyond the resolution difference. It just doesn't add up on any level.
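If anyone wants to sanity-check whether they're GPU-bound or CPU-bound the same way, here's a rough monitoring sketch. It assumes the nvidia-ml-py and psutil Python packages; the one-second interval is arbitrary:

    # Rough GPU-vs-CPU bottleneck check (assumes: pip install nvidia-ml-py psutil)
    import psutil
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    try:
        while True:
            util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu   # percent busy
            cores = psutil.cpu_percent(interval=1.0, percpu=True)  # per-core load
            print(f"GPU {util:3d}% | CPU avg {sum(cores) / len(cores):5.1f}% "
                  f"| busiest core {max(cores):5.1f}%")
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()

As a rule of thumb, a GPU pinned at ~100% while no CPU core saturates points at a GPU-side limit, which matches the pattern described above.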
itzDerrio Jul 27, 2023 @ 4:10am 
$ony poorly optimizes their games on purpose to sway buyers toward the PS5 as the better platform, so that we can all be coerced into paying for their monthly BS Plus subscription.
BingusDingus Jul 27, 2023 @ 4:11am 
I have an RTX 3080 paired with a 3900X and 32 GB of 3800 MHz Trident Z RAM, with the game installed on my main M.2 SSD, and I get around 120 FPS at 2K with mostly very high settings and DLSS set to Quality. If I turn some settings down I can easily get over 200 FPS.
When I first installed the game I had issues until I restarted my computer; then it ran flawlessly.
TimmyP Jul 27, 2023 @ 7:15am 
5800x3d
3070
32gb

Runs great here: ~50 FPS, no stutters. Absolute highest settings, RT maxed, 1440p DLSS Performance.

The one problem I've seen so far is shader compilation.
Last edited by TimmyP; Jul 27, 2023 @ 7:15am
BEEP! Jul 27, 2023 @ 7:43am 
It seems like an Nvidia card problem. With an RX 6800 and a 5800X3D at 3440x1440, playing max settings with RT and no upscaling, it was usually 80-100 FPS. It ran kind of badly for the first tutorial mission, but once it got to the open-ended content on the first planet it shot up to the high 80s-100s on average.

I did notice that if you close the game and reopen it, you'll get better performance. When it got to the second planet (the jungle planet) it was in the high 60s; then I ran around for a couple of minutes, restarted the game, and it was almost in the 80s.

So on my system it ran rather nicely. For my roommate, whose rig is basically identical but with a 3080 10 GB, it struggled a lot more, even without RT. It seems like a VRAM thing: they were constantly close to maxing out VRAM and performance was suffering, while I had no such problem on my RX 6800. So yeah, AMD seems to be doing better here, probably because AMD cards on average have more spare VRAM than the average Nvidia card. If you're on an Nvidia card, you may want to wait for a patch or a new set of drivers.

P.S. The cards I really see struggling are all cards under 12 GB of VRAM.
Last edited by BEEP!; Jul 27, 2023 @ 7:46am
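For what it's worth, you can watch your own VRAM headroom while playing with a couple of lines. A minimal sketch, assuming the nvidia-ml-py package:

    # Quick VRAM headroom check (assumes: pip install nvidia-ml-py)
    import pynvml

    pynvml.nvmlInit()
    mem = pynvml.nvmlDeviceGetMemoryInfo(pynvml.nvmlDeviceGetHandleByIndex(0))
    print(f"VRAM used: {mem.used / 2**30:.1f} of {mem.total / 2**30:.1f} GiB")
    pynvml.nvmlShutdown()

If that number sits near your card's total while performance tanks, the under-12 GB theory above would fit.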
ELMatadario Jul 27, 2023 @ 7:47am 
Originally posted by TimmyP:
Runs great here: ~50 FPS, no stutters. Absolute highest settings, RT maxed, 1440p DLSS Performance. [...]
"runs great" , "50 fps" "dlss performance" ...bro..are you for real? do you realize you are upscaling from sub 720p with dlss performance? and still no 60 fps..you all have absolute trash expectations of your hardware expecially when other games which look the same or even better, like rdr2 or the other spiderman games, run so much better.
Melcor Jul 27, 2023 @ 10:35am 
Originally posted by ELMatadario:
"Runs great", "50 FPS", "DLSS Performance"... bro, are you for real? [...]

From what I've seen in tests, people seem to get the same performance no matter the DLSS setting, so his 3070 would get the same framerate with DLSS Quality. What's very weird is that a 3070 can draw about 210 watts, but in this game it only draws around 140.
J0ust Jul 27, 2023 @ 10:41am 
Originally posted by itzDerrio:
$ony poorly optimizes their games on purpose to sway buyers toward the PS5 as the better platform, so that we can all be coerced into paying for their monthly BS Plus subscription.
Hey, Einstein, you know Sony had absolutely nothing to do with this port, right? Come back in a few years when you're older and wiser.
ELMatadario Jul 27, 2023 @ 10:53am 
Originally posted by Melcor:
From what I've seen in tests, people seem to get the same performance no matter the DLSS setting, so his 3070 would get the same framerate with DLSS Quality. What's very weird is that a 3070 can draw about 210 watts, but in this game it only draws around 140.
Same here: my RTX 2080 doesn't draw many watts and doesn't get as hot as in other games, even at 99% utilization. Something is definitely wrong. IMHO, the decompression used by DirectStorage is "stealing" some performance from the graphics engine and the rendering itself, and since decompression is probably less taxing on the GPU cores, that's what we're seeing.
Last edited by ELMatadario; Jul 27, 2023 @ 10:53am
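To see the same thing on your own card, you can sample the reported board power against its limit while playing. A minimal sketch, again assuming nvidia-ml-py:

    # Sample GPU board power draw vs. its limit (assumes: pip install nvidia-ml-py)
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # mW -> W
    for _ in range(10):
        draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000       # mW -> W
        print(f"{draw_w:6.1f} W of {limit_w:.0f} W limit")
        time.sleep(1)
    pynvml.nvmlShutdown()

A card reporting 99% utilization while sitting well under its power limit is consistent with that theory: "utilization" only means the GPU had work queued, not that its shader cores were saturated.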
TimmyP Jul 27, 2023 @ 11:06am 
SORRY! Game is STUNNING and smooth across the VRR range. Highest settings, RT all on, 1440p DLSS Performance.

You could not tell a ♥♥♥♥♥♥♥ difference. DLSS Performance is integer scaling, and 720p is HD. Consider the same-size monitor, and understand that 1080p DLSS Quality <= 1440p DLSS Performance, because they have the same base resolution.

You are just another ****** who can't understand that DLSS is almost a complete waste at 1080p, and that you are not really using the hardware. Derr, oh yeah, 720p base; want some screenshots so I can make you look like a true fool? Absolutely insane. GET OUT OF THIS HOBBY.

*I don't even see a ♥♥♥♥♥♥♥ chug. HIGHEST POSSIBLE!
Last edited by TimmyP; Jul 27, 2023 @ 11:07am
ELMatadario Jul 27, 2023 @ 11:10am 
Originally posted by TimmyP:
SORRY! Game is STUNNING and smooth across the VRR range. Highest settings, RT all on, 1440p DLSS Performance. [...]
That's what I said, though? Can you even read? DLSS Quality at 1080p looks like ass, and so does 1440p DLSS Performance. Every ♥♥♥♥♥♥♥ person on this planet (including multiple reviewers and tech channels) except you agrees that DLSS Quality at 1440p, and Quality/Balanced at 4K, are the only acceptable options before the image starts losing a bunch of crispness.
I'm fine with you playing at those settings; I'm not fine with you saying 1440p DLSS Performance looks the ♥♥♥♥♥♥♥ same as DLSS Quality at 1440p. Get your ♥♥♥♥ together and move the camera instead of looking at still shots.

Date Posted: Jul 27, 2023 @ 3:25am
Posts: 63