Cyberpunk 2077

Best settings to turn down? performance vs visual issues thread
For performance, which settings should I turn down? Which ones have the biggest impact on performance, and which don't give more than a minimal visual improvement?

Ironically, for the first time since the pandemic started, I'm actually having to think about these things rather than blindly cranking everything to ultra. I'm going to have to do the same for Far Cry 5 if I want to play it on mobile.
I mean, I suppose I could just not play at 1440p and drop down to 1080p, or I could use an upscaler, but I really don't want to use an upscaler. I'm getting about 42 fps on ultra, which is more or less doable (I wouldn't even have noticed if I hadn't turned an fps counter on, and it's pretty much console quality or better anyway), but I could stand to gain another 10 fps for better playability, and I'm betting I'll get drops down into the low 30s once I'm in a firefight on sufficiently crowded streets. How much does turning down crowd size help performance versus making the city less lifelike? What do you guys with 1060s, 1070s, and non-Super 2060s use? Maybe I can try bearing with FSR; in all honesty it's really not *that* bad, I just don't like upscaling degrading 100% of the game's rendering.

I'm pretty shocked at how badly this runs, although in fairness Watch Dogs: Legion also runs poorly, just not this bad. Then again, Deus Ex: Mankind Divided looked worse, had smaller and much less busy hubs, and still struggled to hold 60+ fps, so in a lot of ways I can't help but mentally compare this to DE:MD too. It looks excellent, but not so superb that I can see why a bar makes a 5700XT struggle. In fairness, games are getting more demanding these days; I think Total Warhammer 3 is supposed to run like this too, 40s fps or worse.

I'm also shocked that it only uses about 7 GB of VRAM at 1440p ultra. CDPR really deserves credit for consistently doing this so their games can run on older Nvidia cards in particular. The way it looks, I'd expect it to take 11 GB+.
Showing 1-9 of 9 comments
egg fu Apr 22, 2023 @ 2:54am 
screen space reflection has the biggest impact on performance; turning it down to high or medium will give you serious fps, especially going from ultra to high with barely any quality loss. you can try turning local shadow quality, volumetric fog, and ambient occlusion down to medium as well, which gives an extra 3-5 fps in my experience. everything else can stay on ultra. no rt, of course.

without any upscaler, on these settings i had roughly a 70% increase in fps: went from a 35 avg to about 60 at 1440p on a 3060 ti, no rt.

also, on the topic of upscalers: dlss excluded, i'd much prefer xess over fsr if i had the choice. fsr in this game always had that blur that annoyed me, not to mention the awkward shimmering even on quality. xess is just better in those areas, at least in my few minutes of testing. but.... if i had a 4090 or 4080 i'd be using dlaa over everything else, it's just too good not to have enabled.
Bishop-Six Apr 22, 2023 @ 3:32am 
Imo crowd density on low is already fine and playable.

I play on medium crowd and it fits perfectly for me. Sometimes in some areas there's even too much traffic / too many NPCs.

Also, it's worth having a look at HDD Mode. Even if you're playing on an SSD, activating it can help with streaming issues on medium / low end PCs. But that comes at the cost of NPC diversity; you're more likely to see some doubles.

But the performance improvement is worth it imo (for medium / low end PCs).
Moldon Apr 22, 2023 @ 3:45am 
1070 Ti, 1200p, settings on high, 35-65 fps. Honestly, I don't find the game very impressive graphics-wise, I guess because of the lack of shiny things like ray tracing, but at the same time performance is very good, so I can't complain.
Last edited by Moldon; Apr 22, 2023 @ 3:46am
Zebedee Apr 22, 2023 @ 4:30am 
in fairness, games are getting more demanding these days; I think Total Warhammer 3 is supposed to run like this too, 40s fps or worse

It's sadly just like that for older GPUs. 10xx cards got hit hard by the performance gap at launch (I was running a 1080 at 1440p, high settings, ~40 fps). Once you move up into the 30xx and 6xxx series, performance is fine at 1440p, and upscaling gives you a trade-off for RT. The game's a tech willy wave, which bit them on the ass at launch on older hardware.
GravityWave Apr 22, 2023 @ 6:06am 
You could try dropping the frame rate limit to, say, 30 fps. I usually run at 60 fps rate limited, but it helps a lot at 30.
SK-HYJINX Apr 22, 2023 @ 6:37am 
XeSS is pretty neat. Its highest quality option renders from a higher baseline resolution than the other upscalers' quality modes.
Example....
75% of Native Res is used for Ultra Quality XeSS
66.66% of Native Res is used for Quality Nvidia DLSS & Quality AMD FSR

If you have GPU horsepower left over, try XeSS at 75% scale instead of DLSS/FSR at 66.66%.
It still gives a boost over native, of course, and looks pretty great too.
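For reference, here's some back-of-the-envelope math (my own, not from the game) showing the internal render resolutions those scale factors imply at a 1440p output:

```python
# Internal render resolution for each upscaler quality mode at a
# 2560x1440 output. The per-axis scale factors are the ones quoted
# above; nothing here queries the game itself.
NATIVE = (2560, 1440)

SCALES = {
    "XeSS Ultra Quality": 0.75,
    "DLSS Quality": 2 / 3,
    "FSR Quality": 2 / 3,
}

for mode, scale in SCALES.items():
    w, h = (round(d * scale) for d in NATIVE)
    pixel_pct = scale * scale * 100  # pixel count scales with the square
    print(f"{mode}: {w}x{h} (~{pixel_pct:.0f}% of native pixels)")
```

At 1440p that works out to 1920x1080 for XeSS Ultra Quality versus about 1707x960 for DLSS/FSR Quality, i.e. roughly 56% versus 44% of the native pixel count, which is why the 75% mode starts from a visibly higher baseline.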
Last edited by SK-HYJINX; Apr 22, 2023 @ 6:38am
Originally posted by GravityWave:
You could try dropping the frame rate limit to, say, 30 fps. I usually run at 60 fps rate limited, but it helps a lot at 30.
Wait, what? Why would I even want to do that? It kills like 12 fps, which is roughly how much I'd want to gain, although admittedly I can't even tell the difference yet; I just think 75 fps would be slightly smoother. It's hard to explain, but I do notice the difference between machines in The Witcher, since one is literally double the performance, so it feels smoother, though I still find 38 fps perfectly playable. It's just funny to me that I get the same ballpark performance on two different machines for two different eras of CDPR games.



Originally posted by SK-HYJINX:
XeSS is pretty neat. Its highest quality option renders from a higher baseline resolution than the other upscalers' quality modes.
Example....
75% of Native Res is used for Ultra Quality XeSS
66.66% of Native Res is used for Quality Nvidia DLSS & Quality AMD FSR

If you have GPU horsepower left over, try XeSS at 75% scale instead of DLSS/FSR at 66.66%.
It still gives a boost over native, of course, and looks pretty great too.
So people keep talking about XeSS. Is it really that decent, and did that many people actually switch to Arc? I heard it was mad buggy at launch (quite rich for a 5700XT owner to say, but it's true; I got my GPU after the bugs were fixed). I kind of assumed not many people were on it, but it's starting to seem like there are plenty of A750 and A770 owners out there. I say this because obviously I thought the XeSS toggle would be disabled for me like DLSS.


Originally posted by O.Gefr.Löring.363VD:
Imo crowd density on low is already fine and playable.

I play on medium crowd and it fits perfectly for me. Sometimes in some areas there's even too much traffic / too many NPCs.

Also, it's worth having a look at HDD Mode. Even if you're playing on an SSD, activating it can help with streaming issues on medium / low end PCs. But that comes at the cost of NPC diversity; you're more likely to see some doubles.

But the performance improvement is worth it imo (for medium / low end PCs).
Interesting point. And that's right, I'd technically be on a lower-end machine at this point, probably something like an RTX 4050 or 4050 Ti in performance terms. It's been a good ride. I wouldn't have thought HDD Mode would help much. Crowd density was also something I was eyeing as a possible fix, since I've pretty much anticipated drops as low as 25 in crowded enough hubs.



Originally posted by Zebedee:
in fairness, games are getting more demanding these days; I think Total Warhammer 3 is supposed to run like this too, 40s fps or worse

It's sadly just like that for older GPUs. 10xx cards got hit hard by the performance gap at launch (I was running a 1080 at 1440p, high settings, ~40 fps). Once you move up into the 30xx and 6xxx series, performance is fine at 1440p, and upscaling gives you a trade-off for RT. The game's a tech willy wave, which bit them on the ass at launch on older hardware.
Yeah, it's like I'm finally noticing, partly just because I run on Russia time, i.e. my clock is set about 2 to 10 years behind everybody else's, because I usually don't early-adopt anything, waiting for bug fixes and testing and so on. I literally did this ♥♥♥♥ with everything, even the vaccines: since they gave them to medical staff first and I got put in a later tier in the queue, I figured they weren't going to poison doctors with something dodgy, and I got 6+ months of watching other people take it first to find out if there were side effects (other than J&J, there weren't). So yeah, I'll buy an Arc or a 3080 or a 5700XT, just not before watching a first wave go first to find out if maybe smoking the spice is a bad idea.

It means I have about a 2-year delay on nearly all games, the sole exception being the early access betas plenty of indie studios release that languish in EA for years, which I usually only get because I want to support the developers and actually help get the game itself made. I watch reviews, so I feel like I mostly haven't gotten burned on this. I can understand doing it for a small indie studio, but I have no idea why anyone would preorder AAA games. It's not like they're going to run out of inventory at the digital store and be all "sorry, we're all sold out, sold muh last copy of Cyberpunk last night and we aren't getting another shipment of Steam keys 'til next Friday."

So really, this whole time I hadn't even realized I had a problem brewing, especially since I had no interest in most AAA games, although frankly slogging through enough lukewarm indie stuff has made me miss truly polished AAA experiences. There's something more slick and premium about games like The Witcher 3 that doesn't feel like as much of a waste of time, often reflected worst in the writing, though some of the better-written games I played last decade were two from inXile, not triple-A. I knew TW3 was going to run like ♥♥♥♥ because in general Total Warhammer games run like ♥♥♥♥ on AMD, and I didn't care about Watch Dogs: Legion but ended up getting it anyway, so idk, maybe I'll do a cyberpunk summer and finish The Ascent too. It's literally been since 2016 the last time there was a big enough jump in game development for me to notice my hardware aging, though this time it's because I spent the entire pandemic leaving ultra settings on by default at 1440p, and it's been great. I do feel like my not-so-high-end panel is blurrier af in Cyberpunk though, mainly in motion, so I'm less inclined to use FSR or any upscaling when I'm already having smear and blur problems with it off. Thanks for your guys' time.

Originally posted by abolishbananas:
screen space reflection has the biggest impact on performance; turning it down to high or medium will give you serious fps, especially going from ultra to high with barely any quality loss. you can try turning local shadow quality, volumetric fog, and ambient occlusion down to medium as well, which gives an extra 3-5 fps in my experience. everything else can stay on ultra. no rt, of course.

without any upscaler, on these settings i had roughly a 70% increase in fps: went from a 35 avg to about 60 at 1440p on a 3060 ti, no rt.

also, on the topic of upscalers: dlss excluded, i'd much prefer xess over fsr if i had the choice. fsr in this game always had that blur that annoyed me, not to mention the awkward shimmering even on quality. xess is just better in those areas, at least in my few minutes of testing. but.... if i had a 4090 or 4080 i'd be using dlaa over everything else, it's just too good not to have enabled.
This in particular is super helpful.
Zebedee Apr 23, 2023 @ 12:34am 
Originally posted by Red Star, Blood Moon:
So really, this whole time I hadn't even realized I had a problem brewing, especially since I had no interest in most AAA games, although frankly slogging through enough lukewarm indie stuff has made me miss truly polished AAA experiences. There's something more slick and premium about games like The Witcher 3 that doesn't feel like as much of a waste of time, often reflected worst in the writing, though some of the better-written games I played last decade were two from inXile, not triple-A. I knew TW3 was going to run like ♥♥♥♥ because in general Total Warhammer games run like ♥♥♥♥ on AMD, and I didn't care about Watch Dogs: Legion but ended up getting it anyway, so idk, maybe I'll do a cyberpunk summer and finish The Ascent too. It's literally been since 2016 the last time there was a big enough jump in game development for me to notice my hardware aging, though this time it's because I spent the entire pandemic leaving ultra settings on by default at 1440p, and it's been great. I do feel like my not-so-high-end panel is blurrier af in Cyberpunk though, mainly in motion, so I'm less inclined to use FSR or any upscaling when I'm already having smear and blur problems with it off. Thanks for your guys' time.

It's understandable. I think for this game in particular, the higher settings were always intended to push the envelope a bit, on top of a game built around a dense urban environment with minimal loading screens. Which would be fine, but for them also wanting it to run 'well' on 8-year-old hardware (old-gen consoles; the original min-spec GPU was a 780, it's a 970 now). On the flip side, the game can be tailored to hardware within the min specs.

One thing to keep in mind is that the game engine will crap the bed if settings aren't in line with your hardware, so if you ever start seeing lots of visual jank (e.g. animations not happening), dial the settings back a bit more and see if it reduces the frequency of things like T-posing. Good luck with it.
feetlover16 Apr 23, 2023 @ 11:40am 
https://www.youtube.com/watch?v=pC25ambD8vs&t=827s
Old video, but it still checks out. Personally I play at high with FSR 2 quality, a 30 fps lock, and vsync off on my 1650, and with motion blur and Reflex it feels quite smooth. Also, I assume the game doesn't use that much VRAM because of inconsistent texture quality, even at max settings.

Date Posted: Apr 22, 2023 @ 2:18am
Posts: 9