Cyberpunk 2077

NetshadeX Dec 6, 2023 @ 12:17am
I'd argue no PC tech out today can run Cyberpunk 2077 to its artistic potential.
First and foremost I have to compliment the devs for their continued hard work on this game and the supporting tech to improve the graphical fidelity. The improvements to ray reconstruction in the 2.1 update are impressive!

That said, Cyberpunk 2077 is one of those games whose raw computational demand no tech out today can truly meet.

The reason is that the art direction pairs intricate detail with futuristic designs featuring harsh angular layouts and fine-lined patterns. On top of that, it's all covered in state-of-the-art diffuse bounce lighting. This is why an increase in resolution hits this game so hard performance-wise.

To make all that detail really work, the game NEEDS 4K output at very high refresh rates, which is something no consumer hardware out there today can achieve. Dropping down to even 1440p native makes you lose a noticeable amount of detail in NPC faces, for instance.
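A quick back-of-envelope sketch of why that target is so brutal: raw pixel throughput scales with width x height x refresh rate. The resolutions and frame rates below are just illustrative numbers, not benchmarks:

```python
# Rough pixel-throughput comparison between display targets.
def pixels_per_second(width, height, fps):
    """Raw pixels the GPU must produce each second at a given output."""
    return width * height * fps

uhd_120 = pixels_per_second(3840, 2160, 120)  # 4K at 120 Hz
qhd_60 = pixels_per_second(2560, 1440, 60)    # 1440p at 60 Hz

# 4K@120 demands 4.5x the pixel output of 1440p@60
print(f"4K@120 is {uhd_120 / qhd_60:.1f}x the pixel load of 1440p@60")
```

And that ratio ignores that ray-traced lighting cost also climbs per pixel, so the real gap is even wider.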

To make up for that, we're seeing software solutions that somewhat approach that goal, but even though they're getting increasingly impressive, they all introduce noticeable noise to the image: a grainy look, unstable geometry, or that infamous oil-painting look.

The only way today to achieve an actually "clean" look in this game, even on a monster card like the 4090, is to run it at native 4K at 60+ fps with its RT features disabled.

THAT's possible today, but I think we're at least 2 hardware generations away from seeing this game truly shine.
Last edited by NetshadeX; Dec 6, 2023 @ 12:21am
Showing 1-12 of 12 comments
solamon77 Dec 6, 2023 @ 12:42am 
Yeah. That's not uncommon with CDPR. I remember when Witcher 2 came out I was thinking the exact same thing. I should replay that game now that I have the rig to run it.
QbeX Dec 6, 2023 @ 1:08am 
It will be glorious to come back to this game with an RTX 6090. Right now, the DLSS and frame-gen band-aid is the way to go.
Karax Dec 6, 2023 @ 3:30am 
Originally posted by solamon77:
Yeah. That's not uncommon with CDPR. I remember when Witcher 2 came out I was thinking the exact same thing. I should replay that game now that I have the rig to run it.
Yeah... I remember, damn, that game was such a mess to run.
Their own engine is just ass... That's why they're going to use Unreal Engine to build the next game.
solamon77 Dec 6, 2023 @ 3:36am 
Unreal isn't much better for us PC guys. They still haven't figured out how to stop the constant hitching.
IDK, an RTX 4080 and a decent CPU can run it fairly well if you turn on frame gen. Even my trash-panda R5 3600 CPU runs it semi-well, with a few audio bugs here and there, when that level of GPU offsets the problem.
Originally posted by Simp Slayer:
Their own engine is just ass... That's why they're going to use Unreal Engine to build the next game

I think it's less that the engine isn't ideal (I mean, it works, 100%) and more a question of why they would develop REDengine further, with those insane costs piling up along with growing pains and technical debt, when they could just license UE5, which does everything they need better anyhow.

It'll probably save them millions and thousands of engine-development hours. REDengine is probably sticking around for their A/AA titles like Gwent, though. There's no need for a better engine for stuff like that, since you could get Gwent to run on a toaster; it's a CCG with low spec requirements.
Last edited by Bad Distraction Carnifex; Dec 6, 2023 @ 3:54am
sheap Dec 6, 2023 @ 3:58am 
And this is why I think it's not such a good move to buy a 16:9 screen with 4K resolution. I'd much prefer going for a lower-resolution ultrawide with G-Sync and trying to hit 120 fps, rather than needing something beefy enough to run that at 4K.

Might be because I'm old with not-so-good eyes, but at 50 cm from the screen at 2560x1080, I really have to make an effort to see pixels.
Last edited by sheap; Dec 6, 2023 @ 4:01am
pagb666 Dec 6, 2023 @ 4:22am 
Just before 2.1 I replayed the game with my 4070, 1440p, overdrive, quality DLSS and frame generation. No noticeable frame drops. So yes, maybe we'll need future hardware to play without reconstruction features, but it doesn't mean you can't play it today, and it still looks out of this world.
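For context on what "quality DLSS" buys you in render cost, here's a rough sketch assuming the commonly cited ~2/3 per-axis scale factor for DLSS Quality mode (an approximation on my part, not an official constant):

```python
# Sketch of DLSS Quality mode's internal render resolution,
# assuming a ~2/3 per-axis scale factor (approximate, unofficial).
QUALITY_SCALE = 2 / 3

def internal_resolution(out_w, out_h, scale=QUALITY_SCALE):
    """Resolution the GPU actually renders before AI upscaling."""
    return round(out_w * scale), round(out_h * scale)

w, h = internal_resolution(2560, 1440)
print(f"1440p Quality renders internally at ~{w}x{h}")
print(f"shaded-pixel savings vs native: {1 - (w * h) / (2560 * 1440):.0%}")
```

Roughly half the pixels get shaded natively, which is why path tracing becomes playable at all on a 4070.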

Witcher 3 and 2 also had graphic detail settings for future hardware though, so it's nothing new.
Graf Erik Dec 6, 2023 @ 4:38am 
I doubt that there's any engine that would get you better performance. The game is simply massive. So much detail, so many NPCs, such an incredibly huge and packed map.

I personally am perfectly satisfied with the performance and the look & feel on my 3080. No complaints.

But yes, if you really want to max everything and run at a stable 60 fps, then you'll need hardware that's probably not available right now. And that's fine with me.

In Kingdom Come: Deliverance (which uses CryEngine), when you switch the graphics to max you get a warning: "This setting is intended for the hardware of the future". And that's pretty cool, since it means that when you come back to the game after a few years, it'll look even better.
NetshadeX Dec 6, 2023 @ 8:02am 
Originally posted by pagb666:
Just before 2.1 I replayed the game with my 4070, 1440p, overdrive, quality DLSS and frame generation. No noticeable frame drops. So yes, maybe we'll need future hardware to play without reconstruction features, but it doesn't mean you can't play it today, and it still looks out of this world.

Witcher 3 and 2 also had graphic detail settings for future hardware though, so it's nothing new.

Absolutely. In fact, there are plenty of games out right now that struggle on today's hardware, like Alan Wake 2, or indeed The Witcher games back in the day. The difference is you can play Alan Wake 2 or The Witcher at 1440p or even 1080p without losing much in terms of presentation or detail. Their art direction doesn't demand the same level of fidelity that Cyberpunk does, mostly because the environments in AW2 and The Witcher are more organic and therefore more forgiving of imperfections. That's also why those games work at 30 fps: the added "muddiness" of the image isn't overly distracting there.

Anyway... I just can't wait for CP2077 to be playable at native 4K/120 fps so its art style is truly done justice. No shimmering, ghosting, artifacts, grain, or distortion. Just clean lines and fine detail bringing Night City to full graphical glory. ;)

Date Posted: Dec 6, 2023 @ 12:17am
Posts: 12