Cyberpunk 2077

Think I figured out why its performance is so terrible (VRAM)
It has something to do with VRAM, at least partly. I knew there was some kind of texture-loading problem, because in big city areas there'd be these random weird blurry textures on billboards and vending machines, and I suspected that might have something to do with the VRAM. But what I didn't know was that it was tanking my performance too.

I just quit and reloaded the game, because it sometimes gets stuck in a loop where my normal 38 fps at 1440p drops to the mid-20s. I'm not sure why it does this (toggling FSR on and off is one of the triggers), but I can fix it by quitting the game and reloading. What I hadn't realized is that it improved my fps generally too. I was just leaving an area at 7880 MB of VRAM utilization, apparently with an extra 2 GB of shared memory on top of that; in other words, I think Cyberpunk was using something like 10 GB of memory at 1440p. The problem, I guess, is that it was not unloading textures from memory: whereas I normally get 38-42 fps, after reloading the game I'm getting 45 fps and it's only using 5700 MB of VRAM right now. I'm not sure why it does this. Is that normal? Or did the devs just screw something up, so that memory that should be unloaded from video memory never is?
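
For anyone who wants to watch this happen themselves, here's a minimal polling sketch using NVIDIA's NVML Python bindings (pip install nvidia-ml-py). It's purely illustrative, not anything from the game, and NVML only reports dedicated VRAM; the "shared memory" figure I mentioned comes from Windows Task Manager, which NVML doesn't expose.

```python
# Minimal dedicated-VRAM monitor via NVIDIA's NVML Python bindings
# (pip install nvidia-ml-py). The "shared" GPU memory that Windows
# Task Manager shows is not exposed through NVML, so it isn't here.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**20:,.0f} MB "
              f"of {mem.total / 2**20:,.0f} MB")
        time.sleep(5)  # poll every five seconds
finally:
    nvmlShutdown()
```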

Either way, I have an 8 GB graphics card, and I think this is literally a 10 GB-minimum game at this point, at least on ultra settings with ray tracing off. I'd imagine this game easily uses 10-12 GB at 1440p with RT on, and the problem is that if you don't have enough VRAM, you get these huge, noticeable 1% low drops; what I didn't know is that it also lowers the overall fps once the VRAM is used up. I suspect this was partly behind the terrible launch-day bugs with muddy textures, though it's hard to tell what's still a vestige of the game being broken versus working as intended. Basically, you need a 16 GB graphics card to comfortably play this game at 4K with ray tracing, and at minimum a 10-12 GB GPU to play at ultra settings with RT off at 1440p. Probably the only reason it's under 6 GB right now is that I reloaded the game in the middle of the desert in a sandstorm. The blurry textures are a daily occurrence, but I didn't realize I'd get better fps solely by clearing VRAM.
egg fu May 12, 2023 @ 7:57pm 
The game uses A LOT of VRAM. At 1440p with 8 GB, I can't use RT without decreasing texture quality to medium or turning RT off outright, or else VRAM usage will reach 7.8 GB, which leads to really annoying issues like menu lag, not even mentioning degraded performance.

Next time I upgrade, I'm getting something with 12 GB minimum. 8 GB cards are perfect for 1080p though, I think.
Originally posted by spacefish:
The game uses A LOT of VRAM. At 1440p with 8 GB, I can't use RT without decreasing texture quality to medium or turning RT off outright, or else VRAM usage will reach 7.8 GB, which leads to really annoying issues like menu lag, not even mentioning degraded performance.

Next time I upgrade, I'm getting something with 12 GB minimum. 8 GB cards are perfect for 1080p though, I think.
Honestly, for 1440p I wouldn't consider anything below 12 GB remotely acceptable for ray tracing; 12 GB is clearly the absolute minimum for RT right now, which means you're probably going to need 16 GB minimum in the near future. I'd mistakenly assumed CDPR was still optimizing their games for nVidia graphics cards, which have less VRAM than AMD's, so I figured my 8 GB card would be fine. I had no idea that even at 1440p with ray tracing off it was going to start swapping to system memory. I also didn't realize it'd tank my fps that much just by needing an additional 2 GB.
Psyringe May 13, 2023 @ 1:12am 
Originally posted by Red Star, Blood Moon:
I had no idea that even at 1440p with ray tracing off it was going to start swapping to system memory.
Resolution is usually not that big of a factor for VRAM usage - while the frame buffer does need more memory if it has to store more pixels, the size of the frame buffer(s) is (even at 4K) still much smaller than the memory needed for textures. High-fidelity textures with several layers (materials etc.) need a _lot_ of memory. That said, things can differ between games, and it's possible that resolution is a bigger factor for this one.
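
To put rough numbers on that, here's a back-of-envelope sketch (assuming RGBA8 color buffers and triple buffering; real engines allocate many more intermediate targets, so treat these as floor values):

```python
# Back-of-envelope frame-buffer vs. texture memory, assuming RGBA8
# color buffers and triple buffering; real engines use far more
# intermediate targets (depth, G-buffer, post-processing, etc.).
BYTES_PER_PIXEL = 4  # RGBA, 8 bits per channel

def framebuffer_mb(width, height, buffers=3):
    return width * height * BYTES_PER_PIXEL * buffers / 2**20

print(f"1440p color buffers: {framebuffer_mb(2560, 1440):.0f} MB")  # ~42 MB
print(f"4K color buffers:    {framebuffer_mb(3840, 2160):.0f} MB")  # ~95 MB

# Compare: one 4096x4096 BC7-compressed texture is ~16 MB (1 byte/texel),
# and a dense city scene keeps hundreds of textures resident at once.
print(f"One 4K BC7 texture:  {4096 * 4096 * 1 / 2**20:.0f} MB")
```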

Raytracing is indeed very VRAM-hungry too. Ironically, some performance-enhancing techniques (like DLSS 3 frame generation) can also end up _reducing_ performance in certain cases, because they increase the VRAM requirements.

Originally posted by Red Star, Blood Moon:
I also didn't realize it'd tank my fps that much just by needing an additional 2 GB.
For a graphics card, fetching data from system memory is a very slow process, compared to having the data in VRAM already. In the past, when someone had low frame rates, the question was usually whether the CPU or the GPU was the bottleneck. Nowadays (and starting with this year in particular), we're seeing situations where neither the CPU nor the GPU can work at full capacity because the bottleneck is the I/O process of shifting data back and forth between VRAM and system RAM. (And we're also seeing cases where a game's graphics engine itself is the bottleneck, but that seems unrelated to Cyberpunk).
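
As a rough illustration of that gap (ballpark theoretical bandwidth figures, not measurements):

```python
# Why spilling into system RAM hurts: rough transfer-time comparison.
# Bandwidth numbers are ballpark theoretical peaks, not measurements.
GDDR6_GBPS = 448  # e.g. a 256-bit GDDR6 card such as the 3070
PCIE4_GBPS = 32   # PCIe 4.0 x16 peak, the path to system RAM
SPILL_GB   = 2    # roughly what the OP saw land in shared memory

print(f"2 GB from VRAM:     {SPILL_GB / GDDR6_GBPS * 1e3:5.1f} ms")  # ~4.5
print(f"2 GB over PCIe 4.0: {SPILL_GB / PCIE4_GBPS * 1e3:5.1f} ms")  # ~62.5
# At 40 fps the whole frame budget is only 25 ms, so touching spilled
# data even part of the time is enough to drag the frame rate down.
```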

In theory, the situation might improve once we see more games using DirectStorage, which can swap data between VRAM and an NVMe SSD very quickly. But that technology will also require more VRAM (since the GPU will have to do the texture decompression), so we'll have to see.

I agree that 12 GB is the minimum if someone wants to play modern triple-A games at ultra settings, even at 1080p. Case in point: a team of crafty Brazilians created a "homebrew" 3070 with 16 GB of memory, and the performance gain compared to the regular 8 GB variant in modern triple-A games was substantial.
Last edited by Psyringe; May 13, 2023 @ 1:13am
NVY May 13, 2023 @ 4:02am 
If you have 12 GB of VRAM, you shouldn't have any problems playing it at 2K resolution with the maximum level of ray tracing. At 4K, it almost reaches 11.8 GB of VRAM usage in my case.
Originally posted by Psyringe:
Originally posted by Red Star, Blood Moon:
I had no idea that even at 1440p with ray tracing off it was going to start swapping to system memory.
Resolution is usually not that big of a factor for VRAM usage - while the frame buffer does need more memory if it has to store more pixels, the size of the frame buffer(s) is (even at 4K) still much smaller than the memory needed for textures. High-fidelity textures with several layers (materials etc.) need a _lot_ of memory. That said, things can differ between games, and it's possible that resolution is a bigger factor for this one.

Raytracing is indeed very VRAM-hungry too. Ironically, some performance-enhancing techniques (like DLSS 3 frame generation) can also end up _reducing_ performance in certain cases, because they increase the VRAM requirements.

Originally posted by Red Star, Blood Moon:
I also didn't realize it'd tank my fps that much just by needing an additional 2 GB.
For a graphics card, fetching data from system memory is a very slow process, compared to having the data in VRAM already. In the past, when someone had low frame rates, the question was usually whether the CPU or the GPU was the bottleneck. Nowadays (and starting with this year in particular), we're seeing situations where neither the CPU nor the GPU can work at full capacity because the bottleneck is the I/O process of shifting data back and forth between VRAM and system RAM. (And we're also seeing cases where a game's graphics engine itself is the bottleneck, but that seems unrelated to Cyberpunk).

In theory, the situation might improve once we see more games using DirectStorage, which can swap data between VRAM and an NVMe SSD very quickly. But that technology will also require more VRAM (since the GPU will have to do the texture decompression), so we'll have to see.

I agree that 12 GB is the minimum if someone wants to play modern triple-A games at ultra settings, even at 1080p. Case in point: a team of crafty Brazilians created a "homebrew" 3070 with 16 GB of memory, and the performance gain compared to the regular 8 GB variant in modern triple-A games was substantial.
1. increase prices, decrease value
2. push memes and gimmicks over performance
3. proceed to cripple your ability to use those proprietary features
4. ?????
5. profit!
God, it's just so easy to make fun of that company. I had no idea frame generation also used a lot of VRAM. It's just mind-boggling to me: if you really cared about those features, you'd demand more VRAM, yet somehow the primary market segment ends up being people saddled with totally crippled 70-class cards. I'd heard about that card, and it amazes me that the 3070 could probably have been a great GPU if it just had more VRAM; instead it's going to go down in history as a GTX 770 2 GB at best. It's so weird trying to imagine how this tactic actually makes them money when nVidia is clearly releasing inferior products because of it. Maybe one of my assumptions is wrong, because I thought they tried to sell out their old-gen cards first and hated the used market, so isn't selling your 70-class card as better than a 2080 Ti supposed to be a benefit?

It just amazes me how much of a problem this is in Cyberpunk of all games. I truly expected this game to be well optimized for textures, because they wanted to sell 3070s, and I just cannot imagine using one smoothly with ray tracing on. Maybe they were only trying to upsell the 10 GB 3080? Because it feels like anything below a 3080 is definitely inadequate for RT in this game.

I also have DDR4, and I guess I was expecting more stutters when hitting that limit, though in fairness maybe my crash-to-desktop issue (over a dozen crashes so far) is tied to running out of VRAM and hitting system memory. I know system RAM is still slower, but I wonder if DDR5 is more tolerable when swapping, unless the actual I/O bandwidth is the limit.
jacklonder Jul 20, 2023 @ 6:56am 
Originally posted by Red Star, Blood Moon:
It has something to do with VRAM, at least partly. I knew there was some kind of texture-loading problem, because in big city areas there'd be these random weird blurry textures on billboards and vending machines, and I suspected that might have something to do with the VRAM. But what I didn't know was that it was tanking my performance too.

I just quit and reloaded the game, because it sometimes gets stuck in a loop where my normal 38 fps at 1440p drops to the mid-20s. I'm not sure why it does this (toggling FSR on and off is one of the triggers), but I can fix it by quitting the game and reloading. What I hadn't realized is that it improved my fps generally too. I was just leaving an area at 7880 MB of VRAM utilization, apparently with an extra 2 GB of shared memory on top of that; in other words, I think Cyberpunk was using something like 10 GB of memory at 1440p. The problem, I guess, is that it was not unloading textures from memory: whereas I normally get 38-42 fps, after reloading the game I'm getting 45 fps and it's only using 5700 MB of VRAM right now. I'm not sure why it does this. Is that normal? Or did the devs just screw something up, so that memory that should be unloaded from video memory never is?

Either way, I have an 8 GB graphics card, and I think this is literally a 10 GB-minimum game at this point, at least on ultra settings with ray tracing off. I'd imagine this game easily uses 10-12 GB at 1440p with RT on, and the problem is that if you don't have enough VRAM, you get these huge, noticeable 1% low drops; what I didn't know is that it also lowers the overall fps once the VRAM is used up. I suspect this was partly behind the terrible launch-day bugs with muddy textures, though it's hard to tell what's still a vestige of the game being broken versus working as intended. Basically, you need a 16 GB graphics card to comfortably play this game at 4K with ray tracing, and at minimum a 10-12 GB GPU to play at ultra settings with RT off at 1440p. Probably the only reason it's under 6 GB right now is that I reloaded the game in the middle of the desert in a sandstorm. The blurry textures are a daily occurrence, but I didn't realize I'd get better fps solely by clearing VRAM.

I have an RTX 3050 (laptop version)... yes, it's tough, but with DLSS it runs at High, 1080p, 60 fps pretty much all the time. My CPU is actually pretty good, an i5-12500H, which helps a lot with running this beast, and I've got 32 GB of RAM at 4800 MHz; the game usually sits at around 15 GB of RAM usage. It's pretty solid, and the game looks pretty good even at 1080p. DLSS is set to Quality mode with sharpness at 0.20, vsync is on, and the frame rate is capped at 60 fps. As Todd Howard says, it just works!
Azrael Jul 20, 2023 @ 7:01am 
Blurry textures in the game are more likely a LOD issue, not a VRAM/rendering issue. The game basically thinks you are too far away from a surface to need a high-quality texture, so it uses a low-resolution one instead. The VRAM/GPU does not decide the texture quality on surfaces; it simply renders them. The other option is that you set the graphics settings too low, or that DLSS is set to Performance, which also decreases the effective texture resolution and quality.
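
In principle, mip/LOD selection works something like the sketch below (illustrative names only, not CD Projekt's actual code): the renderer picks the mip level whose resolution roughly matches the surface's on-screen size, so anything judged "far away", or rendered at a reduced internal resolution by DLSS Performance, samples a blurrier mip.

```python
import math

# Illustrative mip-level selection, not the engine's actual code:
# pick the mip whose resolution best matches the on-screen coverage.
def mip_level(texture_size_px: int, screen_coverage_px: int) -> int:
    # Each mip level halves the resolution; level 0 is full size.
    ratio = texture_size_px / max(screen_coverage_px, 1)
    return max(0, math.floor(math.log2(ratio)))

print(mip_level(2048, 2048))  # 0 -> full-resolution texture
print(mip_level(2048, 256))   # 3 -> 256 px mip; blurry if seen up close
```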
Last edited by Azrael; Jul 20, 2023 @ 7:02am
Blitz4 Jul 20, 2023 @ 8:21am 
geforce now
FREEZED Jul 20, 2023 @ 9:40am 
I think the game has some memory leak issues, as it will crash after several hours of playing, and you can see it coming before it happens: the graphics start to do something weird.
Gamefever Jul 20, 2023 @ 9:52am 
Originally posted by FREEZED:
I think the game has some memory leak issues, as it will crash after several hours of playing, and you can see it coming before it happens: the graphics start to do something weird.

I think (as in, I don't know) the patch state is still unfinished, right? So maybe this will be resolved when the DLC launches.
Grobut Jul 20, 2023 @ 10:01am 
Originally posted by FREEZED:
I think the game has some memory leak issues, as it will crash after several hours of playing, and you can see it coming before it happens: the graphics start to do something weird.

Oh, it does, and we have bugged locations like the Cherry Blossom Market where people have been able to reproduce these issues reliably for years.
Azrael Jul 20, 2023 @ 10:36am 
Originally posted by FREEZED:
I think the game has some memory leak issues, as it will crash after several hours of playing, and you can see it coming before it happens: the graphics start to do something weird.

Fairly sure you have a mod installed that causes the memory leak.
Originally posted by LORd_RiVE:
Blurry textures in the game are more likely a LOD issue, not a VRAM/rendering issue. The game basically thinks you are too far away from a surface to need a high-quality texture, so it uses a low-resolution one instead. The VRAM/GPU does not decide the texture quality on surfaces; it simply renders them. The other option is that you set the graphics settings too low, or that DLSS is set to Performance, which also decreases the effective texture resolution and quality.
Nah, it's not just that, unless the LOD itself is bugged to hell. I say this mainly because I notice it, along with severe frame drops, more often when my GPU has to start switching to system memory because I've run out of VRAM. Which, by the way, makes it pretty ridiculous that this f'ing game was used to sell the 3070, when the 3070 is such a piece of garbage that you can't even run 1440p all ultra on it, not even with RT off. I know because I have an 8 GB GPU and I've already had problems from running out of VRAM (this game literally needs a 10 GB+ GPU for 1440p with RT off; with RT on I'd imagine it's probably up at 12 GB or something, but I don't know).

I've had this happen more with billboards and soda machines, often ones that are really close, and sometimes the rest of the texture will load in. I had a similar problem when I was running The Division on a GTX 980 4 GB, and basically, well, it's nVidia. Going nVidia means you've got a hotrod engine strapped to a lawnmower: ♥♥♥♥ software, and the stupid thing has problems mainly on account of either a bad API or nVidia's bad VRAM as usual. In my case the 980 works perfectly fine so long as I'm not straining it with 1080p ultra or anything that really pushes textures. Which sucks, because the most important things for how a game looks are textures and lighting, and prior to ray tracing, going with nVidia basically meant sacrificing gorgeous looks. It still does, because they still don't include enough VRAM, and you need that VRAM for RT too. So I'd basically still have gotten a 16 GB Radeon; at this point I'm not even entertaining the idea of ever buying from nVidia again. My Radeon aged beautifully, made me lots of money mining, the software kicks ass, it just werks. Meanwhile my 780 has had all kinds of problems thanks to its bad API support: I couldn't even play Warhammer Gladius, the lowest-requirement game ever, which I'd specifically hoped to play on that laptop, because it got switched over to Vulkan and now I can't even play the new DLCs just because nVidia didn't support Vulkan on it. And while my 980 ran The Division perfectly fine otherwise, it had such texture pop-in problems purely from the stupid memory config. I swear to God, you know, AMD HAD 8 GB GPUs back then too.

I just can't believe memevidia is STILL pulling this garbage on gamers, offering us literally half the VRAM we need in order to upsell the stupid halo crap that'll burn your house down and kill your dog, like the 4090/3090, if you want VRAM. They always put only what becomes the new standard amount of VRAM on the 80 Ti, which is exactly why the 980 Ti had 6 GB: it's what became the 1060, 1660, and 2060 baseline, just like the 1080 Ti's 11 GB honestly became the new 10-12 GB standard for very high resolutions. If I had a 1080 Ti, I'd have roughly the same performance as my 5700 XT by actual GPU die (more or less, 2070S = 5700 XT = 3060), but I wouldn't be having these texture pop-in issues. The 80 non-Ti usually had the bog-standard minimum just to not age like complete crap, like the 1080 8 GB or the 980 4 GB, but even then you're really gambling on longevity with nVidia. Look at the paltry 780 Ti 3 GB: it's not even a bad GPU, its VRAM just sucks that much. And the GTX 770 2 GB isn't even playable in most games; it runs GTA V just fine, except that it runs out of VRAM even at 1080p.

So yes, in my experience the VRAM issue counts. I had similar problems running Cyberpunk on my 5700 XT 8 GB at 1440p as I did running my GTX 980 4 GB at 1080p/900p, unless the 980 actually had enough VRAM and there was some other really weird problem making it lag, pop in textures, and stutter during all-ultra benchmarking. You know what I've noticed? I end up turning off nVidia's GimmickTM of the year half the time anyway, like PCSS. They often don't even make hardware good enough to run their own gimmick unless you bought the 80 Ti, and the prices are just nuts. I've honestly had fewer headaches dealing with a Radeon than dealing with GeForce stuff.

Originally posted by LORd_RiVE:
Originally posted by FREEZED:
I think the game has some memory leak issues, as it will crash after several hours of playing, and you can see it coming before it happens: the graphics start to do something weird.

Fairly sure you have a mod installed that causes the memory leak.
That's partly the VRAM, I swear to God. There's another game that would do that, maybe The Division 2? I forget which, but it fairly reliably crashed after a certain amount of playtime. I could be wrong, but what I've noticed is that the game sometimes slows down and gets bogged down in stutters after an hour or two, and that it goes away when I restart the game. It often coincides with texture issues.
imbrock Jul 20, 2023 @ 2:18pm 
Most new games use a tonne of VRAM. This is why Nvidia has been hamstringing their low- and mid-end cards with crappy VRAM limits for a couple of gens now: it's a form of planned obsolescence, as texture sizes have been blowing up and Nvidia knows it. They tried to make people think less but faster memory would make a difference, but it just hasn't; you still need large amounts of VRAM for most AAA games these days. 10 GB is becoming the minimum to play bigger games without stuttering and frame drops.
Blitz4 Jul 20, 2023 @ 5:58pm 
geforce now
4080 16GB
free to test
just saying

Date Posted: May 12, 2023 @ 7:33pm
Posts: 25