Next time I upgrade I'm getting something with 12 GB minimum. 8 GB cards are still perfectly fine for 1080p though, I think.
Raytracing is indeed very VRAM-hungry too. Ironically, some performance-enhancing techniques (like DLSS 3 frame generation) can also end up _reducing_ performance in certain cases, because they increase the VRAM requirements.
For a graphics card, fetching data from system memory is a very slow process, compared to having the data in VRAM already. In the past, when someone had low frame rates, the question was usually whether the CPU or the GPU was the bottleneck. Nowadays (and starting with this year in particular), we're seeing situations where neither the CPU nor the GPU can work at full capacity because the bottleneck is the I/O process of shifting data back and forth between VRAM and system RAM. (And we're also seeing cases where a game's graphics engine itself is the bottleneck, but that seems unrelated to Cyberpunk).
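To put rough numbers on that (a back-of-envelope sketch using published theoretical peaks, not measurements from the game), here's what moving a spilled texture working set costs on each path:

    // Back-of-envelope: time to move a texture working set that no longer
    // fits in VRAM, over the PCIe bus vs. reading it from on-board VRAM.
    // The size and bandwidth figures are theoretical peaks I'm assuming
    // (PCIe 4.0 x16 and an RTX 3070's GDDR6), not anything measured in-game.
    #include <cstdio>

    int main() {
        const double working_set_gb = 2.0;   // hypothetical VRAM overflow
        const double pcie4_x16_gbs  = 31.5;  // PCIe 4.0 x16 peak, GB/s
        const double gddr6_3070_gbs = 448.0; // RTX 3070 VRAM bandwidth, GB/s

        std::printf("Over PCIe 4.0 x16: %5.1f ms\n", working_set_gb / pcie4_x16_gbs * 1000.0);
        std::printf("From VRAM (3070):  %5.1f ms\n", working_set_gb / gddr6_3070_gbs * 1000.0);
        // ~63 ms vs ~4.5 ms. At 60 fps a frame is only ~16.7 ms, so touching
        // even part of that spilled data every frame stalls the GPU on I/O
        // while the CPU also sits idle -- the bottleneck described above.
        return 0;
    }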
In theory, the situation might improve once we see more games using DirectStorage, which can swap data between VRAM and an NVMe SSD very quickly. But that technology will also require more VRAM (since the GPU will have to do the texture decompression now), so we'll have to see.
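For anyone curious what that looks like in code, here's a minimal sketch of a single DirectStorage request (my own illustration, assuming an already-created D3D12 device and destination resource; the file path, sizes, and helper function are placeholders, not anything from Cyberpunk's engine):

    // Minimal DirectStorage sketch: stream a GDeflate-compressed texture from
    // an NVMe SSD straight into a GPU buffer, decompressing on the GPU.
    // Assumes DirectStorage 1.1+; error handling and fence sync omitted.
    #include <cstdint>
    #include <d3d12.h>
    #include <dstorage.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Hypothetical helper; the device/buffer come from the caller's D3D12 setup.
    void StreamTexture(ID3D12Device* device, ID3D12Resource* gpuBuffer,
                       uint32_t compressedSize, uint32_t uncompressedSize)
    {
        ComPtr<IDStorageFactory> factory;
        DStorageGetFactory(IID_PPV_ARGS(&factory));

        DSTORAGE_QUEUE_DESC queueDesc{};
        queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
        queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
        queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
        queueDesc.Device     = device;
        ComPtr<IDStorageQueue> queue;
        factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

        ComPtr<IDStorageFile> file;
        factory->OpenFile(L"textures/example.gdeflate", IID_PPV_ARGS(&file)); // placeholder path

        DSTORAGE_REQUEST request{};
        request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
        request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
        request.Source.File.Source          = file.Get();
        request.Source.File.Offset          = 0;
        request.Source.File.Size            = compressedSize;
        request.Destination.Buffer.Resource = gpuBuffer;
        request.Destination.Buffer.Offset   = 0;
        request.Destination.Buffer.Size     = uncompressedSize;
        request.UncompressedSize            = uncompressedSize;

        queue->EnqueueRequest(&request);
        queue->Submit(); // NVMe -> VRAM, bypassing the usual CPU copy-and-decompress path
    }

Note that the destination is a GPU resource: the decompressed data still has to land in VRAM, which is why GPU decompression doesn't lower the VRAM requirement.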
I agree that 12 GB is the minimum if someone wants to play modern triple-A games at ultra settings, even at 1080p. Case in point: a team of crafty Brazilians created a "homebrew" 3070 with 16 GB of memory, and the performance gain over the regular 8 GB variant in modern triple-A games was substantial.
2. push memes and gimmicks over performance
3. proceed to cripple your ability to use those proprietary features
4. ?????
5. profit!
God, it's just so easy to make fun of that company. I had no idea frame generation also used a lot of VRAM; it's mind-boggling to me. If you really cared about those features, you'd demand higher VRAM, yet somehow the primary market segment ends up being people saddled with totally crippled 70-class cards. I heard about that card, and it amazes me that the 3070 probably could've been a great GPU if it just had more VRAM; instead it's going to go down in history as a GTX 770 2 GB at best. It's weird trying to imagine how this tactic actually makes them money, when nVidia is clearly releasing inferior products because of it. Maybe one of my assumptions is wrong, because I thought they tried to sell out their old-gen cards first and hated the used market, so being able to sell your 70-class card as better than a 2080 Ti should have been a benefit for them, right?
It also amazes me how much of a problem this is in Cyberpunk of all things. I truly expected this game to be well optimized for textures, because they wanted to sell 3070s, and I cannot imagine using one smoothly with raytracing on. Maybe they were only trying to upsell 3080 10 GB cards? Because it feels like anything below a 3080 is definitely inadequate for RT in this game.
I also have DDR4, but I was expecting more stutters when hitting that limit. In fairness, maybe my CTD issue (over a dozen crashes so far) is tied to running out of VRAM and spilling into system memory. I know it's still slower, but I wonder if DDR5 is more tolerable when swapping, unless the actual I/O bandwidth over the bus is the limit.
I have an RTX 3050 laptop version... yes, it's tough, but with DLSS it runs at High 1080p 60 fps pretty much all the time. My CPU is pretty good actually, an i5-12500H, which helps a lot running this beast. I also have 32 GB of RAM at 4800 MHz, and the game usually sits at around 15 GB of RAM usage. It's pretty solid, and the game looks pretty good even at 1080p. DLSS is set to Quality mode, sharpness is at 0.20, vsync is on, and the frame rate is capped at 60 fps. As Todd Howard says, it just works!
I think (as in, I don't actually know) the game's current patch state is unfinished, right? So maybe this will be resolved when the DLC launches.
Oh it does, and we have bugged locations like the Cherry Blossom Market where people have been able to reproduce these issues reliably for years.
Fairly sure you have a mod installed that causes the memory leak.
I've had this happen more with billboards and soda machines, often ones that are really near, and sometimes it'll load the rest of the texture eventually. So I don't know; I had a similar problem running The Division on a GTX 980 4 GB, and basically, well, it's nVidia. Going nVidia means you've basically got a hotrod engine strapped to a lawnmower: ♥♥♥♥ software, and the stupid thing has problems mainly on account of either a bad API or nVidia skimping on VRAM as usual. In my case the 980 works perfectly fine as long as I'm not straining it with 1080p ultra or anything that's really pushing those textures.

Which sucks, because literally the most important things to how a game looks are textures and lighting, and prior to raytracing, going with nVidia basically meant sacrificing gorgeous looks. It still does, because they still don't put in enough VRAM, and you need that VRAM for RT too. So I'd basically still have gotten a 16 GB Radeon; at this point I'm not even entertaining the idea of ever buying from nVidia again. My Radeon aged beautifully and made me lots of money mining, the software kicks ass, it just werks.

Meanwhile my 780 had all kinds of problems thanks to its bad API support. I couldn't even play Warhammer Gladius, just about the least demanding game ever, which I specifically hoped to be playing on that laptop: it switched over to Vulkan, and because nVidia didn't support Vulkan on that card, I can't even play the new DLCs now. And while my 980 ran The Division perfectly fine otherwise, it had such texture pop-in problems just from the stupid memory config. I swear, AMD HAD 8 GB GPUs back then too.
I just can't believe memevidia is STILL pulling this garbage on gamers, offering us literally half the VRAM we need in order to upsell the stupid halo crap that'll burn your house down and kill your dog, like the 4090/3090, if you want VRAM. They always put only the new standard amount of VRAM on the 80 Ti, which is exactly why the 980 Ti had 6 GB: that's what became the 1060, 1660, and 2060, just the basic standard. Same way the 1080 Ti's 11 GB was really just the new 10-12 GB standard for super high resolutions. If I had a 1080 Ti I'd have about the same performance as my 5700 XT by its actual GPU die (more or less, 2070S = 5700 XT = 3060), but I wouldn't be having these texture pop-in issues. The 80 non-Ti usually had the bog-standard minimum just to not completely age like crap, like the 1080 8 GB or 980 4 GB, but even then you're really risking aging badly with nVidia. Look at the paltry 780 Ti 3 GB: it's not even a bad GPU, its VRAM just sucks that much. And the GTX 770 2 GB isn't even playable on most games anymore; it runs GTA V just fine, the problem is it runs out of VRAM even at 1080p.
So yes, in my experience the VRAM issue counts. I had similar problems running Cyberpunk on my 5700 XT 8 GB at 1440p as I did running my GTX 980 4 GB at 1080p/900p. Unless the 980 actually had enough VRAM and there was some other really weird problem that made it lag, pop in textures, stutter, etc. during all-ultra benchmarking. And you know what I've noticed? I end up turning nVidia's Gimmick™ of the year off half the time anyway, like PCSS. They often don't even make hardware good enough to run their own gimmick unless you bought the 80 Ti, and the prices are just nuts. I often feel like I've had fewer headaches dealing with a Radeon than dealing with GeForce stuff.
That's partly the VRAM, I swear. Another game did that too, maybe it was The Division 2? I forget which, but it fairly reliably crashed after a certain amount of playtime. I could be wrong, but what I've noticed is that the game sometimes slows down and gets bogged into stutters after an hour or two, and it goes away when I restart the game. This often coincides with texture issues.
4080 16GB
free to test
just saying