https://steamcommunity.com/app/1091500/discussions/7/3394049806854068522/
Thanks for the suggestion, I checked it out, but the problem is that my monitor is 1440p. Using 1080p would make the game far too blurry even with DLSS sharpness set to 1, so the settings you mentioned wouldn't work too well for me, but I'm grateful nonetheless.
I just can't wrap my head around this power consumption variation. It's no coincidence. Power consumption drops like a rock, and so does the performance, even though the GPU clock is at 2100 MHz and the GPU memory clock is at 7800 MHz. Something is really odd.
I'll try to tweak it more and see what happens.
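One thing I might do while tweaking is log power draw, clocks and VRAM once a second, so I can line the drops up afterwards. Something like this rough Python sketch should do (assuming an NVIDIA card with nvidia-smi on the PATH; gpu_log.csv is just a name I made up):

# log_gpu.py - rough sketch: have nvidia-smi emit one CSV row per second
# so power-draw drops can be lined up with clocks and VRAM afterwards.
# Assumes an NVIDIA GPU with nvidia-smi available on the PATH.
import subprocess

QUERY = "timestamp,power.draw,clocks.gr,clocks.mem,memory.used,utilization.gpu"

# -l 1 tells nvidia-smi itself to loop every second; we just capture the rows.
with open("gpu_log.csv", "w") as log:
    subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv", "-l", "1"],
        stdout=log,
    )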
2100? That could be your issue right there, you have it overclocked too much. The RTX 3070 normally runs in the 1800-1900 range since it's only a 220W card; the RTX 3080 would be in the 2000 range since it's around 300W.
Also, those settings should work for 2K too. It should only be maybe a 5 FPS difference between it and 1080p, since all you are doing is transferring some of the load from the CPU to the GPU with the higher resolution.
I might try slapping on my old 1080p monitor and playing around with the exact same settings I run now, except at 1080p instead of 1440p. I wonder what kind of result it would yield.
This issue became much more prominent after 1.6. Before that, the performance was fine the vast majority of the time, with this happening only every now and then. But now it's a lot more common.
Did this ever happen to you at all? Especially in that area near Misty's shop?
RT is like forbidden fruit: once you experience it and see it in its full glory, it's hard to go back. =)
Thanks for the pointers and shared experience anyway.
I just gave up on RT in this game. It's not worth the hassle, and with the addition of DLAA, I prefer to run native DLAA with everything on High, traditional rasterization. The game looks stunning and runs super smooth.
I wish you luck if you plan on trying to find a solution for that.
Stay well!
I updated the driver just recently to the latest version. At first it seemed to have fixed it, but it has come back. I'm truly at a loss, and just like you said, disabling ray tracing is the only way to keep playing without restarting the game.
RT increases VRAM usage significantly, which is why this only occurs when you use it. You can test my theory by running an extremely aggressive DLSS setting (Ultra Performance or something) and lowering the texture settings to their lowest, then running with every other setting as it was. You will likely not get the frame drops, because you lowered VRAM usage with the texture settings, and additionally by lowering the internal render resolution to cut VRAM usage even more.
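If you want to watch the numbers live while you toggle those settings, a quick script does the trick too. Rough sketch, assuming the nvidia-ml-py package (imported as pynvml) and that the card is GPU index 0:

# vram_watch.py - print VRAM usage once a second, to see whether the
# 8GB budget gets blown once RT is on. A rough sketch, not a polished tool.
# Assumes the nvidia-ml-py package (import name: pynvml) and GPU index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()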
The 3070 was not gifted with sufficient VRAM to prolong its life (I wonder if that's deliberate :thinking:); it will become a bigger problem down the line, with new titles easily using 10-12GB on modest settings. The new Forza game uses up to 16GB at 4K now.
EDIT: Just looked with Afterburner. At UW 1440p, with ray tracing maxed, I'm seeing 11.2GB of VRAM usage on my 6800XT 16GB. I'm pretty confident in my assessment that this is your issue.
Shame on Nvidia for intentionally castrating their own cards. The 3070 would be one of the best cards ever made if it had the proper amount of VRAM to support its alleged RT capabilities.
The best way to run CP2077 on a 3070 at 1440p is everything on High with DLAA (not DLSS) and sharpness set at 0.25. Looks really, really beautiful. RT be damned at this point.
Thanks for posting and sharing!
Yeah, at Ultra Performance you're rendering at like... 720p? x'D
So yeah, it'll look horrendous, but it's just to prove the point, not to actually play with.
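For what it's worth, the internal resolutions per preset work out roughly like this; at 1440p output, Ultra Performance actually lands closer to 480p than 720p. Quick sketch, using the commonly cited per-axis scale factors for DLSS 2.x (treat them as approximate, not official):

# dlss_res.py - back-of-the-envelope internal render resolution per DLSS
# preset. The scale factors are the commonly cited per-axis ratios for
# DLSS 2.x presets; treat them as approximations.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for out_w, out_h in [(1920, 1080), (2560, 1440)]:
    for name, scale in PRESETS.items():
        print(f"{out_h}p {name}: {round(out_w * scale)}x{round(out_h * scale)}")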
I don't know how much system memory you have? 16GB? In another post someone also has a 3070, but he has fewer issues since he has a lot of system memory. I won't tell you to buy something, as I cannot guarantee it will fix it, but should you feel inclined to spend a little money on a possible fix, you could upgrade to 32GB. That gives Windows and the game a bigger pool of semi-fast memory, which could alleviate some of the issues, because right now it's most likely swapping to the SSD instead, and that brings performance to a crawl. I tested for a few minutes and saw up to 26-27GB of required memory (12.5GB VRAM, 14GB system). So if you only have 8GB of VRAM, you need >20GB of system memory, otherwise it falls down to the next tier of the memory hierarchy -> SSD.
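To make that arithmetic concrete: whatever doesn't fit in VRAM spills into system RAM, and whatever doesn't fit there falls back to the page file on the SSD. A rough sketch of the budget check below; the ~26.5GB total is just what I measured in a few minutes of play, not a fixed requirement, and psutil is the only dependency:

# mem_budget.py - rough spill-over arithmetic from the post above.
# TOTAL_NEEDED_GB is what I measured (~12.5GB VRAM + ~14GB system RAM);
# it is an observation from one session, not a fixed requirement.
import psutil

TOTAL_NEEDED_GB = 26.5
VRAM_GB = 8.0  # e.g. an RTX 3070

# Whatever exceeds VRAM has to live in system RAM, on top of what
# Windows itself needs.
spill_gb = max(0.0, TOTAL_NEEDED_GB - VRAM_GB)
installed_gb = psutil.virtual_memory().total / 1024**3

print(f"Game wants ~{spill_gb:.1f}GB of system RAM on top of {VRAM_GB:.0f}GB VRAM")
print(f"Installed RAM: {installed_gb:.1f}GB")
if installed_gb < spill_gb + 4:  # a few GB of headroom for Windows itself
    print("Tight on RAM -> expect paging to SSD and performance crawling")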
I can't use DLAA on my AMD GPU. I use FSR Quality to give me more FPS, as I think super fluid FPS adds more immersion/realism than RT does. It also overrides the in-game TAA solution, which is atrocious. Try swinging a sword and you'll notice how much ghosting there is. I sort of like the RT reflections, but they cost me 30-40 FPS, which is unacceptable.
As for Nvidia's planned obsolescence... sadly they are continuing that trend. They gave the 4070 and 4070 Ti a 'measly' 12GB. Considering the price, I don't think that statement is harsh, seeing as the $500 7800XT has 16GB. And the 7900XT vs. the 4070 Ti is even worse: there it's 20GB vs 12GB. This is just so that people have to replace their GPU faster. It's a good thing that reviewers like Hardware Unboxed make a big deal out of it; I think it is better known to gamers now than it was a year ago. That said, the 4070 is still outselling AMD GPUs... so a year or two from now, someone with a 4070 will be creating this same topic, not knowing what is going on.
https://i.imgur.com/Vd4ROiH.png
I've been testing your theory for the last couple of hours. I have 8GB of VRAM and "luckily" VRAM mostly stayed under 7GB; on the few occasions it went over, I noticed some small dips in wattage and FPS, but it went back to normal fast. Then I was taking care of an airdrop, and right after killing everyone and looting it, I started to go around the corpses to loot some more. That's when the wattage dropped without a reason. VRAM was around the same, as you can see in the screenshot, so no real reason for it to go down.
I have 64GB of DDR4 memory and the game is installed on an NVMe PCIe 4.0 drive; the system is installed on an SSD. As you can see, wattage is around 150 in the screenshot, causing severe FPS loss. The game should be running at around 50 FPS in Dogtown with ray tracing enabled at 1080p DLSS Balanced (it looks almost the same as Quality with a huge FPS boost), but as you can see in the screenshot, the wattage drop is causing it to run at 30 XD
So again, there's no real reason for the wattage to drop like this on the GPU.
This all just makes me sad 'cause it seems it's just the stupid game and nothing we do is ever going to fix this mess.
I agree with everything else about NVIDIA's BS though, which is probably the main reason this DLC has these issues.
But yeah, it's been an issue since version 1.0 of the game. It has certainly diminished quite significantly, but it's still very present and common. And it's very frustrating indeed. The only thing I can say for sure is that when the power usage drops, the game's performance tanks HARD.
It's like the GPU decides to go AFK for a bio break.