There are a bunch of settings that should be ticked, e.g. NVIDIA Reflex, and you shouldn't just crank everything to Ultra+ in the details. For a 3070 (Ti) I'd suggest taking a look at that AIO RTX mod; you may have to say bye-bye to RTX shadows.
Granted, my Acer monitor does get its panties in a twist during boot when FreeSync is enabled and needs a power cycle, so there may be _something_ to this NVIDIA approval. But I'll take a bit of annoyance with FreeSync over tearing / VSync.
My main PC is hooked up to an LG OLED that supports both G-Sync and FreeSync, so no problems there.
I don't know jack about optimizing graphics settings, so I just let the GeForce Experience app do it for me. I downloaded the latest driver through the app and clicked the button to have it optimize my settings.
I'm CPU-bound, or something; my GPU is only running at around 75%. I'm thinking this may be the issue some are having. I'm not sure what's holding it back. It may be an Arc thing; Intel has done a great job, but this game isn't doing well on Nvidia or AMD either.
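That sub-90% GPU utilization is the classic symptom of a CPU bottleneck. A minimal sketch of the rule of thumb, assuming an uncapped frame rate (no VSync or FPS limiter) and sustained utilization readings; the thresholds here are illustrative, not official guidance:

```python
# Rough heuristic for guessing whether a game is CPU- or GPU-bound,
# based on sustained GPU utilization while the frame rate is uncapped.
# The 90/95% cutoffs are assumptions for illustration only.

def likely_bottleneck(gpu_util_percent: float) -> str:
    """Guess the limiting component from average GPU utilization."""
    if gpu_util_percent >= 95:
        # GPU fully loaded: lowering settings/resolution gains fps
        return "GPU-bound"
    elif gpu_util_percent >= 90:
        return "borderline"
    else:
        # GPU is being starved of work: the CPU (or engine) is the limit
        return "CPU-bound"

print(likely_bottleneck(75))  # the ~75% figure reported above -> CPU-bound
```

In that case, dropping resolution or quality settings mostly won't help; the CPU side (simulation, draw calls, RAM speed) is what's holding the frame rate back.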
I'm getting a little over 60 fps at 1440p with ray tracing on Ultra on an Arc A770 16GB (using the XeSS mod; FSR looks TERRIBLE!). I hope they add native XeSS support soon.
As for PLAYING the game, and not just fiddling with it to see if I can even tell the difference (I can) and whether it's much better than raster: in the end, with this game, I just leave it off along with upscaling and play at native resolution, and never worry about tearing, low FPS, or latency. It's pointless to go to all this trouble for a MINOR effect that is 50% negated by having to use upscaling to get an acceptable frame rate.
I have a 3080 Ti and was able to get 60-70 fps (1080p) on Ultra+, with RT and NVIDIA HairWorks set to All (no VSync). I think the recent patches actually did something, because a month or two ago I was barely able to get 30 fps... I also undervolted my GPU recently, and temps rarely go over 70 °C on Witcher 3 and Ark at max settings (so far; I have yet to try other games). I did just build another PC with a 4070 that runs much cooler on the same games and settings, but I haven't tested it on Witcher 3 yet. I'll give it a test soon and let you know the results.
Seems the 4070 had similar results: 60-70 fps, and temps never went over 60 °C. My 3080 Ti build has an i9 12900K, the 4070 build has an i7 13900KF, with everything else the same except the 4070 build is all DDR5. DLSS is set to Quality as well, not Performance.
What I did was switch my desktop to 144 Hz (the maximum refresh rate of my monitor). When I run Witcher 3 with frame generation active, it usually renders somewhere between 70 and 150 fps on my 4090. But instead of tearing the screen, as it did when the refresh rate was fixed at 60 Hz, the monitor now adapts its refresh rate to the rate the images are actually rendered at. So the monitor refresh rate also fluctuates between 70 and 144 Hz according to the scene being rendered, resulting in a crystal-clear and extremely smooth, fluid, flicker-free image without any tearing. This is the best experience I've had with Witcher 3 so far.
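The behavior described above can be sketched in a few lines: with VRR (G-Sync/FreeSync) active, the panel refreshes in step with each rendered frame as long as the frame rate stays inside the monitor's VRR window. The 144 Hz ceiling matches the monitor in the post; the 48 Hz floor is an assumption (a common FreeSync lower bound), not something stated there:

```python
# Sketch of how a VRR (G-Sync/FreeSync) panel tracks the render rate.
# vrr_min is an assumed typical floor; vrr_max matches the 144 Hz monitor above.

def effective_refresh_hz(render_fps: float,
                         vrr_min: float = 48,
                         vrr_max: float = 144) -> float:
    """The panel refreshes at the render rate, clamped to its VRR window."""
    return min(max(render_fps, vrr_min), vrr_max)

print(effective_refresh_hz(70))   # 70  -> panel refreshes at 70 Hz, no tearing
print(effective_refresh_hz(150))  # 144 -> clamped at the panel's maximum
```

Outside the window the driver typically falls back to low-framerate compensation or fixed-rate behavior, which is why frame rates above the panel maximum can reintroduce tearing unless capped.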