Cutscenes are locked at 30 FPS; I've never seen them drop below that, so it might be a bug. You should be getting more FPS than me, but your CPU could just be terrible.
Render distance doesn't really matter; it hardly makes a difference for me. What hurts FPS the most is foliage at high settings, at least on my setup. Then again, "not that bad of a cpu" isn't really something to go off, especially when they say "over the years". It could be a five-year-old CPU that was mid-tier back at launch xD.
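Since KCD runs on CryEngine, these can also be tuned with console variables in a user.cfg placed next to the game executable. The cvar names below are standard CryEngine ones and the values are just illustrative; whether KCD honors all of them is my assumption, so treat this as a sketch:

    e_ViewDistRatio = 60             -- overall object draw distance
    e_ViewDistRatioVegetation = 40   -- how far out trees and bushes render
    e_MergedMeshesInstanceDist = 16  -- distance at which dense grass pops in

Lower values cost less GPU/CPU time at the price of visible pop-in, which matches the point above: vegetation, not raw render distance, is usually the expensive part.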
Lowering textures on his card will not make a bit of difference in this game.
Foliage, though, yes: I keep that on medium, otherwise the game stutters a lot more when foliage textures load in, and in general runs worse.
Anyway, the game magically came back to life :)
I recommend turning textures to the highest setting; with a 1080 you can do that with little to no FPS loss. I usually tweak every setting to get the best looks for the least cost, and with my 1070's 8 GB of VRAM the only game where the texture setting makes a real impact is Red Dead Redemption 2 on ultra. So your card should 100% be fine for that. I even use the HD texture pack DLC.
Here's how to fix it.
Run the GeForce Experience optimization for the game to set a baseline, then turn down a few items like "...Quality" and "...Lighting..." and check again.
To improve general image quality and reduce screen tearing, set Vsync to On. (It locks the game to 60 FPS, but really improves the image quality.)
I run a 1070 Ti at 1440p ("2K") and have no problem maintaining 60 FPS.
There is a cutscene problem specific to NVIDIA users (as Vine alluded to).
The easiest fix is to set the display mode to "Windowed - Borderless".
Another fix is to set Vsync to On.
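If the menu setting doesn't stick, both fixes can also be forced through a user.cfg in the game folder. The cvar names are stock CryEngine and not confirmed for KCD, so this is just a sketch:

    r_VSync = 1             -- force vsync on
    r_Fullscreen = 0        -- leave exclusive fullscreen
    r_FullscreenWindow = 1  -- run as a borderless fullscreen window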
This will be controversial, but do NOT use GeForce Experience for any game. Spend a little time learning how to tweak your system (part of the PC gaming hobby, right?) and you will get much, much better performance than that bloatware gives you. Search online and you will find plenty of verification.
KCD uses CryEngine, so the shader and physics settings are the real performance hogs. At least turn those down before big battles, since you don't need that eye candy then, and turn them back up afterwards.
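For reference, CryEngine exposes those groups as "spec" cvars, so a quick pre-battle drop could look like this in user.cfg (standard CryEngine names; whether KCD reads them is an assumption on my part):

    sys_spec_Shading = 2    -- shader quality, roughly 1 = low ... 4 = very high
    sys_spec_Physics = 2    -- physics detail
    sys_spec_Particles = 2  -- particle effects, which spike in big fights

Bump them back to 3 or 4 once the battle is over.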
So I guess GeForce Experience isn't any better nowadays...
I use it for its shader filters sometimes, unless the FPS hit is too bad. Recently the "Details" part of their filter system has gotten less optimized: it used to cost at most 4 FPS, now it's up to 10. So I just use ReShade instead.
Also, OP should mark this as solved if they did get it solved, just so people stop trying to fix it for them. Not that marking it solved will stop the many people who won't notice :^)