https://youtu.be/hnZSv48pUh8
R9 3950X - 32GB DDR4 3600 - RTX 2080 TI @3440X1440 - 980 PRO 1TB
Most games are designed for console first. The demo runs flawlessly with everything at max for me.
btw, what is FidelityFX CACAO? (Apparently it stands for Combined Adaptive Compute Ambient Occlusion.) What kind of bird comes up with a name like that?
I don't think it's CPU-bound; I think something else is going on. I also see lower GPU utilization around those areas, even when CPU utilization isn't particularly high.
I'm also not seeing any massive drops like the ones you described, aside from a single load-zone drop when you first round the corner and there's that shack on the left with the enemy in it. My guess is that's where the game decides to load the village ahead, which is why the framerate takes a huge dive.
Outside of that load-zone, the lowest I've seen is around 100 fps.
Maybe the game is cache/memory intensive? That would explain why my 5800X3D isn't seeing as many issues as your 13900K, even though the 13900K is otherwise the much faster CPU.