Edit: I was using DLSS 3.
Make sure you're up to date on drivers and BIOS (13th/14th gen Intel is wonky). Do you have proper cooling? That hardware can get pretty toasty; install some thermal monitoring software to check whether you're throttling. Do you have anything running in the background (browser, Discord, or other heavy software)? If you're on a 4K monitor, try a lower upscaler preset.
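If you don't already have a monitoring tool, a rough sketch like the one below (Python with NVIDIA's NVML bindings; the 5-second interval is arbitrary and it assumes the game runs on GPU 0) will log temps and clocks while you play. Sustained high temperature with the SM clock dropping under full load is the classic sign of thermal throttling.

```python
# Rough GPU monitor sketch using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Run it in a second window while the game is running.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the game is on GPU 0

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        sm_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        # High temp plus a falling SM clock at ~100% utilization suggests throttling.
        print(f"temp={temp}C  sm_clock={sm_clock}MHz  gpu_util={util.gpu}%")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```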
I had horrible fps after leaving the game running for a while.
There's a big memory leak, so the longer you play the worse it'll get.
Do all your updates and restart the comp
Weird, I left mine running, went to bed and came back hours later and kept playing all day without any memory issues at all.
Non-stop locked 60 FPS for me at 4K with Intel XeSS at Quality and everything graphically at max settings.
I ran the game for 6 hours straight with no drop in FPS. 4090, i9-13900KF, 64GB.
Are you sure it's a memory leak?
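Easy enough to check instead of guessing. A quick sketch like this (Python with psutil; the executable name is a placeholder, grab the real one from Task Manager) logs the game's resident memory once a minute. RSS climbing steadily for the whole session is leak-like; levelling off after the caches warm up is normal.

```python
# Rough memory-growth check using psutil (pip install psutil).
import time
import psutil

PROCESS_NAME = "game.exe"  # placeholder, use the game's actual executable name

proc = next((p for p in psutil.process_iter(["name"])
             if p.info["name"] and p.info["name"].lower() == PROCESS_NAME), None)
if proc is None:
    raise SystemExit(f"{PROCESS_NAME} not found, is the game running?")

while proc.is_running():
    rss_mb = proc.memory_info().rss / (1024 * 1024)
    # A real leak climbs steadily for hours; normal asset streaming plateaus.
    print(f"{time.strftime('%H:%M:%S')}  rss={rss_mb:.0f} MB")
    time.sleep(60)
```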
I agree. My 4090 was hit quite hard for 6 hours straight and sat at 66-72°C. FPS stayed at 90-120 at 3440x1440 with DLSS. I capped the game at 120 FPS via the Nvidia Control Panel since I game on a 100Hz monitor.
I think the game hits the GPU harder than it should (a lot of other modern titles keep my card in the 60s for temps), but no drop in FPS.
3090 / Ryzen 9 3900X. But also, really fast frame rates aren't necessary in this storytelling kind of game. Definitely no memory leaks, though. Stable all the way through.
I started passing on games with a 2070 Super two years ago, man. It's just how the tech is; games have gotten super bloated in their demands.
I do feel ya, though. I went into debt just to catch up, and already these games are only just playable at max settings for me at 4K. In a few years? Not likely.
Huh? I'm playing games just fine with my 2070 Super. I think Dragon's Dogma 2 and this are the only games with specs so ridiculously demanding that I had to pass. Are you one of those people who need their games at 120 FPS / 4K?
If you wanna run 720p or lower with upscaling to 1080p, it's perfectly fine. I do not.
I'm barely getting it to run at SIXTY FPS at 4K currently, and that's with upscaling more often than not, in maxed-out modern titles.
120 FPS is for 1440p rigs today; even the most extreme 4090 rigs can't consistently hold 120 at 4K, because nothing keeps up with the tech.
Yes, this tracks with my experience. I game at ultrawide 3440x1440, more than 2K but not quite 4K, and I do manage good FPS (120+ in a lot of titles, but not always, and not in this game, SURPRISINGLY) with max settings on a 4090, DLSS, etc.
I've been gaming at 3440x1440 for over a decade and upgraded from a 1080 Ti, so obviously I'm incredibly impressed with the performance boost in games. But it soon dawned on me... "hang about... I don't think I could game at max settings with amazing frames if this was 4K... damn!"
Ray tracing seems to be the latest tech hitting hardware hard, and it definitely improves visuals a lot. But this whole tech-versus-hardware situation begs the question: how much more can tech improve realism in games? Surely we've reached some sort of pinnacle by now; some games look almost realistic at times, so how much more realism is there to achieve?
Surely the hardware will start catching up and stay caught up? Or maybe I'm clueless. After all, no more new tech would mean no more need for new hardware, and I can't imagine the GPU vendors being happy with that!