The only way I can get mine to run is by putting -d3d12 in the launch options, running the game in Windows 8 compatibility mode, and cranking everything WAY down.
Hogwarts had this problem too. It's an Unreal issue, and devs keep walking into it. I have no idea why, or how this is such a hard fix. Oodle is a compression library used for shader/data decompression (it ships with the PS5 SDK too), or something along those lines, so they were basically leaving portions of the PC port running on the same code path as the console build. Not sure if that's standard practice these days or what, but for some unearthly reason Remnant II absolutely hates the 13900K.
http://www.radgametools.com/oodle.htm
Note: Oodle was implemented in UE4 a while ago:
https://docs.unrealengine.com/4.27/en-US/TestingAndOptimization/Oodle/
And it's still there in UE5, which is what R2 uses:
https://docs.unrealengine.com/5.0/en-US/using-oodle-in-unreal-engine/
From what I could see on this forum, the vast majority of reported "Could not decompress shader" errors come from 13900K/KS users.
So perhaps the problem is the big.LITTLE architecture, or perhaps these CPUs are notoriously unstable unless fine-tuned (and on some boards, these CPUs are likely running out-of-spec).
I don't think any amount of tuning would help. I tried a manual stable overclock, the generic boost overclock that MSI includes in their BIOS, and no overclock at all. None of those settings worked for me.
I tried turning off my RAM's OC as well, and tried tweaking my 3080 Ti by removing my undervolt. No combination of any of it helped at all.
The only thing that managed to get me into the game was Win8 compatibility mode plus the -d3d12 launch option. Until I had both of those, it wouldn't even boot up after this patch.
Before the patch, I was able to boot using just the Win8 compatibility workaround.
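For anyone trying to replicate the workaround above: the flag goes in Steam's per-game launch options (right-click Remnant II → Properties → General → Launch Options), so at game start Steam passes it through to the executable:

```
-d3d12
```

Compatibility mode is set separately in Windows, on the game's .exe (right-click → Properties → Compatibility tab → check "Run this program in compatibility mode for:" and pick Windows 8). Neither step is an official fix, just the combination reported to work in this thread.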
You should read Intel's latest statement on this exact issue; they even tell you what your optimal BIOS profile should be to avoid crashing.
Link, please?
https://www.anandtech.com/show/21389/intel-issues-official-statement-regarding-14th-and-13th-gen-instability-recommends-intel-default-settings
Also: told ya, long before this hit mainstream media.
Circling back to "but it's only in THIS game" isn't really going to help you in any way, much less when you already complained about an overheating issue during the Nightingale test round.
If that's not a dead giveaway of a hardware failure, then I don't know what is.
That post was made under the wrong game, sorry. The overheating issue in Nightingale was due to an iCUE app update, not the game. I will fix that post. Thanks.
The shader decompression issue in the CPU was under Greyzone; that's where I intended the post to go. Not sure how it got here, sorry. Remnant II works great for me!