Your assumption is incorrect; Blade's is correct. Turning HDR on in the launcher, and all the various HDR-related .ini directives, just reduce banding and improve color accuracy. The output is still low-gamut, and not just on HDR monitors: it's low-gamut content that any display can show.
Community Shaders doesn't change the render API; it's still 24-bit. For true HDR10/HDR12 you have to set the game to use an HDR-enabled DXVK .dll (a DirectX-to-Vulkan translation layer) and have an Nvidia card. It might work on AMD cards too these days, but I don't own any to test. You'll also need Windows Game Mode enabled to trip the AutoHDR function, apparently. There are various guides for it; see for example: https://performance.moddinglinked.com/falloutnv.html#DisplayModeComparison
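
If you do go the DXVK route, the HDR switch normally lives in a dxvk.conf file next to the game's .exe. A minimal sketch, assuming a recent DXVK build that exposes the dxgi.enableHDR option (the exact filename and folder depend on your setup, so follow the guide above rather than taking this as gospel):

# dxvk.conf, dropped in the same folder as the game's .exe (e.g. FalloutNV.exe)
# asks DXVK to expose an HDR-capable swapchain to Windows;
# the game itself still renders SDR, AutoHDR does the lifting on top
dxgi.enableHDR = True

You'd still flip on Game Mode and Auto HDR in Windows settings afterwards, since that's what actually maps the SDR output into the HDR container.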