Because it's not that great. Many, MANY times I've seen devs implement HDR that looks worse than SDR, makes the game difficult to play, or has no point where it can actually be calibrated to look decent. And with no real "standard," everyone has a different monitor, and there can even be differences in the same model from panel to panel, so devs put work into something that many people just disable. There are games with awesome HDR, but there are probably just as many where HDR makes the game look completely broken visually. So it can be hit or miss for many people, and those that DO want it can just enable it anyway without R* building it in, using Windows HDR or RTX HDR. Honestly, I have several games where I disable the native HDR and use a third-party implementation anyway, because the game's own HDR sucks.
RTX HDR needs the Nvidia overlay to work at all (which can cause problems with some games), and costs upwards of 20% performance.
Auto HDR isn't as good and requires that it's turned on globally in Windows, which can and will mess up other games if you forget to turn it off after playing this one game (very likely).
It also has a TON more advanced settings to mess with than RTX HDR if something doesn't look right.