0.0001% of people.
Anyone who isn't poor and still using a monitor from 2003.
What GPU are you running that can push this game beyond 4K 144Hz OLED specs? Even my 4090 struggles to stay above 100 FPS at 4K without DLSS. The only way there would even be a trade-off between refresh rate and HDR is if you were running a 5090 cranking out fake frames with 4x frame gen, or if you turned the settings down to low, which defeats the point of the remake.
The answer is in your question. People aren't running 4K in the first place because they prefer performance over eye candy. This is one of those games where that actually matters. That's the entire reason why people go for 1440p monitors instead even if they do have decent hardware.
Unless you have OLED or full-array local dimming, which most gaming monitors don't have, HDR is a gimmick and looks worse than no HDR.
Even if you do have OLED, it's hard to tune the image quality for HDR. Imagine a gaming company doing it; it's bad.
All I can speak to is my personal experience: I have been comfortably managing 120 FPS at 4K ultra with DLSS Quality on my 5090 (4K 120Hz OLED display), and for me it has become the optimal way to play the game.
The graphics look pretty damn good, and when I compared it to native, I could barely tell the difference. Honestly, running native didn't seem worth the trade-off to me: I couldn't consistently hit 120 FPS at native resolution, while with DLSS enabled the visual fidelity remained nearly identical.
When it comes to the in-game HDR implementation, I agree that it could have been done a bit better in this case. That's where Nvidia filters come in handy: I simply disable the in-game HDR and use the RTX HDR filter through the Nvidia overlay instead.
The difference is significant, and it offers a much better HDR experience than the in-game option in most games. If you have a supported Nvidia card, I highly recommend this approach. Not only does it greatly enhance HDR visual quality, but it also gives you far more control to fine-tune it exactly how you want and save it as a preset.
I actually prefer this method in most games, even when they have a decent HDR implementation. Since my preset filters are already saved and configured, it’s incredibly easy to apply them across multiple games.
Yeah, HDR makes a significant difference for me as well; I couldn't imagine not using it on an OLED display. You can now even enable RTX HDR and quality upscaling just for watching non-HDR YouTube videos or whatever else.
I also never thought HDR was any good until I started using an OLED display, but now if I ever turn it off I can immediately tell the difference.
You think it makes a difference because the brightness is cranked up, but it's bad. If your monitor is IPS, any backlight bleed is now 100% worse, there are too few dimming zones so it flickers, and the native contrast is low.