I would go even further with that statement. I think 8-bit using madVR (which processes the video at 16-bit) has less banding than a "10-bit" display that is using 10-bit processing.
Actually, I don't just think it, I'm sure of it, as I've done direct comparisons on the same scenes using the built-in decoder (doing 10-bit) vs. madVR using only 8-bit in Windows.
Some "high end" televisions, such as Sony's OLEDs, use 16-bit processing for this reason. And I think Dolby Vision also does higher than 10-bit processing on its content.
I assume you mean expand the color gamut, but that is, by definition, not SDR.
Also, don't you think it would suck to render colors way outside the capabilities of your display? You'd just get a bunch of colors that look the same at Rec. 2020 unless you get a display that has full Rec. 2020 coverage.
I can definitely expand the gamut to Rec. 2020, but you won't like the results. I use scRGB for these mods because it doesn't require metadata and its primary color points are exactly sRGB's, which all display devices fully cover :)
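For anyone curious how scRGB pulls that off: it stores pixels as linear floating-point values with the sRGB primaries, so 0.0-1.0 is the ordinary sRGB range and anything above 1.0 is an HDR highlight. Since the primaries never change, no gamut metadata is needed. A rough sketch of the convention (the nit figures assume the usual 80-nit SDR reference white):

```python
SDR_WHITE_NITS = 80.0  # assumed reference level mapped to scRGB's 1.0

def srgb_to_linear(c):
    """sRGB EOTF: decode a [0,1] gamma-encoded sRGB value to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def scrgb_to_nits(v):
    """scRGB is linear light with sRGB primaries; 1.0 = SDR white, >1.0 = HDR."""
    return v * SDR_WHITE_NITS

print(scrgb_to_nits(srgb_to_linear(1.0)))  # 80.0   -> SDR reference white
print(scrgb_to_nits(12.5))                 # 1000.0 -> an HDR highlight
```

Because the values are float16 in practice, precision near black is effectively far better than any fixed 10-bit encoding.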
What do you mean by remapping fixes? If you're referring to the WASD <--> Arrow Keys and Escape -> Backspace, those things are already in the released version and you shouldn't be waiting :P
To some extent for some publishers, you might have a point.
However I've done enough of these fixes now to know it's down to developer inexperience. You gotta get the base game ported over before adding extra features. Even getting the base game ported usually has issues.
I'm really nothing special, I just work with 20+ games a year and know the common pitfalls. If I could transfer my experience over and not have to modify things in games, I'd do it in a heartbeat :)
But if you're on an NVIDIA GPU, my OSD can show you your GPU's memory bandwidth usage in real time. The default keybind is Ctrl+Shift+G.
10-bit has no intrinsic Wide Color Gamut. It can certainly accommodate a WCG with less banding, but it's also really nice to get enhanced precision within the game's normal colorspace. 10-bit isn't nearly precise enough to cover Rec. 2020 or DCI-P3 without visible banding, but it really makes things look better in the original sRGB gamut.
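To put rough numbers on that precision gain within the sRGB range: take a shallow gradient covering, say, 5% of the signal range and count how many distinct code values it can occupy at each depth. A quick back-of-the-envelope sketch (the `codes_used` helper is just for illustration):

```python
def codes_used(lo, hi, bits):
    """Count distinct integer code values a [lo, hi] signal range occupies."""
    levels = (1 << bits) - 1
    return int(round(hi * levels)) - int(round(lo * levels)) + 1

# A dark, shallow gradient (e.g. a night-sky backdrop spanning 5% of range):
print(codes_used(0.10, 0.15, 8))   # 13 steps at 8-bit  -> visible banding
print(codes_used(0.10, 0.15, 10))  # 52 steps at 10-bit -> 4x finer gradation
```

Same gradient, same gamut, four times the steps — that's the "better in the original sRGB gamut" part, no WCG involved.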
It could also be that your display isn't actually capable of processing the signal at 30-bit color (10 bits per channel) :)
Oh, I use my 40" 4K TV as a desktop, so I'm very familiar with the artifacts of color compression. And I agree, it's not suitable for desktop use (which is why I think this is very significant!)
Anyway, I've made comparisons between running 4K 30Hz (12-bit in the NVIDIA color settings) and 4K 60Hz (using 8-bit RGB) and the game looks identical from a PQ perspective.
You should give it a try:
Start the game like you normally would at 98Hz on your monitor; then alt+tab and switch the refresh rate to 144Hz and just maximize the game again.
I'm pretty sure you'll prefer the added FPS potential, since the game looks identical (then again, you'd probably have to lower the settings and resolution too much for your taste. This game would be great with a working SLI profile).
Oh, it can. On an older Windows 10 version (that didn't even support HDR on the desktop) and older NVIDIA drivers, I had no problem doing 4K 60Hz using chroma subsampling at 12-bit.
Not sure what the reason behind it is, since there's enough bandwidth for 10-bit using color compression. (And again, it used to work fine last year, when the first HDR games started to come out.)
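The arithmetic behind why 4K 60Hz is so tight: the standard 4K60 timing runs a 594 MHz pixel clock, and HDMI 2.0's TMDS link tops out at a 600 MHz character rate. Deep color stretches the clock proportionally, while 4:2:0 subsampling halves it — which is exactly why 8-bit RGB squeaks through, 10-bit RGB doesn't, and subsampled 10/12-bit does. A simplified model (constants assume the CTA-861 4K60 timing totals; real link budgets involve more than this):

```python
# Assumed CTA-861 4K60 timing: 4400 x 2250 total pixels at 60 Hz = 594 MHz.
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60   # 594_000_000
TMDS_LIMIT = 600_000_000              # HDMI 2.0 max TMDS character rate

def tmds_rate(pixel_clock, bits_per_channel, chroma="4:4:4"):
    """Effective TMDS character rate for a depth/subsampling combo (simplified)."""
    depth_factor = bits_per_channel / 8          # deep color stretches the clock
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 1.0, "4:2:0": 0.5}[chroma]
    if chroma == "4:2:2":
        depth_factor = 1.0  # HDMI 4:2:2 carries 12-bit containers at base clock
    return pixel_clock * depth_factor * chroma_factor

print(tmds_rate(PIXEL_CLOCK_4K60, 8)  <= TMDS_LIMIT)           # True:  8-bit RGB fits
print(tmds_rate(PIXEL_CLOCK_4K60, 10) <= TMDS_LIMIT)           # False: 10-bit RGB doesn't
print(tmds_rate(PIXEL_CLOCK_4K60, 10, "4:2:0") <= TMDS_LIMIT)  # True:  subsampled fits
```

So the hardware genuinely can't do 4K 60Hz 10-bit RGB over HDMI 2.0; the driver's only options are dropping to 8-bit or subsampling the chroma.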
Just by glimpsing my monitor's info section, it becomes clear that this game is requesting the highest available refresh rate, which causes the game to slip into chroma subsampling. A refresh rate override may allow proper display.
----
Yeah, that fixed it on my end.
https://imgur.com/a/VP9kGcs
First turn off NVIDIA's stupid highest refresh setting, since that is leading to these problems. Then force an override in Special K to whatever refresh rate you want that still supports 10-bit color without subsampling.
1. Yes, but it's not just for this game
2. 1) sort of answers this :P
I've tested this stuff in various games with differing results. Ideally, when adding HDR to a game you want to keep the UI at a constant brightness, and that requires game-by-game tuning. But I haven't even done that here -- the UI and scene both have brightness applied the same way.
The only reason I haven't made progress on this yet is because I was expecting important reading material on Friday. It was delivered yesterday instead :-\ I would have had plenty of time to do this over the weekend.
https://steamcommunity.com/sharedfiles/filedetails/?id=1509912132
Make sure your display is set to full-range and something other than 8-bit color. Because you've managed to lose image detail rather than gain any.
If you load the game in 8-bit, it won't start properly: the highlights will be blown and banding will be abundant. Try the method I described; it works flawlessly.
Again: if you use the method I describe and load the game in 10- or 12-bit (4K 30Hz, for example), then switch to the highest refresh rate possible (which will be 4K 60Hz over HDMI) by alt+tabbing and doing it in the NVIDIA or Windows control panel, you still get correct HDR, with no banding and with the benefit of no chroma subsampling.
If there was a way for the game to start displaying HDR properly in 8 bits, my work around wouldn't be necessary.