You really don't know how much I enjoy doing that. I can't sign their freaking NDA as an Open Source developer, but I see nothing wrong with reverse engineering their stuff -- well, right up until I get a cease and desist. But I've flown well enough under the radar for now not to worry :)
EDIT: Though it might come down to using the registry, depending on what's exposed in the API itself or in the drivers, beyond reg-tweaking the setting into working order. :)
Ah, and I see it's already covered in the opening post that dithering is active under Linux. The Linux driver is probably a very separate branch, though, and might not have the same features as the Windows driver, but at least the registry tweak still works for users wanting to utilize dithering (see the sketch below).
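Just to make the "registry tweak" part concrete, here's roughly what writing one of those driver values looks like with the plain Win32 registry API. The class GUID is the standard display-adapter class; the value name "DitherState" and the value 1 are placeholders, since the real names and values are whatever NVWMI writes on a given system, so dump them first:

```cpp
// Minimal sketch: writing a display-driver registry value with Win32.
// {4d36e968-...} is the standard display-adapter class GUID; "0000" is
// the first adapter instance. "DitherState" is a PLACEHOLDER name --
// use whatever NVWMI actually created on your machine.
#include <windows.h>
#include <stdio.h>

int main()
{
    const wchar_t *keyPath =
        L"SYSTEM\\CurrentControlSet\\Control\\Class\\"
        L"{4d36e968-e325-11ce-bfc1-08002be10318}\\0000";

    HKEY key;
    LONG rc = RegOpenKeyExW(HKEY_LOCAL_MACHINE, keyPath, 0,
                            KEY_SET_VALUE | KEY_WOW64_64KEY, &key);
    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "RegOpenKeyExW failed: %ld (run elevated?)\n", rc);
        return 1;
    }

    DWORD value = 1; // hypothetical "enabled" value
    rc = RegSetValueExW(key, L"DitherState", 0, REG_DWORD,
                        (const BYTE *)&value, sizeof(value));
    if (rc != ERROR_SUCCESS)
        fprintf(stderr, "RegSetValueExW failed: %ld\n", rc);

    RegCloseKey(key);
    return rc == ERROR_SUCCESS ? 0 : 1;
}
```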
In Guzz' screenshots you can see the "WMI Explorer"; this is part of NVWMI.
HERE[developer.nvidia.com] is the SDK.
HERE[www.nvidia.com] is the driver/toolkit.
NVWMI requires nView, which is Quadro exclusive.
I'm not sure how Guzz got this working on GeForce cards (by somehow spoofing a Quadro device ID?).
Once he got it working he noticed those registry entries and simply tried to use them with a GeForce card... which worked. :)
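For anyone who wants to poke at NVWMI without the Explorer GUI, a bare-bones WMI query from C++ looks like this. The namespace root\CIMV2\NV and the "Gpu" class with its "name" property are from my memory of the NVWMI docs, so treat them as assumptions and browse the namespace first, like in Guzz' screenshots:

```cpp
// Bare-bones WMI query against the NVWMI namespace (assumed to be
// root\CIMV2\NV). Class "Gpu" / property "name" are also assumptions.
#define _WIN32_DCOM
#include <comdef.h>
#include <wbemidl.h>
#include <stdio.h>
#pragma comment(lib, "wbemuuid.lib")
#pragma comment(lib, "ole32.lib")
#pragma comment(lib, "oleaut32.lib")

int main()
{
    CoInitializeEx(NULL, COINIT_MULTITHREADED);
    CoInitializeSecurity(NULL, -1, NULL, NULL, RPC_C_AUTHN_LEVEL_DEFAULT,
                         RPC_C_IMP_LEVEL_IMPERSONATE, NULL, EOAC_NONE, NULL);

    IWbemLocator *locator = NULL;
    CoCreateInstance(CLSID_WbemLocator, NULL, CLSCTX_INPROC_SERVER,
                     IID_IWbemLocator, (void **)&locator);

    IWbemServices *services = NULL;
    if (!locator ||
        FAILED(locator->ConnectServer(_bstr_t(L"root\\CIMV2\\NV"), NULL,
                                      NULL, NULL, 0, NULL, NULL, &services))) {
        fprintf(stderr, "NVWMI namespace not found (no nView/Quadro?)\n");
        return 1;
    }
    CoSetProxyBlanket(services, RPC_C_AUTHN_WINNT, RPC_C_AUTHZ_NONE, NULL,
                      RPC_C_AUTHN_LEVEL_CALL, RPC_C_IMP_LEVEL_IMPERSONATE,
                      NULL, EOAC_NONE);

    IEnumWbemClassObject *results = NULL;
    services->ExecQuery(_bstr_t(L"WQL"), _bstr_t(L"SELECT * FROM Gpu"),
                        WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
                        NULL, &results);

    IWbemClassObject *obj = NULL;
    ULONG count = 0;
    while (results && results->Next(WBEM_INFINITE, 1, &obj, &count) == S_OK) {
        VARIANT name;
        VariantInit(&name);
        if (SUCCEEDED(obj->Get(L"name", 0, &name, NULL, NULL)) &&
            name.vt == VT_BSTR)
            wprintf(L"GPU: %s\n", name.bstrVal);
        VariantClear(&name);
        obj->Release();
    }
    CoUninitialize();
    return 0;
}
```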
Also, in that Nvidia forum thread someone mentioned that the Linux driver uses a higher-bit-depth 3D LUT when doing all the signal processing, whereas the Windows version always clamps to 8 bit and dithers that down to whatever you set in the Nvidia control panel.
I can't verify that last part, but I believe it to be true based on observations made over the years; it's an issue that has been showing up frequently for years in the MadVR thread at doom9 for 10/12-bit display setups.
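A toy illustration of why that clamp matters (this is just the arithmetic, not the driver's actual pipeline): squeezing a smooth 10-bit ramp through an 8-bit bottleneck collapses four input codes into one output code, i.e. banding, unless you dither. Real drivers use smarter ordered/temporal dithering than the crude random threshold below, but the idea is the same:

```cpp
// 10-bit ramp -> 8-bit bottleneck: plain truncation vs. crude dithering.
#include <cstdio>
#include <cstdlib>

int main()
{
    srand(1234);
    for (int v10 = 508; v10 <= 516; ++v10) {
        int truncated = v10 >> 2;                // plain 10 -> 8 bit clamp
        int dithered  = (v10 + (rand() % 4)) >> 2; // random-threshold dither
        printf("10-bit %4d -> clamped %3d, dithered %3d\n",
               v10, truncated, dithered);
    }
    return 0;
}
```

Without dithering, codes 508..511 all land on 127; with it, the output toggles between 127 and 128 so the average level is preserved.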
Since we're talking about APIs and LUTs: this MIGHT be related to the LUT "clamping" mentioned above, but it's unrelated to dithering.
I don't know about other OSes, but Windows 10 has some real issues with OS HDR data levels (Nvidia HDR is fine; I can't check ATi/Radeon cards). This has existed since the first test builds featuring Windows HDR went up and has never been addressed, despite my numerous reports with data.
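To spell out what "data levels" means here: full range maps black/white to 0/255 in 8 bit (0/1023 in 10 bit), while limited "video" range uses 16/235 (64/940). If one side of the chain assumes the wrong convention, blacks get crushed or lifted, which is the kind of mismatch I'm describing. The numbers below are just the standard BT.601/709 level definitions, not anything measured from Windows itself:

```cpp
// Standard full-range vs. limited-range code values at 8 and 10 bit.
#include <cstdio>
#include <initializer_list>

int main()
{
    for (int bits : {8, 10}) {
        int fullMax   = (1 << bits) - 1;
        int loLimited = 16  << (bits - 8); // limited black: 16 / 64
        int hiLimited = 235 << (bits - 8); // limited white: 235 / 940
        printf("%2d-bit: full 0..%d, limited %d..%d\n",
               bits, fullMax, loLimited, hiLimited);
    }
    return 0;
}
```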
I'm not authorized to publicly share the photographs I've taken of this on the office displays (2016-2018 LG OLEDs) that I've been sending to Microsoft, nor am I allowed to take the office camera home... but here are my personal pictures, taken with a Note 8; my LG E6 (2016 OLED) in HDR mode makes it hard to capture this.
https://imgur.com/a/b9w3YgO
How I was checking:
I look at a Granger rainbow pattern (aka horizontal & vertical gradient pattern) and flip through my display's "color filters", which show only the red, green, or blue subpixel data, and I also turn color/chroma off (to check luminance data without color energy... aka black and white).
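If anyone wants to repeat the check, a Granger-style pattern is easy to generate yourself. This is my quick-and-dirty take (hue across, white to saturated to black down, written to an 8-bit PPM you can display fullscreen), not the exact chart from any particular test suite:

```cpp
// Quick Granger-rainbow generator: hue varies horizontally, the vertical
// axis blends white -> fully saturated -> black. Output: granger.ppm.
#include <cstdio>
#include <cmath>

// Hue in [0, 360) to RGB at full saturation and value.
static void hueToRgb(double h, double rgb[3])
{
    double x = 1.0 - std::fabs(std::fmod(h / 60.0, 2.0) - 1.0);
    double r = 0, g = 0, b = 0;
    if      (h <  60) { r = 1; g = x; }
    else if (h < 120) { r = x; g = 1; }
    else if (h < 180) { g = 1; b = x; }
    else if (h < 240) { g = x; b = 1; }
    else if (h < 300) { r = x; b = 1; }
    else              { r = 1; b = x; }
    rgb[0] = r; rgb[1] = g; rgb[2] = b;
}

int main()
{
    const int W = 1024, H = 512;
    FILE *f = fopen("granger.ppm", "wb");
    if (!f) return 1;
    fprintf(f, "P6\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y) {
        double t = (double)y / (H - 1); // 0 = top (white), 1 = bottom (black)
        for (int x = 0; x < W; ++x) {
            double rgb[3];
            hueToRgb(360.0 * x / W, rgb);
            for (int c = 0; c < 3; ++c) {
                // blend toward white in the top half, toward black below
                double v = t < 0.5 ? rgb[c] + (1 - rgb[c]) * (1 - 2 * t)
                                   : rgb[c] * (2 - 2 * t);
                fputc((int)(v * 255.0 + 0.5), f);
            }
        }
    }
    fclose(f);
    return 0;
}
```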
I've only seen this with LG displays, as I don't have access to non-LG HDR displays, but a few others at doom9 have seen the same. So it's not just my display, calibration, or configuration.
It sucked playing FFXV with Windows HDR; I knew it was there and saw it everywhere.
DXGI has a 10-bit gamma ramp with many, many more control points, and there's also LUT support, but I don't have as much experience with that part of the driver. It's rarely used, and that usually means it may or may not work :P
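For reference, the DXGI side is IDXGIOutput::SetGammaControl; the capabilities struct allows up to 1025 float control points (2^10 + 1, hence "10-bit"), versus GDI's fixed 256. A minimal identity-ramp sketch, with output enumeration and error handling trimmed; note it only applies while a swap chain owns the output in exclusive fullscreen:

```cpp
// Loading an identity ramp through the DXGI gamma control path.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

void loadIdentityRamp(IDXGIOutput *output)
{
    DXGI_GAMMA_CONTROL_CAPABILITIES caps = {};
    output->GetGammaControlCapabilities(&caps);

    DXGI_GAMMA_CONTROL gamma = {};
    gamma.Scale  = { 1.0f, 1.0f, 1.0f };
    gamma.Offset = { 0.0f, 0.0f, 0.0f };
    for (UINT i = 0; i < caps.NumGammaControlPoints; ++i) {
        float v = caps.ControlPointPositions[i]; // identity curve
        gamma.GammaCurve[i] = { v, v, v };
    }
    output->SetGammaControl(&gamma);
}
```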
Do you use ICC profiles?
Using Windows to manage ICC profiles will result in lower precision than using something like DisplayCAL's profile loader to manage them.
I don't know the technical details, but if you're curious you might find out why in the code[sourceforge.net].
... linking sourceforge makes me sick now lol.
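For contrast, the legacy GDI path that calibration loaders hit (as far as I know, including DisplayCAL's) is SetDeviceGammaRamp: a fixed 3 x 256 table of 16-bit entries. How a loader computes those 256 entries from the profile's vcgt curve (rounding, interpolation, dithering) is exactly where precision differences between loaders come from. A minimal identity-ramp example:

```cpp
// Loading an identity ramp through the classic GDI gamma ramp.
#include <windows.h>
#pragma comment(lib, "gdi32.lib")

int main()
{
    WORD ramp[3][256];
    for (int ch = 0; ch < 3; ++ch)
        for (int i = 0; i < 256; ++i)
            ramp[ch][i] = (WORD)(i * 257); // identity: 0..65535 in 256 steps

    HDC screen = GetDC(NULL);
    BOOL ok = SetDeviceGammaRamp(screen, ramp);
    ReleaseDC(NULL, screen);
    return ok ? 0 : 1;
}
```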
In any case, there is something happening and I have no idea why (referring to the 10/12-bit depth issues à la the doom9 MadVR thread), but everything seems to point towards Microsoft, as using the Nvidia API seems to make everything work properly (except where the DWM or whatever has control, i.e. anything not running in exclusive fullscreen mode).