Special K - "Kaldaien's Mod" Special☆K
May 23, 2016
All Discussions > Development > Topic Details
Kamikaze Ice Jan 14, 2019 @ 10:28pm
Dithering working on GeForce class cards
I know this doesn't really have anything to do with Special K, but I figured some of you would be interested in this nonetheless. I just stumbled across this during my yearly "does we has dither" check on the Nvidia forums.
IT WORKS. Finally. First Tales of Vesperia, now driver dithering.

Before I start I'd like to state that dithering is a GOOD thing... when done properly. Proper dithering will not be visible at any practical and realistic viewing distance (it becomes visible just before the screen door effect does, but no one sits that close).
Also, this is not the same thing as dithering via ReShade; it's more comparable to madVR's error diffusion + deband low, but without that performance hit.
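Conceptually (this is just a toy Python sketch of the general idea, not anything from the driver): plain quantization snaps an in-between value to the same wrong step every time, which is what you see as banding, while dithering adds sub-step noise before quantizing so the error averages out instead of lining up.

```python
import random

def quantize(x, levels):
    """Snap x in [0, 1] to the nearest of `levels` evenly spaced steps."""
    step = 1.0 / (levels - 1)
    return round(x / step) * step

def dither_quantize(x, levels, rng):
    """Add sub-step noise before quantizing so the error averages out."""
    step = 1.0 / (levels - 1)
    noisy = min(max(x + rng.uniform(-step / 2, step / 2), 0.0), 1.0)
    return quantize(noisy, levels)

rng = random.Random(0)
x = 0.5  # falls between the 4-level steps 1/3 and 2/3
plain = [quantize(x, 4) for _ in range(10000)]
dithered = [dither_quantize(x, 4, rng) for _ in range(10000)]

# Plain quantization is stuck on one step; the dithered average recovers x.
print(sum(plain) / len(plain), sum(dithered) / len(dithered))
```

Over many samples (or frames), the eye averages the dithered output back to the original value; that's the whole trick.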

*Pre-posting edit*
I wrote up a lengthy soapbox post trying to explain what proper dithering is (again, emphasis on proper) and why it's never a bad thing.
Luckily for those reading this, I scrapped it. :)

If you're really interested, I encourage you to research it yourself (one resource: AVS Forum).

So, dithering has been in Nvidia's Linux drivers for a long time for both Quadro and GeForce cards, but in the Windows drivers only for Quadro cards (ATi/AMD cards have had dithering since the AGP era).
Users have been asking Nvidia to enable this for GeForce cards for YEARS. In early 2018 we finally got a response from ManuelG, an Nvidia rep, on the Guru3D forums, which I'll quote below.

Originally posted by ManuelG:
Thank you for your feedback. Monitors from the past 5-7 years will all accept 8bpc input, even TN panels. Adding temporal dithering toggle for Windows would not help any current monitors. At this time we do not have plans to add this feature. I am sorry for the inconvenience this may cause.
His post: LINK[]

Emphasis mine. This is simply not true.
It does not matter what bit depth your display is, nor what bit depth is sent to it.
All displays do their own dithering; it's integral to how a display internally processes the signal it receives.
It varies by display, but the chain typically goes something like this:
Signal received >
Colorspace conversion >
Legal/data range expansion >
Display's internal tone response curve decodes the gamma/EOTF (the signal becomes linear light/perceptual/equal energy) >
*Internal processing happens (applying brightness/contrast/etc.) >
Inverse tone response curve is applied (the gamma/EOTF encoding is restored) >
Convert to RGB >
Dither down to the panel hardware's capability (bit depth per color, not per pixel) >
Controller >
What's on screen
*Always at a higher bit depth than the signal and/or panel hardware.
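That asterisked point is the key one: the math happens at higher precision, and precision is only dropped once, at the dither stage. A toy Python sketch (my own illustration with made-up stages, not any display's actual pipeline) of why that ordering matters: re-quantizing to panel precision after every processing stage collapses a smooth ramp into far fewer distinct levels (banding), while processing in float and quantizing once at the end preserves them.

```python
BITS = 6
LEVELS = (1 << BITS) - 1  # 63: highest 6-bit code value

def quantize(x):
    """Snap a [0, 1] value to the nearest 6-bit code value."""
    return round(x * LEVELS) / LEVELS

# Two made-up processing stages (a brightness adjustment and its inverse)
def darken(x):
    return x / 3

def brighten(x):
    return min(x * 3, 1.0)

ramp = [i / LEVELS for i in range(LEVELS + 1)]  # a smooth 64-step gradient

# Bad: snap back to panel precision after every stage
naive = [quantize(brighten(quantize(darken(x)))) for x in ramp]

# What the chain above does: process at higher precision, quantize once
precise = [quantize(brighten(darken(x))) for x in ramp]

# The naive path merges many input levels into the same output (banding)
print(len(set(naive)), len(set(precise)))
```

The single-quantization path keeps all 64 input levels distinct; the stage-by-stage path throws most of them away, and no amount of dithering at the end can bring them back.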

In short, all displays dither, but not all of them do it properly, for various reasons (e.g. TN panel displays).
You will either benefit from this, or see no difference at all.
If you see no difference with any of the options, then congratulations: your display already dithers properly.
Any banding/contouring/posterization you still see is caused either by your display's calibration (common for TVs and budget/entry-level PC monitors) or, most likely, by the source content.

There is no "best". I know it's tedious, but you really should try all of the 8- and 10-bit options to find what works best for your display.

In my case, my LG E6 (2016 OLED) does not handle 12-bit signals well (or 4:2:2 signals), so I keep it at 8-bit, which makes any banding in source content very obvious.
I have to use the PC icon mode, as it's the only way to get 4:4:4 chroma and the lowest input latency while keeping full calibration controls (those aren't available in the game modes, and latency here is just as low).
I have my own calibration equipment.
What works best for me is Temporal and dynamic 2x2.

All credit to Guzz, a user on Nvidia's forums, whom I shall now quote:

Originally posted by Guzz:
I managed to unlock all classes in NVWMI, so I checked how dithering works.
It's not true that dithering is completely disabled; by default the driver controls the dither state, and it depends on the output color depth, color format and dynamic range.
For example, with "Full" dynamic range dithering is disabled, but with "Limited" range dithering is enabled in "Temporal" mode.

I was right about "DitherRegistryKey"; this key controls dithering.

How to enable/disable dithering:

Export your registry settings from:

Add one of the following settings into your exported registry file:

ditherState - Enabled; ditherBits - 6 bit; ditherMode - SpatialDynamic
ditherState - Enabled; ditherBits - 6 bit; ditherMode - SpatialStatic
ditherState - Enabled; ditherBits - 6 bit; ditherMode - SpatialDynamic2x2
ditherState - Enabled; ditherBits - 6 bit; ditherMode - SpatialStatic2x2
ditherState - Enabled; ditherBits - 6 bit; ditherMode - Temporal
ditherState - Enabled; ditherBits - 8 bit; ditherMode - SpatialDynamic
ditherState - Enabled; ditherBits - 8 bit; ditherMode - SpatialStatic
ditherState - Enabled; ditherBits - 8 bit; ditherMode - SpatialDynamic2x2
ditherState - Enabled; ditherBits - 8 bit; ditherMode - SpatialStatic2x2
ditherState - Enabled; ditherBits - 8 bit; ditherMode - Temporal
ditherState - Enabled; ditherBits - 10 bit; ditherMode - SpatialDynamic
ditherState - Enabled; ditherBits - 10 bit; ditherMode - SpatialStatic
ditherState - Enabled; ditherBits - 10 bit; ditherMode - SpatialDynamic2x2
ditherState - Enabled; ditherBits - 10 bit; ditherMode - SpatialStatic2x2
ditherState - Enabled; ditherBits - 10 bit; ditherMode - Temporal
ditherState - Default
ditherState - Disabled

Import your new registry settings
Reboot or restart video driver using restart64.exe - ToastyX Custom Resolution Utility[]
His post: LINK[]

Descriptions as per the Nvidia Linux documentation:

Static-2x2: A 2x2 dithering pattern remains constant across frames
Dynamic-2x2: The 2x2 dithering pattern is updated every frame
Temporal: A pseudo-random dithering algorithm
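Nvidia doesn't document what the Windows driver actually does for each mode, but the names map onto textbook techniques. Here's a toy Python interpretation of those three descriptions (everything in it is my own guess: the classic 2x2 Bayer thresholds, the per-frame shift used for "dynamic", all of it):

```python
import random

# Classic 2x2 Bayer ordered-dither thresholds, as fractions of one
# quantization step (an assumption; the driver's real pattern isn't public).
BAYER_2X2 = [[0.25, 0.75],
             [1.00, 0.50]]

def ordered_dither(frame, levels, frame_index, dynamic):
    """Quantize `frame` (rows of floats in [0, 1]) with a 2x2 threshold tile.

    Static-2x2:  same pattern every frame (dynamic=False).
    Dynamic-2x2: pattern shifted each frame (dynamic=True) -- one plausible
                 reading of "updated every frame".
    """
    step = 1.0 / (levels - 1)
    shift = frame_index % 2 if dynamic else 0
    out = []
    for y, row in enumerate(frame):
        out_row = []
        for x, v in enumerate(row):
            t = BAYER_2X2[y % 2][(x + shift) % 2]
            base = int(v / step)    # step below the value
            frac = v / step - base  # position between steps
            q = base + (1 if frac >= t else 0)
            out_row.append(min(q, levels - 1) * step)
        out.append(out_row)
    return out

def temporal_dither(frame, levels, rng):
    """Quantize with a fresh pseudo-random threshold per pixel per frame."""
    step = 1.0 / (levels - 1)
    out = []
    for row in frame:
        out_row = []
        for v in row:
            base = int(v / step)
            frac = v / step - base
            q = base + (1 if frac >= rng.random() else 0)
            out_row.append(min(q, levels - 1) * step)
        out.append(out_row)
    return out

# A flat mid-gray frame quantized to 4 output levels:
frame = [[0.5] * 4 for _ in range(4)]
rng = random.Random(0)
static_a = ordered_dither(frame, 4, 0, dynamic=False)
static_b = ordered_dither(frame, 4, 1, dynamic=False)
dynamic_a = ordered_dither(frame, 4, 0, dynamic=True)
dynamic_b = ordered_dither(frame, 4, 1, dynamic=True)
temporal_a = temporal_dither(frame, 4, rng)
temporal_b = temporal_dither(frame, 4, rng)
print(static_a == static_b, dynamic_a == dynamic_b, temporal_a == temporal_b)
```

The static pattern repeats identically every frame (a fixed checker of steps), the dynamic one moves, and temporal produces frame-to-frame noise, which is why temporal tends to be the least visible at normal viewing distance.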
Last edited by Kamikaze Ice; Jan 14, 2019 @ 10:33pm
Showing 1-5 of 5 comments
Kaldaieℵ₀ Jan 14, 2019 @ 10:44pm 
Hmm..... any chance there's an actual portion of NvAPI that covers this stuff? I feel weird about silly registry tweaks. But I wouldn't mind reverse engineering the NDA-protected NvAPI library, hahahahahah.

You really don't know how much I enjoy doing that. I can't sign their freaking NDA as an open-source developer, but I see nothing wrong with reverse engineering their stuff -- well, right up until I get a cease and desist. But I've flown well enough under the radar so far not to worry :)
JBeckman Jan 15, 2019 @ 1:57am 
Linux supports it, and I think the Quadro cards have support for it (and some other things), so there should be some method they use to toggle dithering. It's one of those weird differences between AMD and NVIDIA, I suppose: NVIDIA doesn't dither, at least on the Windows desktop, and there are complaints about that since dithering has its uses; AMD always dithers, and there are complaints about that too, since disabling it also has its uses. :D

EDIT: Though it might come down to using the registry, depending on what's exposed in the API itself or in the drivers, outside of reg-tweaking the setting into working order. :)

Ah, and I see the opening post already covers that dithering is active under Linux. The Linux driver is probably a very separate branch and might not have the same features as the Windows driver, but at least the registry tweak still works for users wanting to utilize dithering.
Last edited by JBeckman; Jan 15, 2019 @ 2:05am
Kamikaze Ice Jan 15, 2019 @ 6:21pm 
I don't know. Perhaps this will help?
In Guzz's screenshots you can see the "WMI Explorer"; this is part of NVWMI.
HERE[] is the SDK.
HERE[] is the driver/tool kit.
NVWMI requires nView, which is Quadro-exclusive.

I'm not sure how Guzz got this working on GeForce cards (did he somehow spoof a Quadro device ID?).
Once he had it working, he noticed those registry entries and simply tried using them with a GeForce card... which worked. :)

Also, in that Nvidia forum thread someone mentioned that the Linux driver uses some higher-bit-depth 3D LUT when doing all the signal processing, whereas the Windows version always clamps to 8-bit and dithers that down to whatever you set in the Nvidia control panel.
I can't verify that last part, but I believe it to be true based on observations made over the years; it's an issue that has been showing up for years in the madVR thread at Doom9 for 10/12-bit display setups.

Since we're talking about APIs and LUTs: this MIGHT be related to the LUT "clamping" mentioned above, but it's unrelated to dithering.
I don't know about other OSes, but Windows 10 has some real issues with OS HDR data levels (Nvidia HDR is fine; I can't check ATi/Radeon cards). This has existed since the first test builds featuring Windows HDR went up, and it has never been addressed despite my numerous reports with data.

I'm not authorized to publicly share the photographs of this that I've been sending to Microsoft, taken on the office displays (2016-2018 LG OLEDs), nor am I allowed to take the office camera home... but here are my personal pictures taken with a Note 8; my LG E6 (2016 OLED) in HDR mode makes this hard to capture.

How I was checking:
I look at a Granger rainbow pattern (aka horizontal & vertical gradient patterns) and flip through my display's "color filters", which show only the red, green or blue subpixel data, and I also turn color/chroma off (to check the luminance data without color energy... aka black and white).

I've only seen this on LG displays, as I don't have access to non-LG HDR displays, but a few others at Doom9 have seen the same. So it's not just my display, calibration or configuration.
It sucked playing FFXV with Windows HDR; I knew it was there and saw it everywhere.
Kaldaieℵ₀ Jan 15, 2019 @ 8:00pm 
That's not entirely accurate. Beginning with Windows Vista, we got a 10-bit LUT as a replacement for the old 16-bit gamma ramp. That sounds like a downgrade, but the fact of the matter is that the gamma ramp exposed by GDI _was_ 16-bit, yet nothing could effectively use that range.

DXGI has a 10-bit gamma ramp with many, many more control points and there's also LUT support, but I don't have as much experience with that part of the driver. It's rarely used, so that usually means it may or may not work :P
Kamikaze Ice Jan 16, 2019 @ 3:37pm 
I think I know what the "clamping" statement was referring to now: how some people are using ICC profiles.

Do you use ICC profiles?
Using Windows to manage ICC profiles will result in lower precision than using something like DisplayCAL's profile loader to manage them.
I don't know the technical details, but if you're curious you might find out why in the code[].
... linking sourceforge makes me sick now lol.

In any case, there is something happening and I have no idea why (referring to the 10/12-bit depth issues à la the Doom9 madVR thread), but everything seems to point toward Microsoft, since using the Nvidia API seems to make everything work properly (except where the DWM or whatever has control, i.e. anything not running in exclusive fullscreen mode).