Incorrect. 8-bit RGB with dithering is indistinguishable from a native 10-bit signal. Only the source material needs to be 10-bit; the signal to the display only needs to effectively represent the same values. If you have a full-colour-resolution RGB or YCbCr 4:4:4 signal without subsampling, 8-bit with dithering can reproduce all the tonal values of the 10-bit source material.
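A minimal sketch (not from the thread) of why dithering hides quantization banding: a smooth 10-bit ramp truncated to 8 bits shows flat steps, while adding noise before quantization trades the steps for fine grain that averages back to the original values. The ramp length and noise amplitude are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth horizontal ramp at 10-bit precision (codes 0..1023).
ramp_10bit = np.linspace(0, 1023, 4096)

# Plain truncation to 8 bits: four adjacent 10-bit codes collapse into
# one 8-bit code, producing visible flat bands.
truncated = (ramp_10bit / 4).astype(np.uint8)

# Dithered quantization: add +/-0.5 LSB of noise before rounding, so each
# pixel lands on one of the two nearest 8-bit codes with probability
# proportional to the fractional part of the true value.
noise = rng.uniform(-0.5, 0.5, ramp_10bit.shape)
dithered = np.clip(np.round(ramp_10bit / 4 + noise), 0, 255).astype(np.uint8)

# The truncated ramp is piecewise-flat; the dithered ramp recovers the
# original 10-bit values once averaged over a small neighborhood (which
# is roughly what the eye does at viewing distance).
err_trunc = np.abs(truncated.astype(float) * 4 - ramp_10bit).mean()
window = 16
local_avg = np.convolve(dithered.astype(float) * 4,
                        np.ones(window) / window, mode="same")
err_dither = np.abs(local_avg[window:-window] - ramp_10bit[window:-window]).mean()

print(f"mean error, truncated:     {err_trunc:.2f} (out of 1023)")
print(f"mean error, dithered+blur: {err_dither:.2f} (out of 1023)")
```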
I played with 8-bit 444 @ 120 Hz and 10-bit 444 @ 98 Hz. Color banding is always there.
I am using a monitor with DisplayPort; that's why I can run these settings.
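The 120 Hz vs 98 Hz split above is consistent with a fixed link budget: 10-bit 4:4:4 carries 25% more bits per pixel, so the maximum refresh rate drops by roughly the same factor. A back-of-the-envelope sketch, assuming a 3840x2160 panel on DisplayPort 1.4 with reduced blanking (the poster doesn't state a resolution, so all figures here are illustrative):

```python
# Rough payload bandwidth for an uncompressed video mode.
# Assumptions: 4:4:4 (3 samples per pixel) and ~5% blanking overhead
# (CVT reduced blanking); neither comes from the thread.
def gbps(width, height, bits_per_sample, refresh_hz, blanking=1.05):
    samples_per_pixel = 3  # RGB or YCbCr 4:4:4 -- no subsampling
    return (width * height * samples_per_pixel * bits_per_sample
            * refresh_hz * blanking / 1e9)

print(f" 8-bit 444 @ 120 Hz: {gbps(3840, 2160,  8, 120):.1f} Gbit/s")
print(f"10-bit 444 @  98 Hz: {gbps(3840, 2160, 10,  98):.1f} Gbit/s")
print(f"10-bit 444 @ 120 Hz: {gbps(3840, 2160, 10, 120):.1f} Gbit/s")
# The first two land just under the ~25.9 Gbit/s payload of DisplayPort
# 1.4 HBR3; the third overshoots it, which is why the 10-bit mode tops
# out at a lower refresh rate on the same link.
```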
Sure, you can "use" it, meaning it won't make your display freak out, BUT it WILL truncate 10-bit data into an 8-bit package. HDR is natively 10-bit, so there WILL be an inability to hit HDR's highs. Too many people get caught up in the 444 vs 422 debate without understanding the 444 cutoff relative to 422 HDR and its ability to display whites/brightness.
Yeah, none of what you're saying is true, yet you're hitting up the people trying to correct you while claiming they don't understand the technology. It's hilarious! RGB 8-bit with dithering vs YCbCr 4:2:2 10-bit is indistinguishable to the human eye in terms of color gamut representation. On the other hand, 4:2:2 cuts the chroma resolution and accuracy down considerably, which is a totally noticeable hit to fidelity, especially in games.

RGB 8-bit with dithering will provide superior color resolution and accuracy compared to YCbCr 4:2:2 10- or 12-bit on any HDR display (chroma subsampling is garbage for games). Ideally you want to run either RGB 10-bit or YCbCr 4:4:4 10-bit, but you need a fully compliant HDMI 2.1 setup to do that, and displays that support it are few and far between right now, in addition to being expensive. Plus you run into other limitations such as VRR availability (get it together, Sony!).

It's really not worth worrying about right now. Current HDR10 displays using RGB 8-bit with dithering look great if calibrated correctly (white point supporting around 800 nits HDR, which is the default for most games; black point 68-76; and color accuracy in red, green, and blue set with filters).
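A minimal sketch of the 4:2:2 fidelity hit described above: chroma is stored at half horizontal resolution, so one-pixel-wide color detail cannot survive the round trip. The test pattern and variable names are illustrative, not from the thread.

```python
import numpy as np

# One scanline of chroma: alternating strong-red/neutral pixels, i.e.
# 1-pixel-wide colored detail (Cr channel, offset-binary codes 0..255).
cr = np.array([240, 128, 240, 128, 240, 128, 240, 128], dtype=float)

# 4:2:2 keeps one chroma sample per pair of pixels (here: the pair's
# average)...
cr_422 = cr.reshape(-1, 2).mean(axis=1)

# ...and the display reconstructs by repeating each kept sample.
cr_reconstructed = np.repeat(cr_422, 2)

print("original   :", cr)              # [240 128 240 128 ...]
print("after 4:2:2:", cr_reconstructed)  # [184 184 184 184 ...] -- detail gone
```

The luma channel is untouched, which is why 4:2:2 is tolerable for film but smears fine colored detail like text and UI elements in games.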