STAR WARS Jedi: Fallen Order™

Hito Nov 22, 2019 @ 12:44am
HDR: 8bit+Dither or 10bit+422
Hi y'all - this is still only my 3rd or 4th game with HDR, and while I think I understand the theory and mechanics of what's happening, I want to know what you have all found works better on PC at the end of the day!

I have a GTX 980 Ti connected via an HDMI 2.0 cable to my Samsung KS8000 4K 10-bit TV.

The way I understand it, I get HDR working with a combo of game settings, Nvidia Control Panel (NVCP), and Windows settings. For all configs, Windows is set to enable HDR and the in-game slider is set to HDR ON. The NVCP-specific configs are:
1. RGB Full / 8-bit / 4:4:4 / 4K / 60 Hz [Dithering is automatically implemented at this point, I assume?]
2. YCbCr Limited / 10-bit / 4:2:2 / 4K / 60 Hz

For those versed in HDR, what have you found works better for this game, or for most games in general? I can tell how gross text becomes on my desktop when I go to YCbCr 4:2:2, but in game everything looks very vibrant (albeit kinda muddy, like a painting without sharp edges). With RGB it seems like the colors aren't as great, but the image appears crisper (probably because of 4:4:4).

Your thoughts?
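For a rough sense of why those end up being the two realistic NVCP choices at 4K 60 Hz over HDMI 2.0, here is a back-of-envelope Python sketch. The 594 MHz pixel clock and the 14.4 Gbit/s effective payload are nominal CTA/HDMI figures and the bits-per-pixel averages are idealized, so treat the results as ballpark only.

```python
# Back-of-envelope check of which 4K 60 Hz HDR pixel formats fit through an
# HDMI 2.0 link. The 594 MHz pixel clock (CTA 4K60 timing incl. blanking) and
# the 14.4 Gbit/s effective payload (18 Gbit/s TMDS minus 8b/10b overhead) are
# nominal figures; the bits-per-pixel averages below are idealized.

HDMI20_PAYLOAD_BPS = 14.4e9      # effective data rate of an HDMI 2.0 link
PIXEL_CLOCK_4K60 = 594e6         # pixels per second, including blanking

def bits_per_pixel(bpc, subsampling):
    """Average payload bits per transmitted pixel for a given chroma format.

    Note: HDMI actually carries 4:2:2 in a fixed 24-bit-per-pixel container up
    to 12 bpc, which is why 10/12-bit 4:2:2 costs no more link bandwidth than
    8-bit 4:4:4. The averages here are the idealized sample sizes.
    """
    samples_per_pixel = {"444": 3.0, "422": 2.0, "420": 1.5}[subsampling]
    return bpc * samples_per_pixel

modes = [
    ("RGB / YCbCr 4:4:4", 8,  "444"),
    ("RGB / YCbCr 4:4:4", 10, "444"),
    ("YCbCr 4:2:2",       10, "422"),
    ("YCbCr 4:2:2",       12, "422"),
]

for name, bpc, sub in modes:
    rate = PIXEL_CLOCK_4K60 * bits_per_pixel(bpc, sub)
    verdict = "fits" if rate <= HDMI20_PAYLOAD_BPS else "exceeds HDMI 2.0"
    print(f"{name} {bpc}-bit @ 4K60: {rate / 1e9:5.2f} Gbit/s -> {verdict}")
```

With those assumptions, 8-bit 4:4:4 just squeezes under the limit while 10-bit 4:4:4 does not, which is why the driver only offers 4:2:2 once you ask for 10 or 12 bits at 4K 60 Hz.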
TopperHarley Nov 22, 2019 @ 3:49am 
I find RGB Full 8bpc in Nvidia the best looking. I have a 50" LG 4K 7450 TV and I always have to tinker with the picture settings in Game Mode for every game. Bit of a pain, but it's worth it.
Khodeus™ Nov 22, 2019 @ 4:14am 
IMHO, it still looks better without HDR.
Cotivity Nov 22, 2019 @ 9:33am 
The most important thing with HDR that people don't understand is that it NEEDS 10-bit 4:2:2... You can use 8-bit 4:4:4, BUT you'll be limited in how bright it can get, thereby negating much of what makes HDR shine (pun intended). If you have a display capable of 12-bit, you can get back some of the color richness lost with 4:2:2 as a half measure. In summary, use 10/12-bit 4:2:2.
Monstieur Jul 11, 2020 @ 4:15am 
Originally posted by Cotivity:
The most important thing with HDR that people don't understand is that it NEEDS 10-bit 4:2:2... You can use 8-bit 4:4:4, BUT you'll be limited in how bright it can get, thereby negating much of what makes HDR shine (pun intended). If you have a display capable of 12-bit, you can get back some of the color richness lost with 4:2:2 as a half measure. In summary, use 10/12-bit 4:2:2.

Incorrect. 8-bit RGB with dithering is indistinguishable from a native 10-bit signal. Only the source material needs to be 10-bit; the signal to the display only needs to effectively represent the same values. If you have a full-colour-resolution RGB or YCbCr 4:4:4 signal without subsampling, 8-bit with dithering can reproduce the full tonal range of the 10-bit source material.
Last edited by Monstieur; Jul 11, 2020 @ 4:15am
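To put a number on the dithering argument above, here is a minimal numpy sketch (the ramp, frame count, and noise amplitude are illustration values, not anything the driver actually does): it quantizes a smooth 10-bit gradient to 8 bits with and without temporal dithering and measures how far each result strays from the source.

```python
# A minimal sketch: temporal dithering toggles each pixel between the two
# nearest 8-bit codes so that, averaged over a few frames (or by the eye),
# the original 10-bit value is recovered.
import numpy as np

rng = np.random.default_rng(0)

# A smooth 10-bit horizontal ramp (0..1023), e.g. a dark-sky gradient.
ramp_10bit = np.linspace(0, 1023, 3840)

def quantize_8bit(values_10bit):
    """Map 10-bit values to the nearest 8-bit code and back to the 10-bit scale."""
    codes = np.clip(np.round(values_10bit / 4), 0, 255)
    return codes * 4

# Plain 8-bit: every four adjacent 10-bit values collapse onto one code,
# which is what shows up on screen as banding in smooth gradients.
plain = quantize_8bit(ramp_10bit)

# Temporal dithering: add +/- half an 8-bit step of noise, independently per
# frame, then quantize. The per-frame images differ, but their average does not.
frames = np.stack([
    quantize_8bit(ramp_10bit + rng.uniform(-2, 2, ramp_10bit.shape))
    for _ in range(60)          # one second's worth of frames at 60 Hz
])
dithered_avg = frames.mean(axis=0)

err_plain = np.abs(plain - ramp_10bit)
err_dither = np.abs(dithered_avg - ramp_10bit)
print("plain 8-bit     mean / max error:", err_plain.mean().round(2), err_plain.max().round(2))
print("dithered 8-bit  mean / max error:", err_dither.mean().round(2), err_dither.max().round(2))
```

The plain 8-bit version is off by up to two 10-bit steps in a stair-step pattern, which is what reads as banding, while the dithered frames average back to within a fraction of a step and leave only noise-like residual error.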
Cotivity Jul 13, 2020 @ 12:44am 
So 8-bit, 10-bit, and 12-bit have zero applications, huh? You CLEARLY don't understand the technology. You're quite out of your depth here...
Peh Jul 13, 2020 @ 1:00am 
You can use 8-bit with HDR enabled just fine. You still reach the highlights in brightness. What you notice is the color banding. But so far, no game I've tried supports a 10-bit color gamut.

I played with 8-bit 4:4:4 @ 120 Hz and 10-bit 4:4:4 @ 98 Hz. Color banding is always there.

I am using a monitor with DisplayPort. That's why I can go for these settings.
Last edited by Peh; Jul 13, 2020 @ 1:05am
Cotivity Jul 13, 2020 @ 11:55pm 
Originally posted by Peh:
You can use 8-bit with HDR enabled just fine. You still reach the highlights in brightness. What you notice is the color banding. But so far, no game I've tried supports a 10-bit color gamut.

I played with 8-bit 4:4:4 @ 120 Hz and 10-bit 4:4:4 @ 98 Hz. Color banding is always there.

I am using a monitor with DisplayPort. That's why I can go for these settings.

Sure, you can "use" it... meaning it won't make your display freak out, BUT it WILL truncate 10-bit data into an 8-bit package... HDR is natively 10-bit, so there WILL be an inability to hit HDR's highs... too many get caught up in the 4:4:4 vs 4:2:2 debate without understanding the 4:4:4 cutoff with respect to 4:2:2 HDR and its ability to display whites/brightness.
L337fool Apr 19, 2021 @ 3:23pm 
Originally posted by Cotivity:
Originally posted by Peh:
You can use 8-bit with HDR enabled just fine. You still reach the highlights in brightness. What you notice is the color banding. But so far, no game I've tried supports a 10-bit color gamut.

I played with 8-bit 4:4:4 @ 120 Hz and 10-bit 4:4:4 @ 98 Hz. Color banding is always there.

I am using a monitor with DisplayPort. That's why I can go for these settings.

Sure, you can "use" it... meaning it won't make your display freak out, BUT it WILL truncate 10-bit data into an 8-bit package... HDR is natively 10-bit, so there WILL be an inability to hit HDR's highs... too many get caught up in the 4:4:4 vs 4:2:2 debate without understanding the 4:4:4 cutoff with respect to 4:2:2 HDR and its ability to display whites/brightness.

Yeah, none of what you are saying is true, yet you are hitting up the people trying to correct you and claiming they don't understand the technology. It's hilarious! RGB 8-bit with dithering vs YCbCr 4:2:2 10-bit is indistinguishable to the human eye in terms of color gamut representation. On the other hand, 4:2:2 cuts the chroma resolution and accuracy down considerably and is a totally noticeable hit on fidelity, especially in games. RGB 8-bit with dithering will provide superior color resolution and accuracy compared to YCbCr 4:2:2 10- or 12-bit (chroma subsampling is garbage for games) on any HDR display.

Ideally you want to run either RGB 10-bit or YCbCr 4:4:4 10-bit, but you need a fully compliant HDMI 2.1 setup to do that, and displays that support it are few and far between right now, in addition to being expensive. Plus you run into other limitations such as VRR availability (get it together, Sony!). It's really not worth worrying about right now. Current HDR10 displays using RGB 8-bit with dithering look great if calibrated correctly (white point supporting around 800 nits HDR, which is the default for most games, black point at 68-76, and color accuracy in red, green, and blue set with filters).
Last edited by L337fool; Apr 19, 2021 @ 3:24pm
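The "chroma subsampling is garbage for games" point is easy to demonstrate numerically. Below is a rough simulation of 4:2:2 on a single scanline containing a sharp coloured edge (think red UI text on a grey background); the BT.709 conversion is standard, but the test pattern and pixel counts are made up for illustration.

```python
# Rough simulation of 4:2:2 chroma subsampling on a sharp coloured edge:
# luma keeps full resolution, but Cb/Cr are averaged over horizontal pixel
# pairs, so colour bleeds across the edge. BT.709 coefficients assumed.
import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.stack([r, g, b], axis=-1)

# One scanline: 7 grey pixels, then 9 saturated red pixels (values in 0..1),
# so the colour edge falls in the middle of a chroma pair.
scanline = np.concatenate([np.tile([0.5, 0.5, 0.5], (7, 1)),
                           np.tile([1.0, 0.0, 0.0], (9, 1))])

ycc = rgb_to_ycbcr(scanline)

# 4:2:2 = keep Y per pixel, average Cb/Cr over each horizontal pair,
# then repeat the shared chroma sample back out for display.
chroma_pairs = ycc[:, 1:3].reshape(-1, 2, 2).mean(axis=1)
ycc[:, 1:3] = np.repeat(chroma_pairs, 2, axis=0)

reconstructed = ycbcr_to_rgb(ycc)
print("worst per-channel RGB error at the edge:",
      np.abs(reconstructed - scanline).max().round(3))
```

Luma keeps full resolution, so the brightness edge stays sharp, but the shared chroma sample smears colour by a large fraction of full scale across the two pixels straddling the edge, which is exactly the soft, fringed desktop text the OP noticed.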
GnomeToys Jan 29, 2022 @ 3:27pm 
Old thread, but if you really want to compare and you're only getting the YCbCr 4:2:2 10/12-bit options at 4K 60 Hz, drop the refresh rate of the TV to 30 Hz and the higher bit-depth full-range RGB 4:4:4 options should become available. 30 Hz isn't going to be playable in most types of games, but you'll be able to see any difference. On some setups RGB 4:4:4 10-bit might become available at 50 Hz. Alternatively, you can drop the display resolution to 1080p and all color options should be available while HDR remains active. On my 2020 LG that also allows 120 Hz and 10/12-bit RGB to be used. If you don't care as much about the resolution, or the video card can't really handle 4K HDR @ 60 Hz anyway and you're just going to be gaming at a lower resolution, that might be the better option.
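The same back-of-envelope bandwidth idea from the sketch further up applies to these workarounds: lowering the refresh rate or the resolution frees enough HDMI 2.0 bandwidth for 10/12-bit RGB. The ~12% blanking overhead below is an assumption (real CTA/CVT timings vary), so borderline results should be read loosely.

```python
# Same idea as the 4K60 sketch above, applied to the lower-refresh / lower-res
# workarounds. ASSUMPTION: blanking adds roughly 12% on top of the active
# pixels; real timings differ, so borderline modes may go either way.
HDMI20_PAYLOAD_BPS = 14.4e9     # effective HDMI 2.0 payload after 8b/10b
BLANKING_OVERHEAD = 1.12        # rough guess, not an exact timing

modes = [
    ("4K    @  60 Hz", 3840, 2160, 60),
    ("4K    @  50 Hz", 3840, 2160, 50),
    ("4K    @  30 Hz", 3840, 2160, 30),
    ("1080p @ 120 Hz", 1920, 1080, 120),
]

for name, width, height, refresh in modes:
    for bpc in (10, 12):
        # RGB 4:4:4 sends three full-resolution samples per pixel.
        rate = width * height * refresh * BLANKING_OVERHEAD * bpc * 3
        verdict = "fits" if rate <= HDMI20_PAYLOAD_BPS else "exceeds HDMI 2.0"
        print(f"{name} RGB {bpc}-bit: {rate / 1e9:5.1f} Gbit/s -> {verdict}")
```

With these numbers, 4K 60 Hz 10-bit RGB clearly exceeds the link, 4K 50 Hz 10-bit only just squeezes under (hence "on some setups"), and 4K 30 Hz and 1080p 120 Hz leave room to spare even at 12-bit.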

Date Posted: Nov 22, 2019 @ 12:44am
Posts: 9