Cyberpunk 2077
UncleZedz Dec 9, 2020 @ 11:06pm
HDR10 PQ or HDR10 scRGB?
What's the difference between them? Thanks.
Dionysus 🐭 Dec 9, 2020 @ 11:21pm 
Wondering the same thing. They are obviously different standards, but I'm very interested in what the effective difference is.
korefuji Dec 9, 2020 @ 11:22pm 
Yeah I'd like to know too :D
korefuji Dec 9, 2020 @ 11:27pm 
Perhaps the least well-known HDR format, this is basically HDR10 without metadata. In other words, simply the ST.2084 Perceptual Quantizer (PQ for short) with 10-bit color depth. How many devices support this is unknown, as almost no TV manufacturer mentions it, but in principle every device supporting HDR10 must be able to handle PQ10, since the metadata was optional in the first place.

idk what the second HDR10 mode's difference is, though
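
For the curious, the PQ curve that quote mentions is just a fixed transfer function defined in SMPTE ST 2084. Here's a minimal C++ sketch of the EOTF (encoded signal in, nits out), using the constants published in the standard:

#include <algorithm>
#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) EOTF: maps a [0,1] encoded signal to absolute
// luminance in nits. Constants are straight from the standard.
double pq_eotf(double encoded)
{
    const double m1 = 2610.0 / 16384.0;        // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
    const double c1 = 3424.0 / 4096.0;         // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

    double e = std::pow(encoded, 1.0 / m2);
    double num = std::max(e - c1, 0.0);
    double den = c2 - c3 * e;
    return 10000.0 * std::pow(num / den, 1.0 / m1); // curve peaks at 10,000 nits
}

int main()
{
    // A 10-bit HDR10 signal has 1024 code values; sample a few of them.
    for (int code : { 0, 256, 512, 768, 1023 })
        std::printf("code %4d -> %9.3f nits\n", code, pq_eotf(code / 1023.0));
}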
Big Boss Dec 9, 2020 @ 11:28pm 
The color quality. Try both. PQ seems more pleasing to me, though.
MercyGG Dec 9, 2020 @ 11:31pm 
It's the dynamic color range
korefuji Dec 9, 2020 @ 11:33pm 
PQ is more accurate, it seems, based on a Google search, whereas the other one is accurate to a point but uses enhancements to give a more pleasing look. Not 100% sure on that, though.
Same, I see very little difference. I'm using an LG 38GL950. I do prefer HDR on rather than off.
Kaldaien Dec 10, 2020 @ 8:31am 
Originally posted by korefuji:
Perhaps the least well-known HDR format, this is basically HDR10 without metadata. In other words, simply the ST.2084 Perceptual Quantizer (PQ for short) with 10-bit color depth. How many devices support this is unknown
Allow me to chime in here :)

Probably >= 75% of games in existence render HDR that way.

Most poke a hole through the DWM using NvAPI and use what NVIDIA refers to as HDR10 Passthrough. Metadata's not used when they do that, and metadata is not particularly useful for rendering anyway, since you can get the display's range by reading its EDID.
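
(Side note on how a game gets that range without metadata: on Windows, DXGI hands you the panel's EDID-derived luminance limits directly. A bare-bones sketch, with error handling and COM cleanup omitted for brevity:)

#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Sketch: query the display's EDID-derived luminance range via DXGI.
int main()
{
    IDXGIFactory1* factory = nullptr;
    CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);

    IDXGIOutput* output = nullptr;
    adapter->EnumOutputs(0, &output);

    IDXGIOutput6* output6 = nullptr;
    output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6);

    DXGI_OUTPUT_DESC1 desc = {};
    output6->GetDesc1(&desc);

    std::printf("min %.4f nits, max %.1f nits (full-frame %.1f nits)\n",
                desc.MinLuminance, desc.MaxLuminance,
                desc.MaxFullFrameLuminance);
}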

---

As for the second format, that's 16-bit color.

It's got the potential for higher quality since it leaves the image at > 10-bpc right up until scan-out, when the driver applies the PQ EOTF at whatever depth you happen to be running the DWM at.

scRGB is the only way to make use of signals with more than 10-bpc. Setting the desktop to 12-bpc and expecting HDR to make use of those extra bits is folly unless a game has scRGB support.

Off the top of my head, this is the short list of games that support scRGB:

+ Mass Effect: Andromeda
+ FFXV
+ FarCry 5
+ AC: Valhalla
+ Cyberpunk 2077

===
( + ) Control can also render in scRGB HDR if you use Special K.
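
If you're curious what those games are actually doing differently: at the API level the two modes mostly come down to the swapchain format and color space requested from DXGI. A rough illustrative sketch (not any particular game's code; swapChain is assumed to be an existing IDXGISwapChain4):

#include <dxgi1_6.h>

// Illustrative only: the two HDR paths as they look at the DXGI level.
void SelectHdrPath(IDXGISwapChain4* swapChain, bool useScRGB)
{
    if (useScRGB)
    {
        // scRGB: FP16 surface, linear (G10) gamma, Rec.709 primaries.
        // The OS/driver applies the PQ encode at scan-out.
        swapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_R16G16B16A16_FLOAT, 0);
        swapChain->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);
    }
    else
    {
        // HDR10: 10-bpc surface the game has already encoded with the
        // ST.2084 (PQ) curve in Rec.2020 primaries.
        swapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_R10G10B10A2_UNORM, 0);
        swapChain->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);
    }
    // Real code would call CheckColorSpaceSupport() first and check HRESULTs.
}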
Last edited by Kaldaien; Dec 10, 2020 @ 8:38am
Kaldaien Dec 10, 2020 @ 8:35am 
The whole thing is summed up in fewer words on my own forum ;)

https://discourse.differentk.fyi/t/topic-free-mega-thread-v-1-11-2020/79/3746?u=kaldaien
korefuji Dec 10, 2020 @ 9:03am 
Originally posted by Kaldaien:
Allow me to chime in here :)

Probably >= 75% of games in existence render HDR that way. [...]

Thanks for the detailed explanation. It's very much appreciated.
dberger.online Dec 22, 2020 @ 7:04am 
Originally posted by UncleZedz:
What's the difference between them? Thanks.
I have my desktop set to 4:4:4 12-bit. HDR PQ looks awesome, I would say perfect, unlike Valhalla, which was washed out. If I choose HDR scRGB it is indeed washed out and looks awful. I have an OLED B9. Any thoughts?
Master Zone Dec 22, 2020 @ 7:05am 
HDR10 scRGB has more colours.
I THINK...

HDR10 PQ is for if you want more "Picture Quality" (like a cinematic setting for colorization).

And

HDR scRGB is more for if you want vibrant colors with depth.
Dionysus 🐭 Jan 3, 2021 @ 9:21am 
Originally posted by Munch the Ghoul:
HDR10 PQ is for if you want more "Picture Quality" [...] HDR scRGB is more for if you want vibrant colors with depth.
scRGB is technically more accurate because it works at full color depth (16-bit) and then compresses the output. PQ does the entire process at 10-bit, making it less accurate. This also makes scRGB cost a bit more performance. But in reality, you won't notice a picture quality difference even with a display above 10-bit. You should only use scRGB if you notice banding.
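
To put a rough number on that banding point, here's a sketch that prints the luminance jump between adjacent 10-bit PQ code values at a few brightness levels. Those jumps are the worst-case band a 10-bit surface can show, and PQ was designed to keep them around the threshold of visibility, which is why the difference is usually invisible:

#include <cmath>
#include <cstdio>

// ST 2084 PQ EOTF (encoded [0,1] -> nits); constants from the standard.
static double pq(double e)
{
    const double m1 = 0.1593017578125, m2 = 78.84375;
    const double c1 = 0.8359375, c2 = 18.8515625, c3 = 18.6875;
    double p = std::pow(e, 1.0 / m2);
    return 10000.0 * std::pow(std::fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

int main()
{
    // Size of one 10-bit step, in nits, at a few brightness levels --
    // roughly the quantization error HDR10 can show vs. FP16 scRGB.
    for (int code : { 256, 512, 768, 1000 })
        std::printf("around %8.2f nits, one 10-bit step = %.4f nits\n",
                    pq(code / 1023.0),
                    pq((code + 1) / 1023.0) - pq(code / 1023.0));
}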

Originally posted by dberger.online:
I have my desktop set to 4:4:4 12-bit. [...] If I choose HDR scRGB it is indeed washed out and looks awful. I have an OLED B9. Any thoughts?
What are your TV settings and in-game settings? What's the black level on your TV set to? And are you using a certified HDMI 2.0 cable?