ˢᵈˣ FatCat (Banned) 6 SEP 2024 at 7:24
A question about 8-bit FRC and 10-bit
So I have an 8-bit FRC monitor, yet my AMD driver can be set up to 10-bit output, and Windows detects my display as 10-bit.

So the question is: should I use 8-bit or pick 10-bit?
Can the wrong bit-depth setting cause a crash?
A&A 6 SEP 2024 at 7:44
Use 8-bit.
It shouldn't cause a crash, but it can cause slightly incorrect color output.
The author of this thread has indicated that this post answers the original topic.
DevaVictrix 6 SEP 2024 at 8:20
It probably just means the monitor can accept a 10-bit signal but won't be able to display it as 10-bit. My monitor is the same. No adverse effect from setting it to 10-bit, for me.

…like those 720p HD Ready TVs from a few years back. You can give them a 1080p image and the TV does some processing before displaying it at 720p.
Last edited by DevaVictrix; 6 SEP 2024 at 8:22
ˢᵈˣ FatCat (Banned) 6 SEP 2024 at 8:25
Even at 8-bit the colors are so beautiful; imagine 10-bit.
Anyway, thank you
DevaVictrix 6 SEP 2024 at 8:30
I've never been blessed with viewing a 10-bit monitor, but I'd imagine that unless you have a specific need for 10-bit, the most noticeable difference would be when viewing a gradient image… the sky or something.
Last edited by DevaVictrix; 6 SEP 2024 at 8:30
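For anyone who wants to try the gradient test DevaVictrix describes, here is a minimal sketch in Python (numpy and Pillow are my own choice of tools here, not something from the thread) that renders a smooth grayscale ramp. Viewed full screen, visible steps in the ramp are the 8-bit banding being discussed; a true 10-bit pipeline, or good dithering, makes them much harder to spot.

import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080

# Values from 0.0 to 1.0 across the width, quantized to 8-bit (0..255).
ramp = np.linspace(0.0, 1.0, WIDTH)
row = np.round(ramp * 255).astype(np.uint8)
img = np.tile(row, (HEIGHT, 1))

Image.fromarray(img, mode="L").save("gradient_8bit.png")
print("Saved gradient_8bit.png - view it full screen and look for banding.")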
Midnight Aurais 7 SEP 2024 at 4:58
10-bit is mostly for HDR and color-accurate work. I myself use a TV that runs at true 12-bit, and I can notice a slight, like really slight, difference between 8-bit and 12-bit when watching content and playing games in HDR.

It's lovely if you have it; if you don't, it's not a hugely perceptible difference.

Also, if you have an 8-bit display with dithering, it emulates 10-bit well enough that it's very difficult to spot the difference, even though it is there.

The majority of people won't see an improvement from 8-bit to 10-bit, let alone over the dithered version. Heck, a lot of people can't even tell the difference between 6-bit with dithering and 8-bit.

The thing is, more people have some type of color deficiency than you'd think, especially as you grow older. That doesn't mean you're color blind, but you likely perceive certain colors at less strength than those who see color perfectly. I've got crappy eyes when it comes to sight, but my color perception is way better than my other family members'.

Not all eyes are equal, unfortunately.
Last edited by Midnight Aurais; 7 SEP 2024 at 5:00
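To illustrate the dithering point above, here is a small sketch (again Python with numpy and Pillow, assumed for illustration) that quantizes the same gradient twice: once plainly and once with a little sub-step noise added before rounding. The dithered half trades banding for fine noise that is much harder to see, which is the same basic idea behind FRC, except a real 8-bit+FRC panel spreads that noise over time (alternating frames) rather than over space.

import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 200
BITS = 6  # 6-bit exaggerates the effect so it is visible on an ordinary 8-bit screen

ramp = np.tile(np.linspace(0.0, 1.0, WIDTH), (HEIGHT, 1))
levels = 2 ** BITS - 1

# Plain quantization: distinct bands.
banded = np.round(ramp * levels) / levels

# Dithered quantization: add noise smaller than one quantization step, then round.
noise = (np.random.rand(HEIGHT, WIDTH) - 0.5) / levels
dithered = np.clip(np.round((ramp + noise) * levels) / levels, 0.0, 1.0)

# Top half: banded. Bottom half: dithered. Compare side by side.
out = np.vstack([banded, dithered])
Image.fromarray((out * 255).astype(np.uint8), mode="L").save("dither_demo.png")
print("Saved dither_demo.png - top half is plain quantization, bottom half is dithered.")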
ˢᵈˣ FatCat (Banned) 7 SEP 2024 at 5:13
Originally posted by Midnight Aurais:
So you're saying I just need 8-bit in the settings, and it won't make a difference versus 10-bit because my display almost reaches true 10-bit?
My monitor claims to cover 98% DCI-P3.
Guydodge 7 SEP 2024 at 7:28
8-bit is 16 million colors; 10-bit is 1.07 billion colors, meaning much smoother color. I run 10-bit, and yes, a lot of newer games use 10-bit. It's definitely worth running, and if you want the best possible visual experience, 10-bit is part of it, as color banding is far less likely at 10-bit, if it appears at all. All that said, with 8-bit w/FRC you're good.

https://www.benq.com/en-us/knowledge-center/knowledge/10-bit-vs-8-bit-frc-monitor-color-difference.html
https://steamcommunity.com/sharedfiles/filedetails/?id=3326423987
Last edited by Guydodge; 7 SEP 2024 at 7:42
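The color counts Guydodge quotes fall directly out of the bit depth: each of the three channels (R, G, B) gets 2^bits levels, so the total palette is (2^bits)^3. A quick Python sketch, just to show the arithmetic:

# Each channel has 2**bits levels; total colors = (2**bits) ** 3.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    colors = levels ** 3
    print(f"{bits}-bit per channel: {levels} levels/channel, {colors:,} colors")

# 8-bit  -> 256 levels,  16,777,216 colors    (~16.7 million)
# 10-bit -> 1024 levels, 1,073,741,824 colors (~1.07 billion)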
Crashed 7 SEP 2024 at 7:31
I did a quick lookup; it appears your panel supports 8-bit natively, but if you give it a 10-bit signal it will dither it down to 8-bit. So not only is there no disadvantage to setting it to 10-bit, but if you have 10-bit content it can smooth out banding artifacts.
Midnight Aurais 7 SEP 2024 at 10:40
Originally posted by FatCat:
So you're saying I just need 8-bit in the settings, and it won't make a difference versus 10-bit because my display almost reaches true 10-bit?
My monitor claims to cover 98% DCI-P3.

It will make a difference if you can see it; that is the problem here. Guydodge actually put up a link with one of the benefits, but go try it yourself: go to badly lit areas to exaggerate color banding, then compare switching between 8-bit and 10-bit. If you cannot spot it, then you have your answer.

Once you go into true HDR, color banding will be more defined, mostly in the high brightness peaks, so for HDR, 8-bit + dithering, or preferably 10-bit or higher, is better. There are also special cases where a color will look slightly off compared to the color next to it at 8-bit; I already had that happen in Cyberpunk 2077, but it is really nitpicky, as it is just a slight tint difference.

With 8-bit dithering, the only real problem I have seen, compared to true 10-bit, is that there are still slight banding problems with gray.

So to keep it simple:
SDR - minimum = 6-bit FRC; recommended = 8-bit; to try if you see a difference = 8-bit FRC
HDR - minimum = 8-bit FRC; recommended = 10-bit or higher; the higher the better for HDR
Pocahawtness 7 SEP 2024 at 10:58
Originally posted by FatCat:

Let me explain. I will use my monitor as an example.

Your monitor can display 10-bit color.

However, the interface between the computer and the monitor can sometimes be restricted to 8-bit. This is a bandwidth issue.

With my monitor, if I select 10-bit then the refresh rate is limited to 120Hz or below.

However, 8-bit+FRC uses a trick to display 10-bit color while maintaining the lower bandwidth, so the refresh rate can be the higher limit of 144Hz instead of 120Hz.

What the PC does is send the color information in alternate frames, and the monitor uses that information to work out what the color should be. So it approximates 10-bit color while only sending 8-bit, by spreading the information over two frames.

The color is accurate enough that most people can't tell the difference.

If you are an artist you might stick with 10-bit, but a gamer would go for 8-bit+FRC.
Last edited by Pocahawtness; 7 SEP 2024 at 10:59
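A rough back-of-the-envelope sketch of the bandwidth point Pocahawtness makes. The resolution (2560x1440), the ~20% blanking overhead, and the link budget (DisplayPort HBR2: 4 lanes x 5.4 Gbit/s, about 17.28 Gbit/s of usable data after 8b/10b encoding) are assumptions for illustration only, not details of anyone's actual monitor or cable in this thread:

# Rough uncompressed video data rate vs. an assumed link budget.
def required_gbps(width, height, refresh_hz, bits_per_channel, overhead=1.2):
    bits_per_pixel = bits_per_channel * 3            # R, G, B channels
    raw = width * height * refresh_hz * bits_per_pixel
    return raw * overhead / 1e9                      # ~20% assumed blanking overhead

LINK_GBPS = 17.28  # assumed usable DisplayPort HBR2 data rate

for bpc, hz in [(8, 144), (10, 144), (10, 120)]:
    need = required_gbps(2560, 1440, hz, bpc)
    verdict = "fits" if need <= LINK_GBPS else "exceeds the link"
    print(f"{bpc}-bit @ {hz} Hz needs ~{need:.1f} Gbit/s -> {verdict}")

With those assumed numbers, 10-bit at 144 Hz overshoots the link while 8-bit at 144 Hz and 10-bit at 120 Hz both fit, which matches the pattern described above.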
ˢᵈˣ FatCat (Banned) 7 SEP 2024 at 18:50
Originally posted by Midnight Aurais:

Originally posted by Pocahawtness:
Now this is what I wanted to know in detail. You guys are awesome, thanks for everything, now I understand more.

Posted: 6 SEP 2024 at 7:24
Posts: 11