It shouldn't cause a crash, but it can cause output of slightly incorrect colors.
…like those 720p HD Ready TVs from a few years back. You can give them a 1080p image and the TV does some processing before displaying it at 720p.
Anyway, thank you.
It's more of a case of: if you have it, lovely; if you don't, it's not a hugely perceptible difference.
Also, if you have an 8-bit display with dithering, it emulates 10-bit well enough that it's very difficult to spot the difference, even though it is there.
The majority of people won't see an improvement from 8-bit to 10-bit, let alone from the dithered version; heck, a lot of people can't even tell the difference between 6-bit with dithering and 8-bit.
The thing is, more people have some type of color deficiency than you'd think, especially as you grow older. This doesn't mean you're color blind, but you likely perceive certain colors less strongly than those who see color perfectly. I've got crappy eyes when it comes to sight, yet my color perception is way better than my other family members'.
Not all eyes are equal, unfortunately.
My monitor claims to cover 98% of DCI-P3.
10-bit, and yes, a lot of newer games use 10-bit. It's definitely worth running, and if you want the best possible visual experience, 10-bit is part of it, as color banding is far less likely in 10-bit, if it occurs at all. All that said, with 8-bit w/FRC you're good.
https://www.benq.com/en-us/knowledge-center/knowledge/10-bit-vs-8-bit-frc-monitor-color-difference.html
https://steamcommunity.com/sharedfiles/filedetails/?id=3326423987
It will make a difference; whether you can see it is the problem here. guydodge actually put up a link with one of the benefits, but go try it yourself: go to badly lit areas to exaggerate color banding, then compare switching between 8-bit and 10-bit. If you cannot spot it, then you have your answer.
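If you'd rather use a synthetic test than hunt for dark scenes in games, a small script can render a smooth gray ramp next to an artificially coarse one. This is just a minimal sketch, assuming Python with NumPy and Pillow installed; note that a standard PNG is 8-bit, so it compares an 8-bit ramp against a simulated 6-bit one to show what banding looks like, rather than testing 10-bit itself.

```python
# Minimal sketch: render a smooth 8-bit gray ramp and a coarser 6-bit-style ramp
# in one image, so you can see on your own display where banding shows up.
# Assumes NumPy and Pillow are installed; the filename is arbitrary.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 400

# Smooth ramp from black to white across the full width, as floats in 0..1.
ramp = np.linspace(0.0, 1.0, WIDTH)

def quantize(values, bits):
    """Snap the ramp to 2**bits distinct levels (256 for 8-bit, 64 for 6-bit)."""
    levels = (1 << bits) - 1
    return np.round(values * levels) / levels

top = (quantize(ramp, 8) * 255).astype(np.uint8)     # 8-bit: 256 levels
bottom = (quantize(ramp, 6) * 255).astype(np.uint8)  # 6-bit: 64 levels, banding is obvious

img = np.vstack([
    np.tile(top, (HEIGHT, 1)),
    np.tile(bottom, (HEIGHT, 1)),
])
Image.fromarray(img).save("banding_test.png")
print("Saved banding_test.png - view it full screen in a dim room and compare the halves.")
```

The visible steps in the bottom half are the same kind of banding that a higher bit depth (or dithering) smooths out in dark in-game gradients.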
Once you go into true HDR, color banding will be more defined, mostly in the high brightness peaks, so for HDR, 8-bit + dithering, or preferably 10-bit or higher, is better. There are also special cases where a color will look slightly off compared to the color next to it if it is 8-bit; I've already had that happen in Cyberpunk 2077, but it's really nitpicky, as it's just a slight tint difference.
With 8-bit dithering, the only real problem I've seen when you compare it to true 10-bit is that there are still slight banding issues with gray.
So to keep it simple:
SDR: minimum = 6-bit FRC; recommended = 8-bit; worth trying if you can see a difference = 8-bit FRC
HDR: minimum = 8-bit FRC; recommended = 10-bit or higher (the higher the better for HDR)
Let me explain. I will use my monitor as an example.
Your monitor can display 10bit color.
However, the interface between the computer and the monitor can sometimes be restricted to 8 bit. This is a bandwidth issue.
With my monitor, if I select 10bit then the refresh rate is limited to 120Hz or below.
However, 8bit+FRC uses a trick to display 10bit color while maintaining the lower bandwidth, so the refresh rate can be the higher limit of 144Hz instead of 120Hz.
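To put rough numbers on that bandwidth point: the raw pixel data rate scales with resolution, bits per channel, and refresh rate. The sketch below is only a back-of-the-envelope estimate, assuming a 2560x1440 monitor for illustration and ignoring blanking intervals and link encoding overhead, which real cables also have to carry.

```python
# Rough estimate of the raw pixel data rate, to show why 10-bit at a high
# refresh rate can exceed a link/scaler limit that 8-bit still fits under.
# The 2560x1440 resolution is just an assumed example; plug in your own numbers.
def pixel_rate_gbps(width, height, bits_per_channel, refresh_hz):
    bits_per_pixel = bits_per_channel * 3  # one value each for R, G, B
    return width * height * bits_per_pixel * refresh_hz / 1e9

for bpc, hz in [(8, 144), (10, 144), (10, 120)]:
    rate = pixel_rate_gbps(2560, 1440, bpc, hz)
    print(f"{bpc}-bit @ {hz} Hz: ~{rate:.1f} Gbit/s of raw pixel data")
```

At 1440p that works out to roughly 12.7 Gbit/s for 8-bit at 144 Hz versus about 15.9 Gbit/s for 10-bit at 144 Hz, so dropping to 120 Hz (about 13.3 Gbit/s) is one way a monitor keeps 10-bit within its limit.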
What the PC does is send the color information in alternate frames, and the monitor uses that information to work out what the color should be. So it achieves an approximation of 10-bit color while only using 8-bit to do it, by spreading the information over two frames.
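For a concrete picture of that trick, here is a toy illustration of the temporal-dithering idea behind FRC: approximate a 10-bit value by alternating between two neighbouring 8-bit values over a few frames, so that the average the eye integrates lands on the 10-bit target. Real panels use more elaborate spatial and temporal patterns; this sketch only shows the concept, and the frame count of 4 is an arbitrary choice.

```python
# Toy FRC illustration: a 10-bit level is 4x finer than an 8-bit level, so the
# missing precision (a remainder of 0..3) is recreated by showing the next 8-bit
# step up on that many frames out of every four.
def frc_frames(value_10bit, n_frames=4):
    """8-bit values to display on successive frames for one 10-bit target."""
    base, remainder = divmod(value_10bit, 4)
    return [min(base + 1, 255) if i < remainder else base for i in range(n_frames)]

target = 513                       # a 10-bit level with no exact 8-bit equivalent
frames = frc_frames(target)
average_as_10bit = sum(frames) * 4 / len(frames)
print(frames, "-> averages to", average_as_10bit, "vs target", target)
# Output: [129, 128, 128, 128] -> averages to 513.0 vs target 513
```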
The color is accurate enough that most people can't tell the difference.
If you are an artist you might stick with 10-bit, but a gamer would go for 8-bit+FRC.
Now this is what I wanted to know in detail. You guys are awesome, thanks for everything; now I understand more.