Oh my... you and another one ☝️🤣 ...
You forgot to mention QD stands for Quantum Dot LoL..
And n-O = NO: 8-bit through 12-bit all use chroma subsampling when output as YCbCr at those bpc settings.
The higher the bit depth, the higher the color space, NO?
Also, YCbCr 4:4:4 is the best choice for HDR / video content.
Ta Ta
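For the curious, the bandwidth trade behind chroma subsampling is easy to sketch. This is illustrative math only (an assumed 4K frame, raw pixel bits, no blanking or link overhead), not any particular display's behavior:

```python
# Rough per-frame size math for YCbCr chroma subsampling.
# Assumes a 3840x2160 frame; figures are raw pixel data only.

def bits_per_frame(width, height, bpc, scheme):
    """Raw bits for one frame under a YCbCr subsampling scheme."""
    # Y (luma) is always full resolution; the two chroma planes
    # (Cb, Cr) are reduced by the subsampling scheme.
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    samples_per_pixel = 1.0 + 2 * chroma_fraction
    return int(width * height * bpc * samples_per_pixel)

full = bits_per_frame(3840, 2160, 10, "4:4:4")  # 30 bits/pixel
sub = bits_per_frame(3840, 2160, 10, "4:2:0")   # 15 bits/pixel
print(full / sub)  # -> 2.0: 4:2:0 halves the raw bandwidth
```

So 4:2:0 halves raw bandwidth versus 4:4:4, which is why links that can't carry full 4:4:4 at a given depth and refresh fall back to subsampled chroma.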
You need at least HDR600 with FALD. And run Windows HDR calibration.
Where am I wrong exactly? Educate me, please. Don't tell me you think QHD = QD?
I'm not even talking about WOLED, which is again inferior to QD-OLED. English is definitely not your first language, because your rhetoric doesn't make any sense and you clearly weren't able to comprehend what I said, or you have a fundamentally wrong understanding of monitor panels.
Seriously, y'all, educate yourselves before spouting nonsense: QD = Quantum Dot, QHD = Quad High Definition.
Anything over 10-bit color depth is for content creators, not consumers. We don't have a consumer display that can show 12-bit; only mastering displays used for calibration go above 10-bit.
If your NVIDIA control panel or display settings show a bit depth of 12-bit, that doesn't mean your display can actually show 12 bits.
again, educate yourselves.
Well props for you for admitting your mistake. You have my respect for that.
You forgot to mention IPS, in-plane switching .. 😏
ermm. yes and no..
Usually, GPUs read the monitor correctly from the signal they receive from it. The display actually sends a signal so the GPU can recognize compatibility for certain settings, no?
Unless the manufacturer of any device (not just displays) → specifically ← states otherwise, what you read in Windows is, in fact, right on the spot, imo.
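That signal is the display's EDID block. Purely as a sketch of the idea (the sample bytes below are hand-built from the public EDID layout, not read from any real monitor), here's how the three-letter manufacturer ID is packed into it:

```python
# Minimal EDID sniff: check the fixed 8-byte header and decode the
# 3-letter PNP manufacturer ID packed into bytes 8-9 (big-endian,
# three 5-bit fields, value 1 = 'A').

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    if edid[:8] != EDID_HEADER:
        raise ValueError("not an EDID block")
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") - 1 + v) for v in letters)

# Hand-built sample: header followed by 0x10AC, which unpacks to "DEL".
sample = EDID_HEADER + bytes([0x10, 0xAC])
print(manufacturer_id(sample))  # -> DEL
```

A real EDID also carries supported timings, color characteristics, and (via extension blocks) deep-color support, which is exactly what the GPU driver reads to decide what to offer.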
Please just educate yourself instead of trying to argue with people over something that you have no knowledge of.
https://www.reddit.com/r/OLED_Gaming/comments/16u6ogn/if_the_nvidia_control_panel_gives_me_a_12bit/
https://forums.guru3d.com/threads/what-about-12-bit-deep-color-in-pc-games.400372/page-3
Please .. READ MOMMY'S LIP ... stick ... 🫦
So, mommy is wrong when I say the higher the bit depth, the higher the color space 🤔..
i.e.: 8-bit ≈ 16.7 million colors
10-bit ≈ 1.07 billion colors
12-bit ≈ a whopping 68.7 billion colors 🤤
EDIT: By the by.. they are expensive, but 12-bit monitors actually do exist.. FYI, Dolby Vision is mastered in 12-bit depth ..
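Those counts come straight from raising the per-channel values to the third power, since an RGB pixel has three channels; a quick sanity check:

```python
# Number of representable colors for an RGB pixel at a given
# per-channel bit depth: (2**bits) values per channel, cubed.

def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

for b in (8, 10, 12):
    print(f"{b}-bit: {color_count(b):,} colors")
# 8-bit: 16,777,216 colors       (~16.7 million)
# 10-bit: 1,073,741,824 colors   (~1.07 billion)
# 12-bit: 68,719,476,736 colors  (~68.7 billion)
```

Note this is the number of distinct code values, not the size of the color gamut; bit depth and color space are separate axes.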
You're committing an ignoratio elenchi. You need to take the special school bus back to school, kiddo.
Just LMAO at you breaking down 8- to 12-bit colors, like what is that supposed to prove? Were we debating how many colors each bit depth can show? Very bad at misdirection; you're also super cringe, no offense.
And you'll need to be taught some lessons in humility when you address MOTHER ..
You read mommy.txt ( •̀ ω •́ ) ?
This is your only and FINAL warning..
NOw then... If a game properly supports Dolby Vision .. yessszz.. "DV".. And you have a good display with HDR of 1000 nits or close to that.. like my display.. it's like playing in a whole other netherworld 😈..
K, you got issues; nothing you say backs up your previous statements, just simply nonsense. Honestly wondering what you're on about. I hope you get better.
There you go brother from another mother Lewis 👏🤪