HDMI 1.4 would only run it at 30Hz.
But you should opt for DisplayPort instead; it handles 12-bit color depth just fine, and can even go up to 16-bit.
He means HDMI 2.1 vs DP 1.4
DP is far enough ahead of HDMI that you don't need, say, DP 2.0 to do 4K at high refresh rates. With HDMI 2.0a, I think it is, you can push up to 4K @ 120Hz with HDR disabled. Otherwise the display, GPU, and cable all need to be HDMI 2.1.
I don't really see anyone running 4K above 120Hz much yet.
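To sanity-check those bandwidth claims, here's a rough back-of-the-envelope sketch in Python (my own, not from any spec document). The link capacities are approximate effective payload rates after encoding overhead, and blanking intervals are ignored, so real requirements run a few percent higher than shown.

```python
# Compare uncompressed RGB data rates against approximate link payload rates.
# Blanking-interval overhead is NOT modeled, so real modes need slightly more.

# Effective payload rates in Gbit/s after encoding overhead (approximate):
LINKS = {
    "HDMI 1.4": 8.16,        # 10.2 Gbps raw, 8b/10b encoding
    "HDMI 2.0": 14.4,        # 18 Gbps raw, 8b/10b encoding
    "HDMI 2.1": 42.67,       # 48 Gbps FRL, 16b/18b encoding
    "DP 1.4 (HBR3)": 25.92,  # 32.4 Gbps raw, 8b/10b encoding
}

def required_gbps(width, height, hz, bits_per_channel, channels=3):
    """Uncompressed RGB pixel data rate in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_channel * channels / 1e9

for bits in (8, 10):
    need = required_gbps(3840, 2160, 120, bits)
    print(f"4K@120Hz {bits}-bit RGB needs ~{need:.1f} Gbps")
    for name, cap in LINKS.items():
        verdict = "fits" if need <= cap else "too slow (needs DSC or 4:2:0)"
        print(f"  {name:>13} ({cap:.2f} Gbps): {verdict}")
```

Run it and you get roughly 23.9 Gbps for 4K@120Hz 8-bit and 29.9 Gbps for 10-bit: DP 1.4 squeezes in the 8-bit mode, HDMI 2.0 falls well short of both, and HDMI 2.1 clears both with room to spare, which lines up with the "HDR disabled" caveat above.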
As for 8-bit vs 10-bit: again, I wouldn't sacrifice refresh rate for it (4K@144Hz > 4K@120Hz), but overall 10-bit is the better color gradation system; see the quick numbers after this post. https://www.arzopa.com/blogs/guide/8bit-vs-10bit-color
12-bit is more for cameras and TVs (e.g., a TV using Dolby Vision, which supports 12-bit).
It's not yet practical for watching movies or shows, or for playing games, on a PC.
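To put those bit depths in concrete numbers, here's a tiny sketch of the shades per channel each one gives you; the jump from 256 to 1024 levels per channel is why 10-bit shows less banding in smooth gradients.

```python
# Levels per color channel and total displayable colors at each bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits   # shades per channel (R, G, or B)
    total = levels ** 3  # total RGB combinations
    print(f"{bits:>2}-bit: {levels:>4} levels/channel, ~{total:.2e} colors")
```

That's 16.7 million colors at 8-bit, 1.07 billion at 10-bit, and about 68.7 billion at 12-bit.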