https://support.microsoft.com/en-us/windows/calibrate-your-hdr-display-using-the-windows-hdr-calibration-app-f30f4809-3369-43e4-9b02-9eabebd23f19
I do not know exactly what a game needs in order to support Win11's AutoHDR, but Skyrim uses 10-bit colour channels internally, and setting the above-mentioned flag bUse64bitsHDRRenderTarget=1 makes the engine use 16 bits per colour channel. I assume games need at least the increased colour depth to be able to make use of AutoHDR. The default light shaders of Skyrim are SDR, but there is a mod that replaces them with true HDR shaders (https://www.nexusmods.com/skyrimspecialedition/mods/76521). And of course there are ENBs, which make further use of the increased colour depth. So HDR support within Skyrim is not as fake as some would have you believe. It is not perfect, but it is Skyrim we are talking about. You can toggle Skyrim's HDR processing on and off with the console command thdr and see its impact directly.
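For reference, the flag goes into the [Display] section of the INI. Take the exact file as my assumption: it is usually quoted for SkyrimPrefs.ini, though some guides put it into Skyrim.ini instead:

    [Display]
    bUse64bitsHDRRenderTarget=1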
By the way, a title that officially claims support for HDR and/or Dolby Vision can be doing just the same as Skyrim, because what happens inside a game engine and what happens inside the PR department are two different worlds.
That setting affects shader accuracy: it lets the shaders work with more precise numbers across an increased color range that HDR would in theory benefit from, but it does not actually enable HDR output for the game. It'll still be outputting an SDR image to your monitor.
You can get a pretty good effect by combining it with HDR rendering solutions like the VanillaHDR mod or ENBs, but their output will still get crushed down to an SDR range before it reaches your monitor. Internal HDR rendering is different from actually sending an HDR signal to your display. The only ways to get proper HDR display output in Skyrim are Windows 11's AutoHDR or fiddling around with the Special K HDR injector.
But all of that is pedantic detail. As long as you love the image and are enjoying the game, it doesn't matter at all how you got there or what counts as 'true HDR'.
It needs Windows (DirectX and the graphics drivers) to pass the HDR information on to your monitor. This is why it needs AutoHDR; otherwise Windows will not pass it on, but will instead flatten the image to SDR.
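If you want to check what Windows is actually sending, the DXGI API can report the active colour space per display. Below is a minimal sketch of my own (nothing from Skyrim or any mod tooling; it assumes Windows 10 1709 or newer for IDXGIOutput6). HDR10 shows up as the ST.2084/PQ colour space:

    // List every attached display and report whether Windows is
    // currently sending it an HDR10 signal.
    // Build (MSVC): cl check_hdr.cpp dxgi.lib
    #include <dxgi1_6.h>
    #include <cstdio>

    int main() {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
            IDXGIOutput* output = nullptr;
            for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
                IDXGIOutput6* output6 = nullptr;
                if (SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6))) {
                    DXGI_OUTPUT_DESC1 desc = {};
                    if (SUCCEEDED(output6->GetDesc1(&desc))) {
                        // HDR10 output uses the ST.2084 (PQ) colour space.
                        bool hdr = desc.ColorSpace ==
                                   DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709_P2020;
                        wprintf(L"%ls: %ls, max %.0f nits\n", desc.DeviceName,
                                hdr ? L"HDR10 active" : L"SDR", desc.MaxLuminance);
                    }
                    output6->Release();
                }
                output->Release();
            }
            adapter->Release();
        }
        factory->Release();
        return 0;
    }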
Even using the wrong port on a graphics card or the wrong cable can cause a loss of the HDR information. My GTX 960, for example, only produces an HDR signal on its one HDMI port, but not on its three DisplayPorts. It also needs the right cables (e.g. HDMI 2.1 or DisplayPort 1.4), because older cables do not have the bandwidth to carry the information.
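A rough back-of-the-envelope calculation shows why (ignoring blanking overhead): 3840 × 2160 pixels × 120 Hz × 30 bits for 10-bit RGB ≈ 29.9 Gbit/s of raw pixel data. HDMI 2.0 delivers only about 14.4 Gbit/s of usable data, while HDMI 2.1 manages roughly 42 Gbit/s and DisplayPort 1.4 about 25.9 Gbit/s (often paired with DSC compression). At 60 Hz the requirement halves to about 15 Gbit/s, which is why 4K60 HDR sometimes still squeezes through older links with chroma subsampling.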
This is not Skyrim's fault. It has HDR support and produces an HDR image; it is the fault of Windows and/or the hardware. Skyrim on consoles has had working HDR output for a while, because there all the components work together as they should.
It is the same age-old problem with games on PC, where resolutions and refresh rates sometimes get ignored, and HDR is certainly not immune to it. Expect people to have plenty of problems with it, and of course some will blame it on the game.