-JLarja / Frozenbyte
It doesn't trigger in this game. Not sure why.
These games sing in HDR with such colourful scenes. I'm not sure why you haven't implemented HDR natively. But if you can find out why Trine 4 triggers Windows 11 Auto HDR and Trine 5 doesn't (and get it working), many people will get to see this game in its best and brightest light.
For now I have had success using Special K to enable HDR, but it's certainly more of a faff than letting Windows just enable it when the game launches.
Also, this Reddit thread may shed some light on why it won't trigger. It seems to be something to do with the "swapchain" or the "DXGI flip model".
https://www.reddit.com/r/Windows11/comments/1123osi/force_autohdr_on_unsupported_game/
https://www.pcgamingwiki.com/wiki/Glossary:High_dynamic_range_(HDR)#Enable_HDR_output_in_unsupported_games
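For anyone wondering what the "DXGI flip model" bit actually means: as far as I understand from those links, AutoHDR only engages when a game presents through a flip-model swap chain rather than the old blit model. A rough D3D11 sketch of the flip-model setup (illustrative only, obviously not Trine's actual code; the factory, device and window are assumed to already exist):

```cpp
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a flip-model swap chain -- the kind AutoHDR can latch onto --
// as opposed to the legacy blit model (DXGI_SWAP_EFFECT_DISCARD).
ComPtr<IDXGISwapChain1> CreateFlipModelSwapChain(IDXGIFactory2* factory,
                                                 ID3D11Device* device,
                                                 HWND hwnd, UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width       = width;
    desc.Height      = height;
    desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc  = { 1, 0 };                        // flip model does not allow MSAA back buffers
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;                               // flip model needs at least two buffers
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;   // this is the "flip model" part

    ComPtr<IDXGISwapChain1> swapChain;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain);
    return swapChain;
}
```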
I'm not sure why it doesn't work in Trine 5. As far as I can tell from the linked pages (and some links from them), it should. I'm not a graphics programmer, and I didn't go searching for Microsoft's articles about Auto HDR, so I could be missing something obvious.
In addition to high-end GPUs and consoles capable of actual HDR rendering, we target low-end GPUs, PS4, Xbox One and Nintendo Switch. Fully supporting HDR would require much more work, as we couldn't support only HDR. If we did (and downscaled on platforms that aren't capable enough), the end result would be worse for a given amount of GPU power. Only people with high-end hardware would benefit.
Proper HDR support would also require new assets made with HDR in mind (we recycle a lot of stuff from game to game), and obviously getting HDR monitors and better GPUs to the artists. The hardware is not a huge expense, but it's needed anyway.
We could obviously fake HDR the same way Auto HDR and Special K do, but it's highly subjective whether the end result is better with or without it. I haven't checked myself, but if it's anything like what TVs do, I'm happy to skip it. Level artists, who spent hours setting up the lighting and mood of the scenes, probably aren't very fond of the idea that after all that work, some automatic system comes along and messes everything up :D
-JLarja / Frozenbyte
Thank you for taking the time to share your rationale. Hopefully the Windows AutoHDR function will work for this game in the future.
Playing Trine 5 in SDR, I do want to say how gorgeous it looks; clear care has gone into the very detailed artwork. I am playing local co-op with my wife and she fell in love with the gardens and roses around the palace. In every new area she remarks on how beautiful it all looks, and it really is!
https://youtu.be/INLr8hCgP20?si=VvECxA2WcW6CBlrC
If the game has native HDR turned on, AutoHDR will turn off. Not sure if Trine 5 is saying it supports native HDR or is configured to run with 10- or 12-bit color.
I run Windows in 12-bit color, Full range (not Limited), with HDR enabled on my LG G2 OLED. None of the Trine games support HDR or work with Special K, although AutoHDR did trigger when opening Trine 4. Using Special K, the engine crashes before the game loads; I think it doesn't like Special K's DLLs being injected.
I think I disagree with the other posters though. AutoHDR never looked good with Trine 4. It was better than SDR (which can look washed out or dim), but there was clipping, especially in the stars behind the window, since they're white and AutoHDR doesn't know how to scale subtle gradients close to white. And AutoHDR makes colors look over-saturated in general.
Special K's HDR is much closer to native. Games look better than with AutoHDR because it hooks into lower-level parts of the DX11 pipeline to keep the full 16-bit color values and then tone maps them down to 10-bit or 12-bit for our displays relative to the peak nits you've specified. That isn't the case for DX12 or Vulkan, but I think its sliders can correct many issues (like clipping) that you'll see with AutoHDR.
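For context on the "full 16-bit color values" part: as far as I understand, what Special K's retrofit boils down to is upgrading the game's swap chain to a 16-bit float scRGB format and telling DXGI about it. A rough sketch of that idea (illustrative only, not Special K's actual code):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Rough idea: promote an existing swap chain to FP16 scRGB so values above 1.0
// survive all the way to the display instead of being quantized to 8 bits.
void PromoteToScRGB(IDXGISwapChain1* swapChain, UINT width, UINT height)
{
    swapChain->ResizeBuffers(2, width, height, DXGI_FORMAT_R16G16B16A16_FLOAT, 0);

    ComPtr<IDXGISwapChain3> sc3;
    if (SUCCEEDED(swapChain->QueryInterface(IID_PPV_ARGS(&sc3))))
    {
        // scRGB: linear gamma (G10), BT.709 primaries, full range.
        // 1.0 corresponds to 80 nits; anything above 1.0 is HDR headroom.
        sc3->SetColorSpace1(DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);
    }
}
```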
I tried running Trine 5 with DXVK (DirectX -> Vulkan), but Trine 5 doesn't load as Vulkan, only DX 11.4.
One suggestion from the Special K folks was to disable Steam Input. That didn't fix it either.
So at this point, I can't find a way of enabling HDR with Trine 5, which leaves the game looking very washed out and dim.
I think there's a bug with Trine 5 in Windows. I keep HDR on all the time in Windows, but after closing Trine 5, even though HDR is still enabled, Windows becomes very dim. I have to disable, then re-enable HDR to get the brightness back to normal. This might be an issue with the latest version of Windows, though, as my TV shows the HDR logo when turning HDR off after closing Trine 5, and then shows it again when re-enabling HDR.
NOTE: If you're using AutoHDR, you should make sure your display is calibrated in Windows using the official Windows HDR Calibration tool that you need to download from the Microsoft Store. Without that, Windows may use 1499 nits as the peak rather than your display's value (mine is 990 nits, but it doesn't report that to Windows).
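On the peak-nits point: a game or tool can read what a display reports to Windows through DXGI, which is exactly the kind of value that is often missing or wrong and that the calibration tool lets you override. A small sketch, assuming you already have the IDXGIOutput for the display (illustrative, not from any shipping tool):

```cpp
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Print the HDR-related values a display advertises to Windows.
void PrintHdrCaps(IDXGIOutput* output)
{
    ComPtr<IDXGIOutput6> output6;
    if (FAILED(output->QueryInterface(IID_PPV_ARGS(&output6))))
        return;

    DXGI_OUTPUT_DESC1 desc = {};
    output6->GetDesc1(&desc);

    // desc.ColorSpace is DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020 while the desktop is in HDR mode.
    std::printf("Reported peak luminance:    %.0f nits\n", desc.MaxLuminance);
    std::printf("Reported full-frame peak:   %.0f nits\n", desc.MaxFullFrameLuminance);
    std::printf("Reported minimum luminance: %.4f nits\n", desc.MinLuminance);
}
```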
I'm a developer as well, and since I was in charge of the HDR implementation in a recent major title, I see a few points you mentioned that aren't entirely correct.
HDR does not require any additional GPU power, so it's actually completely compatible with targeting lower-end hardware. The end result would not be worse for a given amount of GPU power; by that I mean the game would look and run the same as it currently does in SDR. And anyone with an HDR display and a GPU that can connect to such a display would benefit. For example, that is any Nvidia GPU released in the last 10 years.
I was the only one with an HDR monitor in our development team of 40 people, and we shipped with HDR. So while it's beneficial for more people to understand and use HDR during development, it's actually not necessary at all. Surely you already have at least one GPU released within the last 10 years, so the only hardware investment that would be absolutely necessary is a single HDR monitor, although I would recommend two: one for a graphics programmer and one for a technical artist.
It's possible that much more work would be required on your end to fully and properly support HDR, depending on the current state of your engine, but it's not as much as you seem to think, as you wouldn't need any new assets at all, just some graphics programming and testing.
I play Trine through SpecialK and it's definitely worth it. The cosmetic "enhancements" are optional and I do not use them. The main point of it is to switch rendering from 8 bit to 10 or 12 bit, which eliminates banding artifacts that become visible at higher screen brightness. Aesthetically, the game then looks the same as it would in SDR on a very bright SDR screen, minus the banding artifacts.
And just to clarify for anyone who is unaware of how game engines work: internally, game engines already render in HDR, typically 16 bit, which as a final step gets compressed down to 8 bit to make it deliverable to an SDR monitor. This is what makes the increase from 8 to 10 or 12 bit as a hack possible, and why it isn't any more expensive: internally, the game will now be compressing the data from 16 bit down to 10 or 12 bit instead, so less of the data that is already there gets thrown away. This is why SDR-to-HDR conversion of this type is possible and effective for SDR games, and impossible for SDR movies, for example.
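To put rough numbers on that: the only thing this hack changes is how coarsely the engine's internal 16-bit result is quantized at the very end. A toy illustration (not engine code):

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>

// Quantize a linear [0,1] channel value to an n-bit output code.
uint32_t quantize(float value, int bits)
{
    const float maxCode = float((1u << bits) - 1);
    return uint32_t(std::lround(value * maxCode));
}

int main()
{
    // The same internal 16-bit float data ends up in far fewer distinct steps at 8 bit,
    // which is where banding in smooth gradients comes from.
    std::printf("8-bit output:  %u steps\n", 1u << 8);
    std::printf("10-bit output: %u steps (4x finer, from the same internal data)\n", 1u << 10);
    std::printf("value 0.5003 -> 8-bit code %u, 10-bit code %u\n",
                quantize(0.5003f, 8), quantize(0.5003f, 10));
    return 0;
}
```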
The thing that would take the HDR experience 95% of the way there would be the ability to reduce exposure. This would counteract the monitor being very bright: the game would go back to normal brightness, with highlights going beyond that, which is the main point of HDR (much less clipping and compression of highlights).
Implementing HDR properly would be doing this same thing, but not as a hack, which gives you the ability to map colors to a wider color space when HDR is enabled. Colors that would get highlight-compressed or clipped by the SDR tone curve would suddenly be able to "breathe" when compressed by a less aggressive HDR tone curve; they'd look natural and beautiful, without being over-saturated. And if you have a local tone mapping feature in your engine and want to go the extra mile, you can adjust it to be less aggressive in HDR. For our game, it was as simple as making it half as strong.
And that's it really.
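To make the tone-curve point concrete, here is a deliberately simplified sketch of the two mappings described above (made-up curves and numbers, not our shipping tonemapper or anyone else's):

```cpp
#include <algorithm>
#include <cmath>

// Scene luminance is in "paper white" units: 1.0 = diffuse white / a white UI element.

// SDR-style curve: everything bright is squeezed toward 1.0 and effectively clips
// once the display runs out of range.
float TonemapSDR(float sceneLum, float exposureEV)
{
    sceneLum *= std::exp2(exposureEV);                  // the "reduce exposure" knob
    return std::min(sceneLum / (1.0f + sceneLum), 1.0f);
}

// HDR-style curve: same idea, but highlights roll off toward the display's reported peak
// instead of 1.0, so mid-tones stay at normal brightness while highlights get real headroom.
float TonemapHDR(float sceneLum, float paperWhiteNits, float peakNits, float exposureEV)
{
    sceneLum *= std::exp2(exposureEV);
    const float headroom = peakNits / paperWhiteNits;   // e.g. 1000 / 200 = 5x headroom
    const float mapped   = sceneLum / (1.0f + sceneLum / headroom);
    return mapped * paperWhiteNits;                     // output luminance in nits
}
```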
Also, I don't think our graphics pipeline is quite as simple as you describe above. Especially on Switch, various passes are done using different resolutions and bit depths. We don't just render everything in HDR and then resample to SDR. If it were that simple, we could obviously add a checkbox to the options to enable HDR. Switch, by the way, isn't capable of outputting HDR at all (the GPU isn't an issue, just like you said).
Please note that I'm probably the last person anyone should ask about the looks of anything in any game; I'm known for not noticing stuff like missing shadows, after all. I'm just repeating here what I've been told. It's obviously possible to _support_ HDR with very little effort, but how much effort is needed to make the end result better than the highly optimized SDR version we have now is subjective.
[1] I once participated in a project on which the publisher very strongly wished for HDR support. We had two TVs and one monitor. We couldn't get the (Unreal 4 based) game to look good on all of them, and one could argue whether it looked better than the SDR version even on the best of them.
-JLarja / Frozenbyte
In the meantime I spent years learning about HDR, using it in video production and tinkering with it endlessly, and finally ended up handling its implementation in the context of UE5.
The problem with crappy HDR monitors is that they use a cheap backlight dimming implementation, which causes raised shadows and leads to a washed-out look. This creates an opportunity to provide and present HDR calibration to users, so that they can at least somewhat compensate for the raised shadows. But taking advantage of this does require a bit of extra development time.
Targeting the crappy monitors by default is something we tried to do in our previous project as well, and it was a bad idea. Not only is it in principle an unfortunate compromise at best, but it would make the game, or any HDR media, age badly as the quality of monitors keeps increasing over time.
The good news is that multiple monitors or TVs aren't necessary, or even desirable. All that is needed is one truly accurate display that precisely adheres to the standard, a colorimeter to confirm that it does, and the knowledge to correct it further if it doesn't.
The HDR standard is even more clearly and precisely defined than the SDR standard. There is only one correct way to interpret an HDR signal for the purpose of content creation, and it's specified by ITU Report BT.2390 on page 9 (https://www.itu.int/pub/R-REP-BT.2390). Ideally, we would have access to a true HDR reference monitor, but since we can't afford that, we settle for the best consumer display plus a colorimeter. In my case I went with a $3000 Sony A95K QD-OLED and further calibrated it myself to ensure it's spot-on accurate. Once you have a makeshift reference monitor, you strictly use that, and nothing else, short of a $30,000 actual HDR reference monitor somehow descending from the heavens.
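For reference, the transfer curve that BT.2390's guidance is built around is PQ (SMPTE ST 2084), which maps absolute luminance in nits to the signal a correctly behaving HDR display is supposed to reproduce exactly. Written out as code (the constants come from the ST 2084 specification; the function itself is just for illustration):

```cpp
#include <cmath>

// SMPTE ST 2084 (PQ) inverse EOTF: encode an absolute luminance in nits into a [0,1] signal.
float PqEncode(float nits)
{
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;

    const float y  = nits / 10000.0f;   // PQ is defined up to 10,000 nits
    const float yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0f + c3 * yp), m2);
}
```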
I pretty much agree with this (though I don't have fresh experience with what's on the market these days), but we seem to come to a different conclusion. I think it's better to ignore HDR until the teething issues are resolved (that is, until the crappy HDR monitors have reached a level where the crappiness isn't that much of a problem anymore).
If manufacturers adhered to standards, we wouldn't be having this conversation. At this point, it's irrelevant how well or badly the SDR standards are defined, as the market has de facto "how things work" standards that are close enough. Anyway, I don't think the lack of a reference monitor with superb color reproduction is a particular problem for us. The fact that something like that is useful, and that we can't just get by with any sub-1000 € TV, is.
-JLarja / Frozenbyte
My main concern is the errors preventing me from using Special K with Trine 5 (and the other Trine games).
Whether you add native support for HDR is up to you, but I'd at least like a way to hack it in myself.
There are very few of my games that don't work with Special K.