I know everyone has said this, but not long ago I got my first-ever HDR monitor:
a 34GN950-G, 3440x1440, 180Hz, G-Sync Ultimate, HDR600, and I love it in games like BF2042. It looks way better than without.
It's the same analogy as people who go from 4K or 144Hz back to 1080p and 60Hz.
Once you use a true HDR display like HDR10/HDR10+ or HDR1000 (1000-4000 nits), you'll see the huge difference... it's literally insane.
Not only does the "black" become greyish (especially on IPS panels) with HDR400/600, the contrast will also decrease.
In other words, it's "fake HDR" within the SDR range. The loss on the black/dark level can't be compensated, since HDR400/[600] does not call for local dimming.
The 34GN950 is a decent piece of hardware, no doubt about it, and because of its 450cd/m² HDR won't look as awful/washed out as on 300-400cd/m² displays, but it's still not the real deal.
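Rough numbers make the black-level point clearer. Without local dimming, raising the backlight for HDR raises the black level along with it, so the static contrast ratio barely moves. This is only a sketch with hypothetical luminance values, not measurements from any specific monitor:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio = peak white luminance / black level."""
    return peak_nits / black_nits

# Hypothetical numbers for illustration only:
sdr_ips  = contrast_ratio(peak_nits=300.0,  black_nits=0.3)   # ~1000:1, typical IPS
hdr600   = contrast_ratio(peak_nits=600.0,  black_nits=0.6)   # still ~1000:1 if the
                                                              # backlight rises globally
hdr_fald = contrast_ratio(peak_nits=1000.0, black_nits=0.01)  # ~100000:1 once local
                                                              # dimming can drop zones

print(f"SDR IPS:             {sdr_ips:,.0f}:1")
print(f"HDR600, no dimming:  {hdr600:,.0f}:1")
print(f"HDR1000 + dimming:   {hdr_fald:,.0f}:1")
```

The brighter highlights just drag the blacks up with them, which is exactly the greyish look described above.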
I own the same monitor, and as soon as I tried out the HDR option in RDR2 or Flight Simulator, I instantly turned it off.
I also used a colorimeter (SpyderX) for calibration, and it didn't even work properly in HDR600, because the colors are not accurate.
I don't know which gain/offset settings you're currently using on your 34GN950, but I can't confirm that HDR400/[600] looks better on this monitor. I've also never heard anyone claim this, but to each his own, I guess?! :)
edit : [ ]
Yesterday I bought a colorimeter (Spyder X) to calibrate my new monitor when it arrives. I just used it to calibrate my old monitor (I paid a guy to do this for me a long time ago) and, WOW, what a difference. My white point was completely off...
On a monitor I hate oversaturated colors, but when used like a shader, HDR adds a sense of realism.
I absolutely use it 100% of the time if available. Don't write it off; play around with it. Used as plain
HDR it's not the greatest, but it's not bad either, and it can look pretty damn good if you like HDR.
Colors being washed out happens because you're trying to use it on non-HDR content.
Just because your monitor has HDR doesn't mean everything on your monitor will be in HDR;
that's not how it works. That includes your desktop and 95% of all videos. Also, some games you
have to manually turn on in Windows and some you don't. There's a very small learning curve to it.
Instead of using HDR400 as a "shader filter" while decreasing your color saturation/range,
just increase gamma and you'll get the same effect.
As already mentioned, this isn't HDR as it's meant to be and can't be called that.
You can get the exact same effect if you set gamma to max and reduce contrast. Just try it out. HDR400/600 on SDR-range monitors does the exact same thing, and that's why everybody recommends NOT using it.
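For anyone curious what that gamma-plus-contrast combination actually does to pixel values, here is a minimal sketch. It assumes a plain power-law gamma (using the convention where a higher value lifts shadows) and a linear contrast scale around mid-gray; the specific numbers are made up for illustration, not taken from any monitor's OSD:

```python
import numpy as np

def fake_hdr_look(pixel: np.ndarray, gamma: float = 2.8, contrast: float = 0.8) -> np.ndarray:
    """Approximate the 'HDR400 as a shader' look on SDR values in [0, 1].

    1. A high gamma lifts the shadows (hence the greyish blacks).
    2. Reduced contrast compresses everything toward mid-gray.
    """
    lifted = pixel ** (1.0 / gamma)                        # shadows brighten
    return np.clip(0.5 + contrast * (lifted - 0.5), 0.0, 1.0)

# A near-black pixel is lifted well above black, while highlights get compressed:
print(fake_hdr_look(np.array([0.02, 0.5, 0.95])))  # ~[0.30, 0.73, 0.89]
```

The 0.02 input ending up around 0.30 is the "greyish black" from earlier in the thread; no extra dynamic range is created, the existing SDR range is just remapped.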
Thank you for explaining how HDR works, my friend. I would never have figured this out.
So Cyberpunk, RDR2, FlightSim20, etc. do not support HDR?
I guess at this point there isn't much sense in diving deeper into this discussion.
Just enjoy your HDR400. Whatever works best for you. ^^
Nothing wrong with liking it, but keep in mind that you're decreasing the color range on your monitor.
https://www.youtube.com/watch?v=0srgPxK-WhQ
----------------------------------------------------
edit :
Absolutely, it makes a huge difference. It's funny how we used our monitors as kids, never bothered about such things, and everything looked amazing.
But as you grow older, put value on image quality, and use your monitor for work-related projects,
color accuracy becomes more and more essential, and you start to realize that you've been using the wrong color settings for over a decade...