The reverse is also true: there are games that have pretty much no lag at all even at 20FPS. Witcher 3 (next-gen update) can be capped to 20FPS in the Nvidia Control Panel and the latency feels exactly the same as at 60. (RTSS doesn't work for this because, for some reason, in this one game its FPS limiter adds much more input lag than Nvidia's.) I can clearly see it's 20FPS, so it looks really choppy, but looking around with the mouse is extremely snappy and responsive.
Yeah, the math works out that way. I think it's easier to detect such a small difference with a mouse (provided the mouse is fast enough, i.e. a 1000Hz polling rate or higher). Mouse look in an FPS or a third-person game feels just that little bit less floaty with x4. I don't think moving the mouse in such a game has anything to do with reaction times; we're not reacting to anything, but we can see and feel the difference.
It's kind of like playing a game (without FG at all) at 120FPS@120Hz (8.33ms frame time) and then going to 240FPS@240Hz (4.17ms frame time). That's also a difference of just 4.17ms, but the difference (at least to me) in how responsive mouse look feels is quite obvious, even if I use motion blur to make both look the same (as a test only; I never use motion blur at high frame rates). And I think most people can feel it as well. Probably.
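The frame-time arithmetic here is just 1000 / FPS; a tiny sketch makes the ~4.17ms gap explicit (plain math, no game- or hardware-specific assumptions):

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

for fps in (120, 240):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms per frame")
# 120 FPS -> 8.33 ms per frame
# 240 FPS -> 4.17 ms per frame

# The gap between the two is the ~4.17 ms being discussed:
print(f"gap: {frame_time_ms(120) - frame_time_ms(240):.2f} ms")
# gap: 4.17 ms
```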
So I'd say it's no wonder that 60x4, even though it's only a 4.17ms improvement, can be felt as better. I don't think it's placebo.
So I did some quick testing in TTP2 at 55 base FPS with Gsync on, and I agree that there is a perceived responsiveness difference between x2 and x4; FLM also consistently measures roughly 75% of the theoretical 4.17ms difference (~25ms vs ~28ms). But when I turn on motion blur to vastly reduce the difference in smoothness between x2 and x4, the perceived difference in responsiveness becomes way smaller, by about 80% if I had to give it a number. So I'm not entirely convinced that latency is the only factor in the perceived-responsiveness equation. Though it's cool to see that the theoretical difference measures just about like you'd expect and is indeed present.
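For what it's worth, the quoted FLM readings can be checked against the theoretical gap like this (the ~25ms/~28ms figures are the ones from the post, not new measurements, and the theoretical gap uses the 60-base 4.17ms figure the thread has been working with):

```python
# Theoretical latency gap between x2 and x4 outputs,
# using the 60-base figure from the thread (120Hz vs 240Hz output)
theoretical_gap_ms = 1000 / 120 - 1000 / 240   # ~4.17 ms
# FLM readings quoted above: ~28 ms at x2 vs ~25 ms at x4
measured_gap_ms = 28 - 25
fraction = measured_gap_ms / theoretical_gap_ms
print(f"measured gap is {fraction:.0%} of theoretical")  # measured gap is 72% of theoretical
```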
*The difference may even be a bit bigger at a fixed refresh rate on most displays, since Gsync eliminates the difference in scanout speed between 60 and 240.
**I should also note that when I set Scanline-sync from 1 to 2060 at 50Hz (4K monitor), which measures a difference of about 9ms, I perceive less of a difference in responsiveness than with LS going from x2 to x4 at the same base FPS.
Motion blur is terrible... it makes aiming extremely hard and gives a lot of people motion sickness. In reality there is no motion blur from moving your head.