Going back to why I said it's not on the CV1: Oculus is going with timewarp, which is a far better option right now. This video does a good job of explaining why.
https://www.youtube.com/watch?v=WvtEXMlQQtI
In the case of the Vive, I originally heard that Valve and HTC opted out of using timewarp (to use their own alternative), but since they included timewarp in a recent SteamVR patch they may have changed their minds, so I'm inclined to say the Vive won't use adaptive sync either.
Although the refresh rate is fixed, at least the Vive uses a global display, which means that all pixels are lit up at the same time (and just for 2 ms, for low persistence).
In addition, the HMD can use reprojection (or timewarp) to fill missed frames with a slightly modified image of the last frame based on the new sensor readouts.
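Here is a minimal sketch (in Python/NumPy) of what purely rotational reprojection does, assuming a simple pinhole camera model; the function names, resolution, and FOV are illustrative assumptions, not anything taken from the Oculus or SteamVR SDKs:

```python
# Minimal sketch of rotational reprojection ("timewarp"): re-aim the last
# rendered frame using the rotation between the pose it was rendered with
# and the newest head pose. Illustrative only, not any SDK's API.
import numpy as np

def intrinsics(width, height, fov_deg):
    """Pinhole intrinsic matrix for a per-eye buffer (assumed values)."""
    f = 0.5 * width / np.tan(np.radians(fov_deg) / 2.0)
    return np.array([[f, 0, width / 2.0],
                     [0, f, height / 2.0],
                     [0, 0, 1.0]])

def yaw_matrix(yaw_rad):
    """Rotation about the vertical axis (head turning left/right)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def reproject(image, K, delta_rotation):
    """Warp the last rendered frame to the newest head orientation.
    Purely rotational: positional change and disocclusion are not handled,
    which is why the result only approximates a freshly rendered frame."""
    h, w = image.shape[:2]
    H = K @ delta_rotation @ np.linalg.inv(K)      # image-space homography
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = np.linalg.inv(H) @ pts                   # backward-map into old frame
    src /= src[2]
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out[ys.ravel()[valid], xs.ravel()[valid]] = image[sy[valid], sx[valid]]
    return out

# The compositor would do something like this right before scan-out
# when a new frame is late (all numbers below are made up):
K = intrinsics(1080, 1200, fov_deg=110)            # Vive-like per-eye buffer
last_frame = np.random.rand(1200, 1080, 3)          # stand-in for the render target
delta = yaw_matrix(np.radians(1.5))                  # head turned 1.5° since render
warped = reproject(last_frame, K, delta)
```

Because this is only a 2D re-aim of an already-rendered image, it can't reveal geometry that was hidden at render time, which is where the disocclusion artifacts mentioned further down come from.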
TL;DR Forget G-Sync for VR. Doesn't make any sense.
But if you don't miss frames AND the whole pipeline can stay under 20 ms of latency, you don't need timewarp (roughly 20 ms is the time the neurons in your eye need to gather photons and transmit the signal to your brain, so under 20 ms your brain doesn't notice being fooled). For this reason Valve/HTC doesn't use it.
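As a rough illustration of that budget, here is a toy motion-to-photon tally; the stage timings are illustrative assumptions for a 90 Hz HMD, not measurements:

```python
# Toy motion-to-photon budget check; the per-stage timings below are
# assumed values for illustration, not measured numbers.
BUDGET_MS = 20.0
stages_ms = {
    "sensor read + fusion":          1.0,
    "CPU sim + draw submission":     4.0,
    "GPU render (one 90 Hz frame)": 11.1,   # 1000 / 90 ≈ 11.1 ms
    "scan-out + persistence":        2.0,
}
total = sum(stages_ms.values())
print(f"motion-to-photon ~= {total:.1f} ms "
      f"({'within' if total <= BUDGET_MS else 'over'} the ~{BUDGET_MS:.0f} ms budget)")
```

If the pipeline stays inside numbers like these every single frame, there is nothing for timewarp to fix; it only matters when a frame is missed.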
I mean, even when the computer manages 90 Hz and the screen shows 90 Hz... that doesn't mean they are in sync with each other, and adaptive screen sync helps with that.
And I couldn't care less about G-Sync (because it would increase the price of the headset by 100-150€? No thanks). Adaptive sync, however, I would like to have there, as it doesn't increase the unit price.
And looking at this timewarp video, it pretty much tries to do the same thing adaptive sync does, but by 2D-adjusting (and distorting) the picture into a better position... I believe I wouldn't be too far off saying timewarp tries to be software-based adaptive sync, possibly causing artifacts (disocclusion) and motion blur in the process. I'd rather take hardware-based adaptive sync any day.
Also, if you're referring to not using any pre-rendered frames, so the latest frame gets to the display without being buffered: that's obviously true and is used by default. This option was available in the driver settings for quite some time, but since VR it's called "Virtual Reality pre-rendered frames" in Nvidia's control panel. The only reason 1 wasn't the default back then was that higher values trade latency for fps.
Those are low-persistence displays. There is no motion blur. Low persistence and the small display size (and thus pixel pitch) also make it hard to get higher refresh rates. Also, the global display avoids tearing from head movement (that's something that's not even an issue for a regular monitor).
Display tech for VR is a lot different than it is for a regular monitor.
Once it's possible to make panels of this size with low persistence and global pixels at refresh rates above 120 Hz, AND we have the GPUs to render VR at fps > 120, one could think about adaptive sync. Until then you always want to hit the maximum(!) refresh rate and make sure your render pipeline keeps up with that refresh rate.
Yes. I meant that BEFORE VR this option was also available. It was called just "pre-rendered frames" without the VR, and back then it wasn't set to 1 by default.
That was at the time when fps sold people on GPUs and all but the pro gamers didn't really care about latency (pre-rendered frames cause input lag etc.). With VR, latency gets more important, so now it's called "virtual reality pre-rendered frames", set to the value it always should have been, and sold as a new feature. ;-)
I'm not talking about screen motion blur, but about picture motion blur (you know, the thing you turn on or off in your game's graphics settings)... it has nothing to do with screens.
Well, I've had this setting (before VR, the old "maximum pre-rendered frames") set to 1 since the day I got my first Nvidia GPU that let you alter it (with a 3rd-party tool (RivaTuner?), before Nvidia gave us the Nvidia Control Panel).
Well, VR without(!) low-persistence displays has issues with motion blur. There is an in-depth article from Michael Abrash, from when he still worked on SteamVR at Valve (now he works at Oculus), about motion blur/smearing: http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/
Apart from the real motion blur:
1. You don't do motion blur post-processing in VR. It's just bad.
2. Reprojection (aka timewarp) is independent of the rendered frame. The frame is rendered to a render target and reprojection is done afterwards, so it doesn't affect any post-processing effects (Edit: it's actually post-post-processing ;-) ); a rough ordering is sketched below. PlayStation VR even uses an external device (with a GPU) for the reprojection.
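A rough sketch of that per-frame ordering, with placeholder stubs standing in for the real passes (none of these function names come from any actual SDK):

```python
# Sketch of the ordering described above: the application renders and
# post-processes into an eye buffer, and only then does the compositor
# reproject that finished buffer with the newest head pose.
# All function bodies are placeholder stubs, not a real SDK's API.
import numpy as np

def render_scene(pose):
    """Stand-in for the geometry pass into a render target."""
    return np.full((1200, 1080, 3), 0.5)           # dummy eye buffer

def apply_post_processing(eye_buffer):
    """Stand-in for tone mapping, AA, etc. — runs before reprojection."""
    return np.clip(eye_buffer * 1.1, 0.0, 1.0)

def reproject(eye_buffer, pose_at_render, latest_pose):
    """Stand-in for the compositor's rotational warp ('post-post-processing')."""
    return eye_buffer                               # identity warp in this stub

# One frame, in order:
pose_at_render = {"yaw": 0.00}
eye_buffer = render_scene(pose_at_render)           # 1) app renders to a target
eye_buffer = apply_post_processing(eye_buffer)       # 2) app post-processing
latest_pose = {"yaw": 0.02}                          # head moved while rendering
displayed = reproject(eye_buffer, pose_at_render, latest_pose)  # 3) compositor warps
```

The point is simply the ordering: whatever post-processing the game does is already baked into the eye buffer before the compositor's reprojection step touches it.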
You don't just need 9-240 Hz FreeSync but also a display that can handle those refresh rates. It's easier to build high-refresh-rate panels for monitors than smaller panels for HMDs/smartphones. Apart from that, those panels are OLED and not LCD, because only OLEDs can have pixel switch times of 2 ms for the needed low persistence.
As I said, it would make sense once we have much higher refresh rates on those devices (maybe adaptive sync between 90-240 Hz), but those are the current limits of the tech.
Going down to 60 Hz or less on a monitor is fine, but it's not in VR. On my GearVR the 60 Hz flicker is constantly annoying me, and I can't use it for as long as I am able to use my Vive Dev Kit.
Current VR panels are state of the art, and their refresh rates are both the maximum that's possible and the minimum that's needed for an amazing experience. There is no room to "adapt" between a 90 Hz minimum requirement and a 90 Hz maximum capability.
And this is where you are wrong. It's like saying if you have a constant 60 or 90 Hz, there is no screen tearing. Ever. Or stutter when using vsync. Oh wait... if that were the case, why the hell did we need G-Sync and FreeSync in the first place? Duh.
What spec computer are you using with the Vive and how long are you able to use the Vive before needing a break?