SteamVR

deemon Nov 15, 2015 @ 7:06am
Does Vive have adaptive sync?
Does Vive (or Oculus) have adaptive sync (FreeSync or G-Sync)?
The Maddog Nov 15, 2015 @ 7:23am 
It's not built into the Oculus CV1, that's for sure (more on that in a moment). I'm not even sure it's a good thing for VR. Adaptive sync helps slower computers that can't keep up with the display. That contradicts VR's requirements (at least in the case of the Oculus Rift), which call for fast computers capable of maintaining a high, steady FPS.

Going back to why I said it's not on the CV1: Oculus is going with time warp, which is a far better option right now. This video does a good job of explaining why.

https://www.youtube.com/watch?v=WvtEXMlQQtI


In the case of the Vive, I originally heard Valve and HTC opted out of using time warp (in favour of their own alternative), but since they included time warp in a recent SteamVR patch they may have changed their minds, so I'm inclined to say the Vive won't use adaptive sync either.
Last edited by The Maddog; Nov 15, 2015 @ 7:29am
melethron Nov 15, 2015 @ 7:24am 
No, but for a good reason. You ALWAYS (!!!) need to render at the maximum refresh rate of the panels, which is 90Hz. Syncing the refresh rate to lower framerates would be useless, as you don't want lower framerates or refresh rates in VR. Lower refresh rates would also cause issues with the low-persistence displays: one can even notice flickering at 60Hz low persistence, so refreshing at 90Hz or more is needed.

Although the refresh rate is fixed, at least the Vive uses a globally refreshed display, which means that all pixels light up at the same time (and just for 2ms, for low persistence).

In addition, the HMD can use reprojection (or timewarp) to fill missed frames with a slightly modified version of the last frame, based on the newest sensor readouts.
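
A rough sketch of what that reprojection step does, assuming the simplest possible model (illustrative only, not Oculus's or Valve's actual implementation; the function, the flat-shift approximation and the numbers are all made up):

import numpy as np

def reproject(last_frame, yaw_at_render, yaw_now, pixels_per_degree):
    # Approximate rotational timewarp as a horizontal image shift:
    # turn the yaw change since render time into a pixel offset.
    delta_deg = np.degrees(yaw_now - yaw_at_render)
    shift_px = int(round(delta_deg * pixels_per_degree))
    # np.roll stands in for a proper GPU warp; the wrapped-in edge is
    # exactly where real timewarp shows disocclusion artifacts.
    return np.roll(last_frame, -shift_px, axis=1)

# Head turned 0.5 degrees right since the last frame was rendered.
frame = np.zeros((1080, 1200, 3), dtype=np.uint8)  # one eye's buffer
warped = reproject(frame, 0.0, np.radians(0.5), pixels_per_degree=12.0)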

TL;DR Forget G-Sync for VR. Doesn't make any sense.
melethron Nov 15, 2015 @ 7:29am 
Maddog, if you miss frames I agree that timewarp is the better option. Also, if you can only render at 60fps and want to use a 120Hz panel (like PlaystationVR), reprojection/timewarp is a good option.

But if you don't miss frames AND the whole pipeline stays under 20ms of latency, you don't need timewarp (20ms is roughly the time the neurons in your eye need to gather photons and transmit the signal to your brain, so under 20ms your brain doesn't notice it's being fooled). For this reason Valve/HTC doesn't use it.
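
The budget arithmetic, using the numbers from above (illustrative):

refresh_hz = 90
frame_ms = 1000 / refresh_hz        # ~11.1 ms render slot per frame
budget_ms = 20                      # the claimed perception threshold
headroom_ms = budget_ms - frame_ms  # ~8.9 ms left for tracking,
                                    # transmission and scanout
print(f"{frame_ms:.1f} ms to render, {headroom_ms:.1f} ms of headroom")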
deemon Nov 15, 2015 @ 7:33am 
Even at 90Hz it would still be beneficial if the frame transfer and display on the VR screen started immediately, in sync with the moment the frame is actually ready. It's still a valid way to reduce picture latency and jitter.

I mean, even when the computer manages 90Hz and the screen shows 90Hz, that doesn't mean they are in sync with each other, and adaptive sync helps with that.

And I couldn't care less about G-Sync (because it would increase the price of the headset by 100-150€? no thanks); adaptive sync, however, I would like to have there, as it doesn't increase the unit price.

And looking at this time warp video, it pretty much tries to do the same thing adaptive sync does, but by adjusting (and distorting) the picture in 2D to a better position. I don't think I'd be too far off saying time warp tries to be a software-based adaptive sync, possibly causing artifacts (disocclusion) and motion blur in the process. I'd take hardware-based adaptive sync any day.
Last edited by deemon; Nov 15, 2015 @ 7:59am
melethron Nov 15, 2015 @ 7:49am 
Not true. If the frame is ready earlier, the display would need a higher refresh rate than it actually has (not possible), and if the frame is ready too late and the display waits, you experience judder and flickering (bad VR software).

If you're instead referring to not buffering any pre-rendered frames, so that the latest frame gets to the display, that's obviously true and is used by default. This option has been available in the driver settings for quite some time, but since VR it's called "Virtual Reality pre-rendered frames" in Nvidia's control panel. The only reason 1 wasn't the default before was that a deeper queue trades latency for extra fps.
deemon Nov 15, 2015 @ 7:54am 
1 is the default for VR pre-rendered frames, at least on all my Nvidia systems.
melethron Nov 15, 2015 @ 7:56am 
Originally posted by deemon:
And looking at this time warp video, it pretty much tries to do the same thing adaptive sync does, but by adjusting (and distorting) the picture in 2D to a better position... possibly causing artifacts and motion blur in the process. I'd take adaptive sync any day.

Those are low-persistence displays; there is no motion blur. Low persistence and the small display size (and thus pixel pitch) also make it hard to reach higher refresh rates. Also, the globally refreshed display avoids tearing from head movement (something that isn't even an issue for a regular monitor).

Display tech for VR is a lot different than it is for a regular monitor.

Once it's possible to make panels this size with low persistence and global refresh at rates above 120Hz, AND we have the GPUs to render VR at fps > 120, one could think about adaptive sync. Until then you always want to hit the maximum(!) refresh rate and make sure your render pipeline keeps up with it.
melethron Nov 15, 2015 @ 8:01am 
Originally posted by deemon:
1 was defaulted for VR pre-rendered frames. at least in all my nvidia systems.

Yes. I meant that BEFORE VR this option was also available. It was called just "pre-rendered frames", without the VR, and back then it wasn't set to 1 by default.

That was at the time when fps sold people on GPUs and nobody but pro gamers really cared about latency (pre-rendered frames cause input lag etc.). With VR, latency gets more important, so now it's called "virtual reality pre-rendered frames", set to the value it always should have been, and sold as a new feature. ;-)
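
Back-of-the-envelope for why a deeper queue adds input lag (hypothetical numbers, worst case):

frame_ms = 1000 / 90  # one frame time at 90 Hz
for queue_depth in (1, 2, 3):
    # Each frame the CPU is allowed to run ahead of the GPU can add
    # up to one more frame time between your input and the photons.
    worst_case_lag = queue_depth * frame_ms
    print(f"max pre-rendered frames = {queue_depth}: "
          f"up to ~{worst_case_lag:.1f} ms extra input lag")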
deemon Nov 15, 2015 @ 8:01am 
Originally posted by DFin:
Originally posted by deemon:
And looking at this time warp video, it pretty much tries to do the same thing adaptive sync does, but by adjusting (and distorting) the picture in 2D to a better position... possibly causing artifacts and motion blur in the process. I'd take adaptive sync any day.

Those are low-persistence displays; there is no motion blur. Low persistence and the small display size (and thus pixel pitch) also make it hard to reach higher refresh rates. Also, the globally refreshed display avoids tearing from head movement (something that isn't even an issue for a regular monitor).

Display tech for VR is a lot different than it is for a regular monitor.

Once it's possible to make panels this size with low persistence and global refresh at rates above 120Hz, AND we have the GPUs to render VR at fps > 120, one could think about adaptive sync. Until then you always want to hit the maximum(!) refresh rate and make sure your render pipeline keeps up with it.


I'm not talking about screen motion blur, but about picture motion blur (you know, the effect you turn on or off in your game settings)... it has nothing to do with screens.
Last edited by deemon; Nov 15, 2015 @ 8:01am
deemon Nov 15, 2015 @ 8:05am 
Originally posted by DFin:
Originally posted by deemon:
1 was defaulted for VR pre-rendered frames. at least in all my nvidia systems.

Yes. I meant that BEFORE VR this option was also available. It was called just "pre-rendered frames", without the VR, and back then it wasn't set to 1 by default.

That was at the time when fps sold people on GPUs and nobody but pro gamers really cared about latency (pre-rendered frames cause input lag etc.). With VR, latency gets more important, so now it's called "virtual reality pre-rendered frames", set to the value it always should have been, and sold as a new feature. ;-)

Well, I've had this setting (before VR, the old "maximum pre-rendered frames") at 1 since day one, ever since my first Nvidia GPU where you could alter it (with a 3rd-party tool (RivaTuner?), before Nvidia gave us the Nvidia control panel).
Last edited by deemon; Nov 15, 2015 @ 8:13am
melethron Nov 15, 2015 @ 8:11am 
Originally posted by deemon:
I'm not talking about screen motion blur, but about picture motion blur (you know, the effect you turn on or off in your game settings)... it has nothing to do with screens.

Well, VR without(!) low-persistence displays has issues with motion blur. There is an in-depth article from Michael Abrash, from when he still worked on SteamVR at Valve (now he works at Oculus), about motion blur/smearing: http://blogs.valvesoftware.com/abrash/why-virtual-isnt-real-to-your-brain-judder/
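
A back-of-the-envelope version of the smear argument from that article (all numbers made up for illustration):

def smear_px(head_deg_per_s, persistence_ms, pixels_per_degree):
    # During a head turn, a lit pixel sweeps across the retina for as
    # long as it stays illuminated; that sweep is perceived as smear.
    return head_deg_per_s * (persistence_ms / 1000) * pixels_per_degree

full = smear_px(120, 1000 / 90, 12)  # full persistence at 90 Hz
low = smear_px(120, 2, 12)           # 2 ms low persistence
print(f"full persistence: ~{full:.0f} px smear, low: ~{low:.0f} px")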

Apart from the real motion blur:

1. You don't do motion blur post-processing in VR. It's just bad.
2. Reprojection (aka timewarp) is independent of the rendered frame. The frame is rendered to a render target and reprojection is done afterwards, so it doesn't affect any post-processing effects (Edit: it's actually post-post-processing ;-) ). PlaystationVR even uses an external device (with a GPU) for the reprojection.
Last edited by melethron; Nov 15, 2015 @ 8:13am
deemon Nov 15, 2015 @ 8:31am 
Anyway, I still think VR would benefit greatly from adaptive sync. You don't want screen tearing in VR either, do you? Also, FreeSync has a dynamic refresh rate range of 9–240 Hz ( https://en.wikipedia.org/wiki/FreeSync ), so the Vive's 90Hz is nicely within that range. And if it didn't matter at 90Hz, why do we have 144Hz monitors with FreeSync? Because it does matter.
melethron Nov 15, 2015 @ 9:03am 
Originally posted by deemon:
Anyway, I still think VR would benefit greatly from adaptive sync. You don't want screen tearing in VR either, do you? Also, FreeSync has a dynamic refresh rate range of 9–240 Hz ( https://en.wikipedia.org/wiki/FreeSync ), so the Vive's 90Hz is nicely within that range. And if it didn't matter at 90Hz, why do we have 144Hz monitors with FreeSync? Because it does matter.

You don't just need 9-240Hz FreeSync, but also a display that can handle those refresh rates. It's easier to build high-refresh-rate panels for monitors than the smaller panels for HMDs/smartphones. Apart from that, those panels are OLED and not LCD, because only OLEDs can have the 2ms pixel switch times needed for low persistence.

As I said, it would make sense once we have much higher refresh rates on those devices (maybe adaptive sync between 90-240Hz), but those are the current limits of the tech.

Going down to 60Hz or less is fine on a monitor, but not in VR. On my GearVR the 60Hz flicker constantly annoys me, and I can't use it for as long as I can use my Vive Dev Kit.

Current VR panels are state of the art: their refresh rate is both the maximum that's technically possible and the minimum that's needed for a good experience. There is no room to "adapt" between a 90Hz minimum requirement and a 90Hz maximum capability.
deemon Nov 15, 2015 @ 9:24am 
Originally posted by melethron:
There is no room to "adapt" between a 90Hz minimum requirement and a 90Hz maximum capability.

And this is where you are wrong. It's like saying that if you have a constant 60 or 90Hz, there is never any screen tearing, or stutter when using vsync. Oh wait... if that were the case, why the hell did we need G-Sync and FreeSync in the first place? Duh.
Last edited by deemon; Nov 15, 2015 @ 9:41am
The Maddog Nov 15, 2015 @ 9:24am 
Originally posted by deemon:
On my GearVR the 60Hz flicker constantly annoys me, and I can't use it for as long as I can use my Vive Dev Kit.

What spec computer are you using with the Vive, and how long are you able to use the Vive before needing a break?
Last edited by The Maddog; Nov 15, 2015 @ 9:25am

Date Posted: Nov 15, 2015 @ 7:06am
Posts: 31