Turok 2: Seeds of Evil

1337Dude Dec 1, 2017 @ 6:41pm
OpenGL + Vsync + Triple Buffering = V-Sync W/o Input Lag?
Has anyone tried this? In theory, triple buffering in OpenGL uses the additional back buffer to give you a more responsive V-Sync experience. This is different from "triple buffering" in Direct3D (DirectX), which is really a render-ahead queue: it adds an extra frame of input lag as the price for letting framerates below 60 FPS display as something other than a locked 30 FPS. If I recall correctly, triple buffering is fairly memory intensive (roughly 1.5x the VRAM of double buffering for the swap chain).
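
To make the mechanism concrete, here's a rough sketch of the idea. This is not real OpenGL API code (the driver owns the actual swap chain), and the loop, buffer indices, and fake vblank timing are made up purely for illustration; it also shows the arithmetic behind the 1.5x VRAM remark.

[code]
// Illustrative sketch only -- not real OpenGL calls. Shows the
// "keep rendering, flip to the newest finished frame at vblank" idea.
#include <cstdio>

struct Frame { /* one full-screen color buffer */ };

int main() {
    Frame buffers[3];   // front + two back buffers (hence ~1.5x the VRAM of double buffering)
    int front = 0;      // buffer currently being scanned out
    int drawing = 1;    // buffer the GPU is rendering into
    int ready = -1;     // most recently completed frame, if any

    for (int tick = 0; tick < 10; ++tick) {   // pretend main loop
        // Render continuously: finish a frame into 'drawing', mark it ready,
        // then immediately start on the one remaining spare buffer.
        ready = drawing;
        drawing = 3 - front - ready;          // the buffer that is neither front nor ready

        // At vblank: if a newer frame is ready, flip to it. Whole frames only
        // (no tearing), and rendering never had to stall waiting for the flip.
        bool vblank = (tick % 2 == 0);        // fake refresh timing for the sketch
        if (vblank && ready != -1) {
            front = ready;
            ready = -1;
            std::printf("vblank: presenting buffer %d\n", front);
        }
    }

    // Memory math behind the "1.5x VRAM" remark (4K, 32-bit color):
    double mb = 3840.0 * 2160.0 * 4 / (1024 * 1024);
    std::printf("one 4K buffer ~%.1f MB; two ~%.1f MB; three ~%.1f MB\n",
                mb, 2 * mb, 3 * mb);
}
[/code]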

Basically, the right combination of settings should give an experience comparable to G-Sync or FreeSync, but without the hardware requirements. This is ideal when you're gaming on a large screen and have plenty of spare FPS, like playing Turok 2 on a 65" 4K TV.
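
For what it's worth, the only piece of this an application normally controls itself is the vsync swap interval; the OpenGL triple buffering option lives in the driver control panel. Here's a minimal sketch of the vsync side on Windows/WGL (assumes a GL context is already current; the function-pointer typedef is written out locally instead of pulling in wglext.h):

[code]
// Sketch: request vsync (swap interval 1) via the WGL_EXT_swap_control
// extension. Triple buffering itself is toggled in the driver, not here.
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void EnableVsync()
{
    // interval 1 = wait for one vertical blank per buffer swap
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT) {
        wglSwapIntervalEXT(1);
    }
}
[/code]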

Last edited by 1337Dude; Dec 1, 2017 @ 6:41pm
Showing 1-2 of 2 comments
devon_rex_unreal Dec 2, 2017 @ 11:32am 
Interesting.

Gonna try this out when I feel like it and see how it plays out.
1337Dude Dec 2, 2017 @ 6:05pm 
I was testing it, and I don't think there's any way I can really figure it out for sure without a measuring tool. My 65" OLED B6 has 20.5 ms of input lag, so it's pretty responsive even with an extra frame of lag (20.5 ms plus one extra 16.7 ms frame at 60 Hz is still under 40 ms). Having said that, it feels incredibly responsive with triple buffering in OpenGL, and V-Sync tanks my input lag in other games.

I've put a few hours into this game and the screen tearing has been excessive (with the tear occurring in the middle of the screen), so I think I'll keep using V-Sync whether or not the triple buffering works as theorized.

Here's the article[www.anandtech.com] documenting what this thread is discussing. Here are the important bits:

No artificial delay
The software is still drawing the entire time behind the scenes on the two back buffers when triple buffering. This means that when the front buffer swap happens, unlike with double buffering and vsync, we don't have artificial delay. And unlike with double buffering without vsync, once we start sending a fully rendered frame to the monitor, we don't switch to another frame in the middle.

Delay dependent on when tearing specifically occurs, but still not significant compared to regular v-sync
This last point does bring to bear the one issue with triple buffering. A frame that completes just a tiny bit after the refresh, when double buffering without vsync, will tear near the top and the rest of the frame would carry a bit less lag for most of that refresh than triple buffering which would have to finish drawing the frame it had already started. Even in this case, though, at least part of the frame will be the exact same between the double buffered and triple buffered output and the delay won't be significant, nor will it have any carryover impact on future frames like enabling vsync on double buffering does.

DirectX only supports render ahead "triple buffering"
Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default).
The major difference in the technique we've described here is the ability to drop frames when they are outdated. Render ahead forces older frames to be displayed. Queues can help smoothness and stuttering as a few really quick frames followed by a slow frame end up being evened out and spread over more frames. But the price you pay is in lag (the more frames in the queue, the longer it takes to empty the queue and the older the frames are that are displayed).
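
To illustrate that last quoted point, here's a minimal sketch of the knob that controls the DirectX render-ahead queue in a D3D11 app. It assumes you already created a device; error handling is omitted, and the function name is just for illustration.

[code]
// Minimal sketch: lower the DirectX render-ahead queue the article describes.
// The default is 3 queued frames; 1 trades some throughput for less input lag.
#include <d3d11.h>
#include <dxgi.h>

void LimitRenderAhead(ID3D11Device* device)
{
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice))))
    {
        dxgiDevice->SetMaximumFrameLatency(1);
        dxgiDevice->Release();
    }
}
[/code]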


Hope you guys found this educational. Most people don't know what triple buffering actually is; I didn't learn about it until 15 years after first seeing it as an option in my games. That's probably because most games these days are DirectX, and most people weren't that interested in frame sync on PC until FreeSync and G-Sync became a thing.

Date Posted: Dec 1, 2017 @ 6:41pm
Posts: 2