If you've seen our UFO, you know us -- www.testufo.com
I agree -- it's actually fairly simple to support FreeSync/G-SYNC
For software developers, it's simply a matter of properly supporting VSYNC OFF mode. When the drivers enable VRR, it just automagically works.
SOFTWARE DEVELOPER INSTRUCTIONS:
Just make sure Direct3D Present() timestamps are time-relative to game times.
For FreeSync & G-SYNC, the monitor refreshes the instant the frame is presented by the game engine, so the frame should be time-accurate to the frame presentation time -- while not worrying about exact intervals between frame presentation times.
This erases framedrop stutters, creating seamless stutterless frame rate changes, as seen at https://www.testufo.com/vrr (software-based simulation of G-SYNC and FreeSync).
Some more detailed know-how:
https://blurbusters.com/gsync/how-does-gsync-fix-stutters/
https://forums.blurbusters.com/viewtopic.php?f=2&t=6273
https://forums.blurbusters.com/viewtopic.php?f=22&t=4710
But in reality, it's really easy. Just make sure frame presentation timing stays in parallel with game time, preferably to within less than a millisecond (best-effort). Most game engines that support reliable "VSYNC OFF" operation (done the proper way) already work correctly when the drivers enable VRR.
With VRR, the frame rate is the refresh rate, and the refresh rate is the frame rate -- as long as the frame intervals are within the refresh rate range of a variable refresh rate monitor. The hardware monitor is syncing to the software, instead of to a fixed refresh clock.
Basically, allow the gametime to float (without fixed intervals between gametimes), and present the frames immediately on the spot. So if 12ms elapses between gametimes, 12ms should elapse between Present() or glxxSwapBuffers() timings (Direct3D or OpenGL). And if next frame interval is now 17ms, then Present() is 17ms later. Your Present() is controlling the timing of the monitor's refresh cycles in realtime, as long as the interval is within the monitor's VRR range. (But don't worry about that detail, the drivers will handle it automatically).
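A minimal sketch of this invariant, in plain Python rather than engine code (the function name and microsecond units are my own illustration): the per-frame divergence between Present() intervals and gametime intervals should stay at or near zero.

```python
def present_divergence(gametimes_us, present_times_us):
    """Per-frame difference between Present() intervals and gametime intervals.

    For VRR, the interval between two Present() calls should match the
    interval between the two corresponding gametimes; a nonzero divergence
    is stutter that the variable refresh hardware cannot absorb.
    """
    gt_deltas = [b - a for a, b in zip(gametimes_us, gametimes_us[1:])]
    pt_deltas = [b - a for a, b in zip(present_times_us, present_times_us[1:])]
    return [pt - gt for gt, pt in zip(gt_deltas, pt_deltas)]

# 12 ms then 17 ms between gametimes; the Present() intervals track them
# exactly, so the divergence is zero -- VRR shows this as perfectly smooth.
print(present_divergence([0, 12000, 29000], [1000, 13000, 30000]))  # -> [0, 0]

# Same gametimes, but the third Present() is 5 ms late (e.g. a disk hitch):
print(present_divergence([0, 12000, 29000], [1000, 13000, 35000]))  # -> [0, 5000]
```

Note the intervals themselves are allowed to fluctuate (12ms, then 17ms) -- only the mismatch between the two streams matters.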
This is already a developer's "VSYNC OFF best practice" anyway, and if you've followed that already, then G-SYNC and FreeSync modes will automatically work very well.
If you're using an off-the-shelf engine, most of them now already support VSYNC OFF which is necessary for proper G-SYNC / FreeSync operation. Unity, Unreal, etc.
When the monitor/drivers have that enabled, the tearing disappears, and transfers over to a "perma-perfect-smooth" look despite fluctuating frame rate, just like how https://www.testufo.com/vrr can modulate framerates randomly without stutters -- that's the magic of variable refresh rate. 47fps VRR looks like perfect 47fps 47Hz VSYNC ON. And when the frame rate changes, like 53fps, then it looks identical to 53fps 53Hz VSYNC ON. And the "stutter-seams" between framerate-changes are erased by VRR.
Thus, each separate frame can have its own unique refresh rate -- equal to the time interval between the current Present() and the previous Present() -- and there can be hundreds of random intervals without stutter (except during disk-freezes / system-freezes). Sometimes slight manual VRR optimization can be needed for extremely large rendertime changes between adjacent frames; some of that can be fixed by strategically delaying Present() to re-sync time-relativity to gametimes, if the next rendertime went much faster than expected and the gametime assumed a longer rendertime. But it looks like for this engine, it's simply a minor "VSYNC OFF support" modification without need for any other optimizations (initially).
In Visual Studio, I have seen some engines suddenly go smooth with only about 15-20 lines of code change (excluding the game menu changes needed to add a "VSYNC ON/OFF" option)
Also: PLEASE UNCAP THE 120fps LIMIT
Also, please remove the 120fps cap. My high-end NVIDIA system can often run this game well above 120fps. Doubling frame rates halves motion blur, and I'd like the frame rate to organically "float" from 50fps to 200fps with my 240Hz 1ms IPS monitor "floating its refresh rate" to sync to the frame rate -- a VRR range of 48Hz to 240Hz in realtime (with over 200 seamless refresh rate changes per second). It's much smoother than both ordinary VSYNC ON and ordinary VSYNC OFF.
P.S. 240Hz and 360Hz monitors are not just used by esports, and you should futureproof the cap feature to support 1000fps 1000Hz, see https://www.blurbusters.com/1000hz-journey since ultra-high frame rates are one way an LCD can accurately emulate a CRT tube without needing impulsing/phosphor.
P.S. I'd be happy to test G-SYNC / FreeSync behaviour changes between successive versions of the game.
big thanks!
Hey, thank you very much for your detailed response, much appreciated.
The next update will include an infinite (1000fps) setting to remove the 120fps cap. This cap is only used by Unity if V-Sync is turned off. So this way it should properly support FreeSync/G-SYNC as per your explanations.
Some reasoning behind the FPS limit: it was added to give players the option to reduce the framerate below their monitor's refresh rate, to avoid their GPUs overheating. This happened to some users with very high-end GPUs, which then became unstable. Why exactly, we weren't able to tell, as the game always ran great on our 2070s.
We will continue to improve this. Thanks for your offer to provide some testing.
Yes, I am aware of high framerates overheating some GPUs. That said, it's important to at least give the user the option.
P.S. Caps are still useful for VRR, to prevent framerates from exceeding the VRR range, for various technical reasons explained at Blur Busters. However, let's keep things simple -- it is already known that the Unity framerate cap doesn't always interact well with VRR, and users can use an external frame rate cap (RTSS) too. So keeping things simple:
Now that I'm aware you are using the Unity engine, it can be made G-SYNC/FreeSync compatible by doing the following:
1. Enable a Full Screen Exclusive mode (to bypass Windows compositor)
2. Framerate cap setting (which should now include "Disabled" or "Uncapped" or "Infinite")
3. Some engines work best with a separate VSYNC ON/OFF setting. However, users can force this externally through NVIDIA Control Panel especially if the game is running in full screen exclusive mode.
For users who want to enable VRR: configure full screen exclusive, uncapped, and VSYNC OFF in the game menus. Then in NVIDIA Control Panel, enable G-SYNC. For AMD, enable FreeSync in both Catalyst Control Center and the monitor's menus. Then VRR works fine in Unity-engine games that have been programmed with (1)+(2)+(3).
There is a problem with using borderless full screen mode in Windows, since Windows often tries to fight against VRR. However, if (1) is too much development effort, there is a workaround for G-SYNC users (not FreeSync users): NVIDIA Control Panel -> Set up G-SYNC -> Enable for windowed and full screen mode.
Then you only need to implement (2) and (3) as a software developer, which should be easy-peasy, since they're just Unity API calls pretty much, and Unity has improved their FreeSync/G-SYNC support over the years anyway.
When using Infinite, please use "-1" for the Unity "targetFrameRate" API; do not use 1000. The -1 is the uncapped infinite frame rate setting, and is the one that triggers Unity's G-SYNC/FreeSync behaviors.
The reason you should not use 1000 is that even 1ms sometimes becomes human-visible: at a 4000 pixel/sec mouseturn, a 1ms gametime:photontime error margin translates to a 4-pixel stutterjump, and that becomes human-visible at stratospheric refresh rates when display motion blur is smaller than the stutterjump. It is amazing how milliseconds become human-visible: https://forums.blurbusters.com/viewtopic.php?f=7&t=6808
So never use a 1000 cap -- and if you're using any busywait techniques for framepacing, use QueryPerformanceCounter(), RDTSC, or another microsecond-quality counter, not millisecond-granularity synchronization.
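The busywait framepacing idea above can be sketched in plain Python (my own illustration, not engine code): time.perf_counter() plays the role of QueryPerformanceCounter() as a monotonic sub-microsecond clock, and the last ~2ms before the deadline are spent spinning rather than sleeping, since OS sleeps only have millisecond-class accuracy.

```python
import time

def pace_frame(deadline_s, period_s):
    """Wait until deadline_s using a coarse sleep plus a final busy-spin.

    OS sleeps can overshoot by a millisecond or more, so the last ~2 ms
    are burned in a spin loop on the high-resolution clock instead, for
    microsecond-class frame pacing.
    """
    while True:
        remaining = deadline_s - time.perf_counter()
        if remaining <= 0:
            break
        if remaining > 0.002:
            time.sleep(remaining - 0.002)  # coarse sleep, keep a 2 ms margin
        # otherwise: busy-spin on the clock until the deadline passes
    return deadline_s + period_s  # next frame's deadline

# Pace a few "frames" at a hypothetical 58 fps cap (period ~17.24 ms):
period = 1.0 / 58
deadline = time.perf_counter() + period
for _ in range(3):
    # ... render and Present() the frame here ...
    deadline = pace_frame(deadline, period)
```

One design choice worth copying: the next deadline is scheduled from the previous deadline, not from "now", so small overshoots don't accumulate into drift.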
However, you're probably just letting "Unity do its thing", as an indie developer, so for simplicity, this is your absolute minimum modification.
MINIMUM MODIFICATIONS
1. Add an "Infinite" setting to the frame rate cap. When this is selected, set the Unity API ".targetFrameRate = -1;" combined with "QualitySettings.vSyncCount = 0;"
FULLSCREEN MODE
VRR works better with full screen exclusive. Use the Unity API "ExclusiveFullScreen" setting. You might want to add a third setting, "Windowed / Borderless Fullscreen / Exclusive Fullscreen" to give users the choice.
----- OPTIONAL READING BELOW -----
Everything else below the line is optional, but highly educational to a software developer who wants to optimize better for less stutters & better VRR.
OPTIONAL #1: Match Hz Setting
Ideally, I would like to see a "Match Hz" setting alongside the "Infinite" setting, which is simply Unity's "QualitySettings.vSyncCount = 1;" (this will ignore the frame rate cap and use the current monitor's Hz as the cap). This automatically smooths things out to framerate=Hz with a little bit of input lag, but some people love the fluidness for non-VRR, especially with powerful GPUs that keep framerate near Hz.
OPTIONAL #2: Debug your stutters (HINT: It will help non-VRR too!)
1. Add a gametime:photontime divergence calculator. It's mathematically simple.
It's simply comparing the gametimes versus the real system clock as in following calculation:
A. Difference between two gametime timestamps (current frame and previous frame); you can use the Unity API "Time.deltaTime"
B. Difference between two QueryPerformanceCounter() of the device.Present call (current frame and previous frame), basically the moment the frame is presented. Just do a QueryPerformanceCounter() (or read other microsecond counter) IMMEDIATELY before your Unity API "device.Present" call (if that's your workflow).
C. Now, calculate difference between A and B and log this every frame. This is your gametime:photontime divergence for VRR displays (i.e. stutters not fixed by VRR, as well as stutters that become worse than it should be for VSYNC ON).
D. Render the graph every game frame, onto your game screen, as an overlay -- a rolling window of the last 500 previous instances of (C) for the last ~500 frames. As you play the game, the graph (in debug mode) will be a nice stutter-visualization.
E. This gives you realtime visual debugging of stutters. If the graph is smoothly modulating, GOOD JOB! You've done a great job, and VRR will be absolutely beautiful. If the graph has hundreds of peaks and valleys, especially sudden 10ms adjacent-frame frametime changes, then something is wrong and there's optimizing work you need to do -- a simple turn (turning left and right) shouldn't have 10ms-timescale gametime:photontime divergences.
That will be gametime:photontime divergence for VRR -- basically VRR-unfixable stutter, as well as stutter that becomes bigger for ordinary VSYNC ON too. Optimizing this debug graph will reduce stutter for all sync technologies, while allowing VRR to do its job.
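Steps A through E above can be sketched as a tiny rolling logger (plain Python; the class and method names are my own illustration, not a real Unity API):

```python
from collections import deque

class StutterGraph:
    """Rolling gametime:photontime divergence log, sketching steps A-E above.

    Each frame, feed it the gametime delta (Unity's Time.deltaTime) and the
    measured wall-clock delta between the two surrounding Present() calls
    (from QueryPerformanceCounter or any microsecond clock). It keeps the
    last `window` divergences for rendering as an on-screen debug overlay.
    """

    def __init__(self, window=500):
        self.samples = deque(maxlen=window)  # step D: rolling ~500-frame window

    def log_frame(self, gametime_delta_s, present_delta_s):
        divergence = present_delta_s - gametime_delta_s  # step C
        self.samples.append(divergence)
        return divergence

    def worst_ms(self):
        """Largest absolute divergence in the window, in milliseconds."""
        return max((abs(s) for s in self.samples), default=0.0) * 1000.0

# A healthy run: Present() deltas track gametime deltas to within ~0.1 ms.
graph = StutterGraph()
for gt, pt in [(0.010, 0.0101), (0.012, 0.0119), (0.011, 0.0111)]:
    graph.log_frame(gt, pt)
print(f"worst divergence: {graph.worst_ms():.2f} ms")  # well under 1 ms: smooth
```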
This can be simple debug logging of adjacent-frame rendering time volatility (the rendertime differential between two adjacent frames). Just grab QueryPerformanceCounter() timestamps and take the time-difference between them. If it's hugely volatile (rather than a smoothly modulating graph), spread out the processing to keep rendertimes reasonably consistent for same-view rendering (to de-stutter simple turning, etc.) -- which may include multithreaded processing, or distributing pop-in rendering and disk loading in smaller increments or in a separate Unity job system thread.
Normally simple changes are all you need; but if you want to push the limits of VRR quality, understanding this helps: demanding framerate changes can still go stutter-free, with VRR becoming the shock-absorber for stutters. This can fail, however, if gametime-vs-Present()-time randomly diverges because of hugely volatile rendertimes in between.
FUTURE PROOFING NOTE
1ms granularity still has human-visible effects
Now, do not assume 1000fps (1ms) granularity is good enough. We've since determined that a 1ms frametime error can still produce human-visible stutter, since 1ms = 1 pixel of misalignment per 1000 pixels/sec of motion. On an ultrahigh-Hz display at ultra-high resolutions, motion blur becomes so tiny that tiny stutters are no longer hidden by it -- 1ms translates to a 4-pixel stutterjump at 4000 pixels/second. So milliseconds matter here. Always use microsecond timestamps, clocks, timers, etc. throughout your software; don't use legacy millisecond-granularity timers, if you use any. (Fortunately Unity no longer uses millisecond-granularity timers, but don't add your own.)
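The arithmetic behind that claim is one multiplication (helper name is my own, for illustration):

```python
def stutter_jump_px(motion_px_per_sec, timing_error_ms):
    """Pixels of positional misalignment caused by a frametime error.

    A timing error during on-screen motion displaces the image by
    speed * error; once display motion blur is thinner than this jump,
    the stutter becomes human-visible.
    """
    return motion_px_per_sec * timing_error_ms / 1000.0

print(stutter_jump_px(1000, 1.0))  # -> 1.0 pixel per 1000 px/sec, as stated above
print(stutter_jump_px(4000, 1.0))  # -> 4.0 pixels at a 4000 px/sec mouseturn
```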
HOW MUCH RENDERTIME VOLATILITY IS OK?
It's OK to have rendertime modulations like 10ms, 11ms, 10ms, 9ms, 10ms, and smooth turning will usually produce smooth modulations in rendertimes. But peaky adjacent rendertimes like 7ms, 20ms, 27ms, 9ms, 17ms will all have stutters that human-visibly show through VRR, which may indicate the need for a slight scenery-processing rearchitecture to shock-absorb adjacent-frame rendertimes. Such volatility amplifies VSYNC ON stutter and VSYNC OFF stutter, and sometimes creates stutter unfixable via VRR. (Stutters at startup and when switching between scenery are often unavoidable, though.)
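A quick way to score those two example sequences (a sketch; the function name and the idea of using the worst adjacent jump as the metric are my own simplification):

```python
def worst_adjacent_jump_ms(rendertimes_ms):
    """Largest adjacent-frame rendertime change -- a simple volatility score.

    Smoothly modulating rendertimes produce small adjacent jumps that VRR
    absorbs; peaky sequences inject gametime:photontime divergence that
    shows through VRR as visible stutter.
    """
    return max(abs(b - a) for a, b in zip(rendertimes_ms, rendertimes_ms[1:]))

smooth = [10, 11, 10, 9, 10]   # gentle modulation: VRR stays stutter-free
peaky = [7, 20, 27, 9, 17]     # violent swings: stutter shows through VRR

print(worst_adjacent_jump_ms(smooth))  # -> 1
print(worst_adjacent_jump_ms(peaky))   # -> 18
```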
SMOOTH RENDERTIME VOLATILITY:
So, just make sure your rendertimes modulate reasonably smoothly, so that gametime:photontime stays in relative sync (to sub-millisecond levels, ideally). Sudden adjacent-frame rendertime changes (e.g. scenery pops, disk loads, garbage collects) can inject gametime-versus-Present()-time non-relativities. Most engines will naturally have smooth rendertime increases/decreases as you turn around, from less complex views to highly complex views -- it's when sudden rendertime changes occur (e.g. a 1ms rendertime suddenly becomes a 20ms rendertime, because of a disk-loading stutter or a very large garbage-collect event) that stutters will still show through VRR. But as long as gametime:photontime stays in relative sync for adjacent frames to sub-millisecond changes, stutters are generally human-invisible, below the human detection noisefloor.
NOTE
There may be some other API flags you have to change in order to get things to go suddenly smooth on VRR, but since you are using Unity, it should be quite straightforward to support G-SYNC and FreeSync -- minimal programming modifications.
https://forums.blurbusters.com/viewtopic.php?f=22&t=7158&p=53976
Comments are welcome!
Thanks. The new update just went live. Maybe you can perform the tests you mentioned.
Exclusive fullscreen can be set via a command line option, as explained in the update notes.
https://steamcommunity.com/games/746850/announcements/detail/2541668199210643891
P.S. Does this "-window-mode exclusive" setting work fine for all Unity games?
Not sure if that works with all Unity games, probably depends on the version used.
*** INSTRUCTIONS ***
G-SYNC now works in Cloudpunk! Goodbye stutters...
It only seems to work properly in full screen exclusive mode. The stutters are 90% gone; there are still minor microstutters (and the disk loading stutters), but 90% of stutters feel gone.
1. Configure Steam Launch Options to this screenshot (with command line option)
https://blurbusters.com/wp-content/uploads/2020/06/cloudpunk-vrr-support.png
2. Set VSYNC OFF in game menus
3. Set Framerate cap to 1000
Yep, much smoother on my G-SYNC monitors. It definitely looks like VRR is fixing many of the mouse/keyboard scrolling stutters.
Obviously, configure your drivers accordingly (G-SYNC / FreeSync mode) and monitor (G-SYNC / FreeSync mode), but the game's VRR ready now.
I just A/B tested, and yep -- G-SYNC and FreeSync are definitely working. There's minor ongoing microstutter which can probably be fixed with future optimizing, but the majority of mid-scrolling stutter is gone! So your engine work is sufficient to benefit from VRR stutter-erasure.
EASIEST TEST CASE #1:
Spinning in place in your fast-turning flying Cloudpunk car. Hold down "A" or "D" key to spin in place. Watch for stutters as you spin in place.
EASIEST TEST CASE #2:
It's quite visible in platformer mode (hit the "X" key while outside the car). Run sideways on foot with the "A" and "D" keys, watching foreground objects (like bright lamp poles) scroll sideways in front of the far background. As the screen horizontally scrolls, watch the stutters.
G-SYNC TEST "A"
VSYNC ON, 120fps cap, default borderless windowed
Observation: Visible rough stutter just like previous version
G-SYNC TEST "B"
VSYNC OFF, 1000fps cap, exclusive mode (-window-mode exclusive)
Observation: Much smoother, the erratic stutter is gone
Assessment: SUCCESS
________________________________________
RECOMMENDATIONS FOR FUTURE:
Further improvement for next version (no rush):
(A) Please add the "-window-mode exclusive" behavior to the in-game menus if possible. This will make it easier for future users to successfully enable VRR. Ideally, have three settings for DISPLAY MODE in the game menus:
DISPLAY MODE = "WINDOWED", "FULLSCREEN", "FULLSCREEN EXCLUSIVE"
You can use alternate terminology (some games uses this), such as:
DISPLAY MODE = "WINDOWED", "BORDERLESS", "FULLSCREEN"
There are pros/cons of exclusive versus borderless in many contexts (streaming behaviors, latency behaviors, VRR compatibility, ability to move the mouse to a 2nd monitor, changes to Alt+Tab behavior, etc.), which necessitates providing both options.
(B) Internally in your game engine, please make sure that the 1000 setting represents ".targetFrameRate = -1" (uncapped) rather than ".targetFrameRate = 1000"; this may be able to reduce stutters even further.
See my explanation (previous post) of how 1ms sometimes mathematically becomes human-visible on some ultra-high-resolution displays that have unusually low motion blur, where blurwidth (thickness of motion blur) can be smaller than stutterwidth (stutter jump distance) -- this futureproofs your game against the well-known Blur Busters "Vicious Cycle Effect" (bigger displays, higher resolutions, faster GPUs, less motion blur, faster pixel response, higher frame rates, wider FOVs) that collaboratively amplifies the visibility of formerly-invisible stutters.
I can confirm that the game is much more enjoyable on high-Hz monitors on high-end GPUs now. Really smooth, most stutters are gone except for diskload-type stutters, garbage-collect stutters, and sudden-scenery-complexity change moments -- usually only once every few seconds.
240Hz 1920x1080p G-SYNC Monitor Test:
Oh, and if you decrease detail levels slightly (Medium) on a GTX 1080 Ti Extreme, the game really runs a smooth 240fps on 240Hz G-SYNC monitors now too -- with only 1/4th the motion blur of 60fps at 60Hz. The framerate sometimes dips, but stays at 240fps most of the time. My 240Hz 1ms IPS panels are the next best thing to CRT tubes (in low-motion-blur motion quality).
165Hz 2560x1440p G-SYNC Monitor Test:
The game runs at a full 165fps 1440p smoothly in VRR mode in Medium Detail mode on a GTX 1080 Ti Extreme. Increasing detail levels to Maximum stays smooth, though framerate falls somewhat below 165fps especially in complex scenery on foot with lots of avatars & glowlights -- but remains mostly smooth, with VRR erasing most stutters.
There may be a smidge more optimizing possible, but I'm happy with the change; I'd just highly recommend implementing a Full Screen Exclusive option directly in the game menu if possible -- since that is the mode where VRR performs best.
Remainder of stutters (the erratic stutters that happen once every few seconds): That said, I think you can eliminate even more stutters. The changes removed a lot of "stutter noisefloor", so the remaining stutters should be much easier to debug (if you decide to expend the effort, with diagnostics etc.), since they should much more visibly align with spikes in frametime graphs (you can even use RTSS as a realtime frametime analyzer). From a programmer's point of view, it is amazing how, when the stutter noisefloor lowers further, the other formerly-invisible stutter weaklinks show -- but I'm not complaining at this time. Cheers.
COMMENT ON ULTRA-MINOR STUTTERS DURING WALK MODE:
(that only appeared during this new version with "lower stutter noise floor")
If I switch to 3200dpi mouse setting + 1000 Hz poll, the mouseturns become super silksmooth. Mouseturns equally as smooth as keyboard turns.
I notice that:
- Mouseturn is super silk while in flying car
- However, mouseturn has ultra-minor microstutter when walking on ground in first-person view
But this disappears during:
- Mouseturn while standing on a moving elevator. Mouseturns are super silk
The ultra-tiny microstuttering during walk mode disappears when the elevator starts moving, but reappears when the elevator stops moving. Is this because some kind of game logic is starting/stopping whenever you're standing on a moving elevator?
Test by:
1. Make sure your mouse is high resolution mode (1000Hz poll, 3200dpi)
2. Use VRR or VSYNC OFF
3. Stand on elevator.
4. Make the elevator move up and down.
5. Mouseturn left/right
6. You'll see that mouseturns are smoother whenever the elevator is moving than after the elevator has finished moving
Anyway, impressive how I can now see VRR-bypassing microstutter in a "visual stutter debugging by human eye", since these are the remaining gametime:photontime divergences left over after adding VRR support to your game. (You might wish to do some debugging on this).
Nonetheless, still much smoother; 90% of stutter appears gone now. It's MUCH more enjoyable on a G-SYNC monitor now!
Super smooth during:
- Flying car scenario
- Turning on foot while standing/walking on a moving elevator
Here's an approximate TestUFO stuttersimulation ("Perfect")
https://www.testufo.com/stutter#demo=smooth
But there is some leftover developer-optimizable microstutter during:
- Turning on foot while standing/walking
Here's an approximate TestUFO stuttersimulation ("Microstutter"):
https://www.testufo.com/stutter#demo=microstuttering
This TestUFO animation was JavaScript-programmed (by me -- I'm the creator of TestUFO) to inject intentional ~5ms stutter, which is obviously human-visible if you A/B compare these two TestUFO links back and forth. Although I don't know what the stutter error margin currently is in Cloudpunk, it looks roughly similar to this amount of stutter while running around.
(These animations are approximate comparisons of the stutterfeel during mouseturns in Cloudpunk, to help Ion visualize a stutter-reference of what I am witnessing while running around on foot.)
Since culling happens regardless of whether an elevator is moving or not, this confirms that this specific stutter weak link is not being caused by a culling algorithm; I changed culling to maximum and minimum with little effect. This type of stutter is well above the stutter noisefloor caused by culling algorithms.
Since the 2020-06-27 changes that really smoothed stutter for G-SYNC/FreeSync (in full screen exclusive), it is now much easier to watch the next stutter weaklink, because the stutter noisefloor is lower. Flying around is perfect silk-smooth (with a very occasional stutter once every few seconds), but running around microstutters with tiny stutterjumps about 10 times every second (or thereabouts), creating a 'coarse' motionfeel.
It's MUCH better than it was before (thanks to VRR essentially shock-absorbing many of the stutters), but it looks like a very minor game optimization job (I'm amazed how much stutterfix even a few 1-line changes can do -- edited in the correct places in the code).
So at 165Hz+, there's now a 10x microstutter difference between mouseturning on a moving elevator versus on stationary elevator / stationary floor on foot.
(Relative: Stutters roughness was 100x worse before the recent changes that added G-SYNC/FreeSync compatibility)
Often a stutter is a composite of all the stutter weaklinks merged. We've just erased a few weak links with VRR compatibility, revealing new stutters below the old stutter noisefloor. This one is extremely clear (i.e. a 10x stutter difference), which suggests a new low-hanging-fruit minor-code-change optimization opportunity.
This is even visible on horizontally moving elevators too (Avalon Heights), where culling is already occurring as faraway scenery fades, etc. -- yet it's not stuttering at all (10x less stutter than mouseturns on stationary ground).
So as a software developer experienced in visual stutter-debugging, I now think it's probably not a culling algorithm -- because it is so super-silk smooth while in the flying car or standing/running on a moving elevator.
I am not sure why this happens, but some theories from a software developer point of view:
(A) Mouse/keyboard inputreads are not sub-millisecond time-relative to gametime; or
(B) Some logic is enabled only when walking around on stationary floors; or
(C) Some extra processing is occurring in the main Unity thread that is jittering rendertimes (the time elapsed between gametimes and Present times).
The stutter exists at all detail levels from Simple to Maximum, so it appears to potentially be a very simple fix lurking in a non-obvious place (now that the stutter noisefloor has lowered and the remaining stutterfeel is more easily debuggable).
As in my earlier posts, Milliseconds Matter ( https://forums.blurbusters.com/viewtopic.php?f=7&t=6808 ) and sometimes the changes are easier than expected, especially for off-the-shelf game engines like Unity -- just non-obvious changes (like "oops, I apparently buffered that inputread; I should fix that to do just-in-time inputreads closer to the gametimes or presenttimes") and other stuff like that.
If you can fix this stutter, we'll be writing a Blur Busters article on how to de-stutter and futureproof a Unity game (scale infinitely in the Vicious Cycle Effect) in ways that many developers often overlook -- and Cloudpunk would be a flagship example, since we're now capturing the end-to-end developer cycle of simple low-hanging-fruit stutter optimizations. It could bring you some more sales, since the article on the main site would effectively advertise Cloudpunk :)
So, maybe doublecheck the engine to figure out why it's microstuttering during keyboard/mouse movement while running around on foot, when it doesn't during flying-car or moving-elevator moments.
(Since you're running culling anyway during flying car and during elevator movements, right?)
My "running-around" stutters seemed slightly more visible on my 165Hz monitor than on my 240Hz and 120Hz monitors, which is weird, because the refresh rate shouldn't matter with VRR.
The car & the moving elevators are silk smooth now though.
_________________________
Interesting.... With RTSS, I just discovered a 60-divisibility stutter issue
RTSS 60fps cap
Elevator moving: SMOOTH
Stationary Floor: SMOOTH
RTSS 90fps cap
Elevator moving: SMOOTH
Stationary Floor: COARSE
RTSS 120fps cap
Elevator moving: SMOOTH
Stationary Floor: SMOOTH
RTSS 160fps cap
Elevator moving: SMOOTH
Stationary Floor: COARSE
RTSS Uncapped
Elevator moving: SMOOTH
Stationary Floor: COARSE
_______________________________
Interesting.... beat frequency stutters suggest a divisible-by-60 behavior somewhere.
(All of the below uses uncapped limit in game, since RTSS cap is much more clock-accurate than Unity cap)
Perfect smooth caps:
RTSS 60fps (on any >60Hz VRR)
RTSS 120fps (on any >120Hz VRR)
RTSS 180fps (on any >180Hz VRR)
Stuttery caps near 60:
RTSS 58fps (2 big stutters per second)
RTSS 59fps (1 big stutter per second)
RTSS 60fps perfect silk
RTSS 61fps (1 big stutter per second)
RTSS 62fps (2 big stutters per second)
Stuttery caps near 120:
RTSS 118fps (2 small stutters per second)
RTSS 119fps (1 small stutter per second)
RTSS 120fps perfect silk
RTSS 121fps (1 small stutter per second, for >120Hz displays)
RTSS 122fps (2 small stutters per second, for >120Hz displays)
As I'm familiar with harmonic/beat-frequency stutters (fps-vs-Hz, pollrate-vs-Hz, engine-vs-Hz, engine-vs-fps), there might be logic in your engine clamping or rounding off movements to multiples of 60.
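The beat-frequency pattern in the tests above can be reproduced with a few lines (a sketch of the harmonic math, assuming a 60Hz tick somewhere in the engine; the function name is my own):

```python
def beats_per_second(fps_cap, tick_hz=60):
    """Beat frequency between a framerate cap and a fixed tick rate.

    If some game logic quantizes motion to a 60 Hz tick, a 58 fps cap
    slips against that tick |58 - 60| = 2 times per second, while exact
    multiples of the tick (60, 120, 180) never slip and look smooth.
    """
    nearest_multiple = round(fps_cap / tick_hz) * tick_hz
    return abs(fps_cap - nearest_multiple)

for cap in (58, 59, 60, 61, 62, 119, 120, 121, 180):
    print(f"{cap:>3} fps cap -> {beats_per_second(cap)} stutter beat(s)/sec")
```

The predicted beat counts line up with the stutters-per-second counts observed near the 60fps and 120fps caps.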
Also, the in-game limits (60, 120) produce rougher stutter than the same limits in RTSS.
So RTSS is helping me enjoy the game better. I prefer a smooth 120fps over a microstuttery 165fps.
To debug this, install RTSS (a microsecond-accurate external framerate limiter):
https://www.guru3d.com/files-details/rtss-rivatuner-statistics-server-download.html
Debug at a 60fps framerate, since the stutter seems to reproduce most easily there.
TEST RTSS 60fps
1. Enable G-SYNC on your monitor
2. Launch RTSS and keep RTSS settings open, set it to 60
3. Launch game, and configure ingame: 1000fps cap, VSYNC OFF, fullscreen exclusive
4. Stare at the floor while strafing (A and D keys); it is SMOOTH
TEST RTSS 58fps
5. Alt+Tab to RTSS (screen will black out for 1 second, since it's fullscreen exclusive)
6. Change RTSS cap to 58, then Alt+Tab back to Cloudpunk
7. Stare at floor while strafing (A and D keys), it is STUTTERY (58fps choppy)
8. Now go on elevator or enter car, mouseturn, strafe, etc, it is SMOOTH (58fps has no stutter thanks to VRR)
_________________________
DEVELOPER POINT OF VIEW
This seems like a timing or math issue keying on a "divisible-by-60" quirk. It could be any of these possibilities in your game logic that only occur while moving around on stationary floors (not in the flying car, not on a moving elevator):
- divisible-by-60 mathematics in your engine
- timing input reads on a divisible-by-60 event
- timing Presents() on a divisible-by-60 event
- timing other "60" or "16.7ms" behaviors or divisibles thereof;
- doing inputreads in different events that have different timing logic;
- old well-intentioned 60Hz stutterfixes that now interact badly with VRR
- doing input reads in 60Hz-timer physics-processing loops (which should run independently of inputreads), rather than inputreads synced with frame renders/presentation
- etc.
One example is if different parts of the game logic have different flow (i.e. inputreads just before frame render/present, VERSUS just after frame render/present). You might have a game physics loop that runs at 60 or 120 ticks per second. If this is the case, to fix this one, copy the inputread timing workflow you use for the car and the elevator. Remember, "Milliseconds Matter" (on a timing-relativity basis) here, as mentioned earlier.
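Here's a toy simulation of that failure mode (entirely hypothetical -- these names are illustrative, not Cloudpunk's actual code): camera yaw that only advances inside a fixed 60Hz tick, sampled at the frame rate.

```python
def yaw_per_frame(frame_hz, tick_hz=60, frames=6):
    """Camera yaw (in tick-sized units) when input only advances in a fixed tick.

    Models a hypothetical bug where mouseturn input is applied inside a
    60 Hz FixedUpdate-style loop while frames present at frame_hz: some
    frames repeat the previous yaw while neighbors advance a full tick,
    creating a beat-frequency judder at non-multiples of the tick rate.
    """
    # Integer math avoids float rounding exactly at tick boundaries.
    return [(f * tick_hz) // frame_hz for f in range(frames)]

# 90 fps against a 60 Hz tick: irregular repeats and steps = coarse motionfeel.
print(yaw_per_frame(90))   # -> [0, 0, 1, 2, 2, 3]
# 120 fps (a multiple of 60): a perfectly regular pattern = smooth motionfeel.
print(yaw_per_frame(120))  # -> [0, 0, 1, 1, 2, 2]
```

This matches the observed behavior: caps divisible by 60 look smooth, everything else beats against the tick.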
There are many possible reasons why this happens, most of which require only simple modifications (especially if the rest of the game is already working silk-smooth -- as is the case here).
_________________________
TEMPORARY WORKAROUND FOR 165Hz GSYNC USERS
Temporary workaround for users who want silk-smooth running around: install RTSS and use a cap divisible by 60. For 120Hz monitors, use a 120 cap (RTSS 120, Cloudpunk 1000, VSYNC OFF). For 165Hz monitors, also use a 120 cap (RTSS 120, Cloudpunk 1000, VSYNC OFF). This makes things go silk-smooth.
LONG TERM SOLUTION
Hopefully this microstutter weak link can be traced by the developer; this is fun Stutter Sherlock Holmesing (I hope I am helping narrow the causes down).
I just helped the developer fix even more stutters; the next version will be even smoother (fewer stutters). You're welcome, gamers!
For developers fascinated about this:
1. Do NOT configure Cinemachine to SmartUpdate; it needs to sync to Update instead of FixedUpdate. The camera can't be FixedUpdate or SmartUpdate for VRR.
2. Use a temporary 50fps or 55fps cap for stutter-debugging VRR. Any framerate not divisible by the physics rate (60) is a great visual VRR-debugging framerate. If stutters disappear during mouse/keyboard movements, good job! (A 55fps cap should look like perfect 55Hz on a VRR display.)
Most cheap gaming monitors support at least FreeSync 48Hz-75Hz, so 55fps is a good cheap office debugging / QA department standard -- many testers are issued only garden-variety 60Hz screens, but some of those already support FreeSync 48Hz-75Hz.
Even most fixed-Hz 60Hz screens can do a 50Hz custom resolution (create the custom resolution in NVIDIA Control Panel). So you can still use an ordinary office monitor to predict stutter-fixes for VRR and high-Hz.
The moral of the story: *IF* 55fps looks smooth despite 60Hz physics, it will probably look smooth at all frame rates and VRR framerate fluctuations, so the "55fps test" is a good Unity-engine predictor of high-Hz and VRR compatibility.
______________
UPDATE #2:
I think this is the definitive G-SYNC / FreeSync compatible release:
https://steamcommunity.com/games/746850/announcements/detail/2541668199224753633
So, how about Latency Mode (NVIDIA Control Panel)? What would you choose -- Off, On, or Ultra?