"You don't have to limit your fps in these exact multiples. You can still use any fps limit within the limits of your GPU. Eg. 65fps limit on 144hz display is completely fine."
You should explain that this can lead to uneven frame pacing and jittering; in that case, use exact multiples.
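The "exact multiples" point is just divisibility: a cap paces evenly only when the refresh rate divides by it cleanly. A small illustrative sketch (the function name is mine, not from any tool):

```python
# Illustrative check: does an fps cap divide the refresh rate evenly?
# Exact divisors give even frame pacing; others can cause jitter.

def is_exact_multiple(refresh_hz: int, fps_cap: int) -> bool:
    """True if the refresh rate is an exact multiple of the fps cap."""
    return refresh_hz % fps_cap == 0

print(is_exact_multiple(144, 72))  # True  -> 72fps paces evenly on 144Hz
print(is_exact_multiple(144, 65))  # False -> 65fps may jitter on 144Hz
```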
"3➽ Try enabling LSFG again when around 60% GPU load."
The overhead that is needed is highly dependent on the GPU, the framegen setting, and the resolution. Better advice would be to calibrate the graphics settings while framegen is running, with the help of Draw FPS and/or RTSS, so you can see the immediate results and don't need to guess.
That's not correct. Going from 40 to 60 is a 50% fps increase, after all. On my system at 1440p in The Witcher 3:
40/120 x3 Performance Mode Off: 50% GPU usage
60/120 x2 Performance Mode Off: 62% GPU usage
60/120 x2 Performance Mode On: 55% GPU usage
Like I said, that's a bad way to optimize things. A good, detailed guide should contain the following basic information; reuse it if you like:
Decide which framerate you want to reach in a given game. If it's, for example, 120fps, check in your display settings whether your monitor supports that refresh rate, then set it to 120Hz; if not, use another Hz setting. The correct Hz setting and a correct framerate cap are important for frame-pacing and sync reasons.
Cap your fps, either in-game, with the driver, or with RTSS, to half of your refresh rate if you want to use x2 framegen, or to a third for x3, so 60/120 for x2 or 40/120 for x3. Simply try both to see whether you prefer x2 or x3. If you use x2, try Performance Mode On; it uses fewer GPU resources and looks good if you have a decent base framerate. For x3 it's best not to use Performance Mode.
To see how well LS performs enable the Draw FPS counter in LS and check the numbers in-game. To stay with the 120fps example, if you set your monitor to 120hz and a framecap of 60fps with x2 framegen, a 60/120 framecounter would mean that your system can handle it, the first number is the base framerate, the second the fps you see.
If you only see something like 40/80, it means you're GPU-limited (or in some games CPU-limited); in that case, either lower your graphics settings or resolution to reach the desired 60/120. If that doesn't bring the desired results, use a lower refresh rate, like 100Hz, and then set your framecap to either 33fps for x3 or 50fps for x2 for that 100Hz. Rinse and repeat until you find the perfect settings for your system and the game you want to play.
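The cap math in the steps above can be sketched like this (illustrative Python; the function name is mine and not part of Lossless Scaling):

```python
# Hedged sketch of the calibration math described above: given a target
# refresh rate and a framegen multiplier (x2 or x3), derive the base
# framerate cap you should set in-game, in the driver, or in RTSS.

def base_fps_cap(refresh_hz: int, multiplier: int) -> int:
    """Base framerate cap = refresh rate divided by the framegen multiplier."""
    return refresh_hz // multiplier

# 120Hz examples from the text: 60/120 for x2, 40/120 for x3
print(base_fps_cap(120, 2))  # 60
print(base_fps_cap(120, 3))  # 40

# Fallback to 100Hz if the GPU can't keep up: 50 for x2, 33 for x3
print(base_fps_cap(100, 2))  # 50
print(base_fps_cap(100, 3))  # 33
```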
Frame-pacing is not as important as it is claimed to be. I have used it on 3 different systems so far, and the difference between exactly halving the fps limit and using a custom limit has been barely noticeable.
There is also absolutely no need to include "enable the Draw FPS counter", because it is on by default. The way you explain it mashes all the information together. For people to understand it more easily, you need to divide the information into simple sections and not include more than one aspect per section, which is what this guide intends to do.
The troubleshooting order also goes from best visuals to worst. There is really no need to change it further or explain it in more detail. Actively monitoring definitely helps, as you said, which is what I changed, but further changes are unnecessary.
It's neither short nor simple. Nor is it detailed for the almost 8,000 characters it has, and it contains/contained wrong information.
That was all pointed out in this thread already, read the posts.
You are also missing that x2 and x3 performance doesn't just mean GPU usage; it also means visual and output quality. With x3 at the same target, there are fewer real frames to work with, so there will be a lot more latency and artifacting issues. So what I said stands true: you will have better performance with the x2 option compared to x3, because LSFG will perform better.
Glad it's working for you; unfortunately, this is not always the case. There have been a number of people I helped troubleshoot where G-Sync did not disable itself, and many claimed to have gotten it working while using LSFG. In the end it was causing random fps fluctuations, because LSFG was trying to decide whether the G-Sync frames or the in-game frames were the real ones to take as the base for framegen. It was practically the same for FreeSync and some other apps, like the AMD Instant View feature or GeForce Experience's ShadowPlay highlight feature.
I won't mix up the two.