For me, what works best is enabling VSync for Lossless Scaling via the AMD/NVIDIA driver settings, disabling it in the game, disabling it in the LS settings, and enabling "Allow Tearing". But you have to experiment with it a bit to find the best fit.
For frame generation:
You need to cap the FPS at half of your monitor's refresh rate, for instance with RivaTuner; sometimes games also have a built-in option for it. If you have a 60 Hz monitor, cap the game to 30 FPS if you want to use frame generation.
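The arithmetic behind that cap is trivial, but a minimal sketch may help keep the targets straight. This is plain Python with an assumed `refresh_rate_hz` input; it is not tied to RivaTuner or Lossless Scaling in any way, it just reproduces the half-refresh figures mentioned in the thread:

```python
def frame_gen_fps_cap(refresh_rate_hz: int) -> int:
    """FPS cap for 2x frame generation: the game renders at half the
    monitor's refresh rate and frame generation fills in the rest."""
    return refresh_rate_hz // 2


if __name__ == "__main__":
    # Examples from the thread: 60 Hz -> cap at 30 FPS, 144 Hz -> cap at 72 FPS.
    for hz in (60, 144):
        print(f"{hz} Hz monitor -> cap the game at {frame_gen_fps_cap(hz)} FPS")
```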
With frame generation, I always lock the FPS at 72 because my monitor is 144 Hz, using RivaTuner or the in-game limiter when the game offers one.
However, I am using VSync in game, and I always have doubts about whether it is actually having any effect.
I use G-Sync compatibility, with VSync off in game and VSync turned on in the NVIDIA Control Panel. I don't know why I also have to enable it in Lossless Scaling, but what do I know.
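For reference, the two sync setups described in this thread differ mainly in where VSync is enabled. The sketch below is only an illustrative side-by-side summary; the keys are descriptive labels, not actual driver or Lossless Scaling option names:

```python
# Illustrative comparison of the two sync setups described above.
# Keys are descriptive labels only, not real driver or LS setting names.
setups = {
    "driver_vsync_plus_tearing": {
        "driver_vsync": "on (applied to the Lossless Scaling window)",
        "in_game_vsync": "off",
        "ls_vsync": "off",
        "ls_allow_tearing": "on",
    },
    "gsync_compatible": {
        "gsync_compatible": "on",
        "nvcp_vsync": "on",
        "in_game_vsync": "off",
        "ls_vsync": "on",
    },
}

if __name__ == "__main__":
    for name, settings in setups.items():
        print(name)
        for key, value in settings.items():
            print(f"  {key}: {value}")
```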