LS1 has a much higher GPU load than NIS, maybe with LS your GPU was overworked in that test? Try to keep the GPU load at around 85% to minimize lag.
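To put a rough number on that 85% target, here is a minimal sketch (assuming GPU load scales roughly linearly with framerate, which is a simplification; the 200 fps figure is made up for illustration):

uncapped_fps = 200        # assumed fps your game reaches with the GPU near 100%
target_load = 0.85        # the ~85% GPU load target from the advice above

fps_cap = uncapped_fps * target_load   # cap that should land near ~85% load
print(f"Suggested framerate cap: ~{fps_cap:.0f} fps")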
I have a very decent GPU (4070 Ti) and didn't notice much difference between scaling methods regarding input latency. But if I had to choose, I'd say NIS is the fastest, which makes sense since I have an Nvidia GPU.
Edit: forgot to mention, I'm a controller player (not mkb).
If your GPU is maxed out (which often happens with an unlocked framerate), LS will take longer to generate your frames and you will end up with way more latency. By enabling vsync in-game, you lock the game's framerate to your screen's refresh rate, so your GPU has more "waiting time" it can dedicate to Lossless Scaling.
This might also be why LS1 gives you worse input lag: you might be on the edge of maxing out your GPU, and LS1 is just enough heavier than NIS to push you over the limit.
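To illustrate the "waiting time" idea with some back-of-envelope numbers (the 180 fps and 144 Hz values are hypothetical, not measured): if the game alone renders faster than the refresh rate, capping it with vsync leaves idle GPU time each frame that LS can use for its own pass.

uncapped_fps = 180          # assumed max fps the game reaches uncapped
refresh_hz = 144            # assumed monitor refresh rate (vsync cap)

game_frame_ms = 1000 / uncapped_fps      # GPU time the game needs per frame (~5.6 ms)
budget_ms = 1000 / refresh_hz            # time available per displayed frame (~6.9 ms)
headroom_ms = budget_ms - game_frame_ms  # leftover time LS can use (~1.4 ms)

print(f"game render: {game_frame_ms:.1f} ms, budget: {budget_ms:.1f} ms, headroom for LS: {headroom_ms:.1f} ms")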
You can max out a GPU several ways: use epic settings in-game and/or framegen / upscaling on top of it.
But even when my GPU isn't maxed out, I noticed a little difference, depending on where I am in-game...
Also, it's my understanding that vsync acts as a buffer and queues frames, no?
A.k.a. double buffering / triple buffering...
Upscaling with LS will always add a bit of latency, because the software works separately from your game; it's not an integrated solution. It needs to wait for the frame to be fully completed and displayed by your game/software, THEN it upscales it before finally (re)displaying it on top.
But the lag introduced by image upscaling usually isn't too noticeable, as long as your GPU is powerful enough and isn't maxed out.
Also yes, vsync will add latency if implemented correctly: 1 frame of lag for double buffering and 2 for triple buffering (but with the added benefit of not seeing your framerate drop to half if you can't keep it high enough).
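For a rough sense of scale, that translates into milliseconds like this (simple arithmetic using the 1-frame / 2-frame figures above; the refresh rates are just common examples):

def vsync_lag_ms(refresh_hz, buffered_frames):
    # added latency in ms = queued frames * frame time at that refresh rate
    return buffered_frames * 1000 / refresh_hz

for hz in (60, 120, 144):
    double = vsync_lag_ms(hz, 1)   # double buffering: 1 frame queued
    triple = vsync_lag_ms(hz, 2)   # triple buffering: 2 frames queued
    print(f"{hz} Hz: double buffer ~ {double:.1f} ms, triple buffer ~ {triple:.1f} ms")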
The odd part about all this is...
I get the lowest input lag when using vsync!? It should be the other way around, i.e. vsync = disabled... 🤔
The difference is barely noticeable, but it's there imo.
That's what I'm trying to figure out...
Edit: I suspect some of my custom Nvidia 3D settings might have something to do with it, e.g. vsync, latency mode, etc.