Lossless Scaling

Latency on AMD card
Hi everyone! I've used LS almost every day for the past 2 years and I really like the software, but I get too much latency when using it below 60 base fps, like in The Last of Us, where I can't reach 60 base fps + LSFG without turning all the settings down to low. I have a Vega 64 16GB and I play at 3440x1440 (21:9), so obviously I use FSR Balanced to get more fps, and I can hit 45 fps with medium-high settings.

Is there a way to reduce LSFG latency? I can't use AMD Anti-Lag through the driver software, because I need the Pro drivers, which don't have the option.

My profile settings:

LSFG 3.0 x3
50% res scale
Sync mode: off, allow tearing
Max frame latency: 1
Capture API: DXGI
Scaling type: off

I would really appreciate tips on the matter. I also want to share my findings about latency, measured with FLM (Frame Latency Meter):

60 fps native: 29-32 ms input latency
60 base fps + lsfg x2: 36-42 ms input latency
60 base fps + lsfg x3: 42-45 ms input latency

I tested this at 60 base fps using FSR Performance and low settings (The Last of Us).
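
If it helps anyone read the numbers, here's a tiny Python sketch (just my own back-of-the-envelope math, not part of FLM) that takes the midpoint of each measured range and subtracts the native baseline:

measurements_ms = {
    "60 fps native": (29, 32),
    "60 base + LSFG x2": (36, 42),
    "60 base + LSFG x3": (42, 45),
}

# midpoint of the native range is the baseline (~30.5 ms)
baseline = sum(measurements_ms["60 fps native"]) / 2

for mode, (lo, hi) in measurements_ms.items():
    mid = (lo + hi) / 2
    print(f"{mode}: ~{mid:.1f} ms (+{mid - baseline:.1f} ms vs native)")

So x2 adds roughly 8-9 ms over native and x3 roughly 13 ms, both less than one full 16.7 ms frame at 60 fps.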

Thanks in advance everyone for the help!
I remember reading that for AMD cards you actually want to raise the "max frame latency" setting a bit, so it's worth playing around with.
Also, make sure GPU utilization stays below 85%.
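
As I understand it (my assumption of how the setting behaves, based on how DXGI frame queuing generally works), each unit of max frame latency lets one more frame sit in the queue ahead of presentation, so each step trades up to one base frame time of latency for smoother pacing:

base_fps = 60
frame_time_ms = 1000 / base_fps  # ~16.7 ms per real frame at 60 base fps

# worst-case extra delay from queued frames at each setting
for max_frame_latency in (1, 2, 3):
    worst_case_ms = max_frame_latency * frame_time_ms
    print(f"max frame latency {max_frame_latency}: up to ~{worst_case_ms:.1f} ms queued")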
Yep, I'll definitely try that, thanks.
So, I may have found what the problem was. I wrote a detailed explanation in this thread:

(https://steamcommunity.com/app/993090/discussions/0/598517164755968133/)

In short: the problem with The Last of Us was frame pacing. This game doesn't really produce a flat fps line, so the input lag spikes up at times and it feels laggier than it really is. For example, in Hell Let Loose, where I can get a flat fps line, I measured an increase of 5 ms input latency at 60 base fps x2, and at x4 an increase of 10 ms over the base (see details in the other thread).
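
To illustrate the frame pacing point (made-up frame times, just to show the idea): both of these runs average 60 fps, but the spiky one has 35 ms worst frames, and that's what you feel:

flat = [16.7] * 10                                # Hell Let Loose-style flat line
spiky = [12, 12, 12, 35, 12, 12, 12, 35, 12, 13]  # TLOU-style spikes (invented numbers)

for name, frame_times in (("flat", flat), ("spiky", spiky)):
    avg_fps = 1000 / (sum(frame_times) / len(frame_times))
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(frame_times)} ms")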