Lossless Scaling

x2/3/4 latency test using Frame Latency Meter
Last edited by Gizzmoe; 14 Aug 2024, 8:55
That's a very neat program; I've been using Intel PresentMon to get a ballpark latency figure. Definitely going to have a fiddle with FLM.

The results in the video confirm my suspicion that there is no difference between x2, x3 and x4. There wouldn't be, you need the same data for x2, x3 and x4, namely frame A and frame B.

I think the latency difference between them that people perceive comes from x3 usually being used with a lower base framerate, plus the larger disconnect between input and output at higher interpolation factors. Your eyes tell you to expect the latency typically associated with 120 FPS, but your brain perceives 40 FPS input and present latency, plus the added FG latency.
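
A rough sketch of the "same data" point, as a toy model: it only assumes that interpolation can't begin until frame B has been captured, and the 1 ms generation time is an invented placeholder rather than a measured LSFG number.

```python
# Toy model, not LSFG's actual scheduler: whichever factor is used, an
# interpolated frame between A and B can only be generated once B exists,
# so the earliest moment new visual information can appear is identical
# for x2, x3 and x4.
def earliest_new_info_ms(base_fps, gen_time_ms=1.0):  # gen time is made up
    frame_interval = 1000.0 / base_fps  # time between real frames A and B
    capture_b = frame_interval          # B arrives one interval after A
    return capture_b + gen_time_ms      # does not depend on the factor

for factor in (2, 3, 4):
    print(f"x{factor}: earliest new info {earliest_new_info_ms(60):.2f} ms after frame A")
```
Where the generated frames then actually land on screen depends on pacing, but the capture-side wait for frame B is the same either way.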

Very cool find!
Is it different from SK's built-in latency analyzer?
Good question, I haven't tried SK's analyzer yet.
Originally posted by Abdulah:
Is it different from SK's built-in latency analyzer?
I reckon SpecialK can measure latency without FG more accurately, as it can inject and presumably read Reflex markers. I don't believe Frame Latency Meter does.

Frame Latency Meter does, however, measure visually from generated mouse input to detected movement. If SpecialK doesn't measure visually, I reckon that makes FLM more accurate for measuring the latency added by frame interpolation. For that reason I'd use it to measure changes I'm making to limiting/syncing-related settings.

So:
- SpecialK: presumably more accurate for estimated total latency, at least without FG.
- FLM: presumably more accurate at measuring the latency added by FG.
When I was testing with SK, the latency made a huge jump when I turned on LSFG, so I think it is actually measuring it?
Originally posted by Abdulah:
When I was testing with SK, the latency made a huge jump when I turned on LSFG, so I think it is actually measuring it?
Nice, you could compare the added latency measured with SpecialK and FLM.
FLM isn't that complicated to use; just make sure you keep the FG mode enabled in FLM regardless of whether FG is actually on, as it stumbled on my end without it.

You could even compare to PresentMon if it interests you. And on top of that, there's Reflex latency analysis on a monitor with a G-SYNC module for the gold standard, if you have access.
Special K is such an amazing little tool. I always have it installed on every PC. I know some think it's bloatware.. it's really not; that was ironed out years ago. Tbh its new HDR emulation is second to none. I find myself using it more often than RTX HDR because A) Nvidia's method of using an overlay to inject it is completely brain dead and ends up glitching out in half the games, not continuously applying, or lacking the rest of the shader spectrum because of compatibility issues.. and B) it's so inconsistent getting it to work with Lossless/DLDSR.

The biggest issue I have with part A) is that the HDR Details shader is too damned good with RTX HDR + Vibrance and it's only usable in half the games.. without it I find RTX HDR kind of lacklustre. Whereas Special K + ReShade (though more steps are involved in setting it up) has way more accurate customisation. In fact I've had Special K-configured HDR work better, as far as implementation goes, than some games' native approaches (so many games have bad HDR implementations with poor customisation).

Also, Special K has so many other amazing features. Being able to enable Resizable BAR or add G-SYNC to games that didn't come with it stock is also very nice.
Last edited by Xavvy; 16 Aug 2024, 8:54
It can also inject Reflex (although based on latency analysis in SK, in-game Reflex vs RTSS Reflex vs NVCP Reflex are all pretty close).

But yeah, SK HDR is insane. After native HDR, SK HDR and RTX HDR go head to head, and there's no reason not to use SK HDR given the basically 0 perf hit plus being able to work with LSFG.
Originally posted by Abdulah:
It can also inject Reflex (although based on latency analysis in SK, in-game Reflex vs RTSS Reflex vs NVCP Reflex are all pretty close).

But yeah, SK HDR is insane. After native HDR, SK HDR and RTX HDR go head to head, and there's no reason not to use SK HDR given the basically 0 perf hit plus being able to work with LSFG.

Yeah, as insane as RTX HDR can be when you tweak it just right (I do believe RTX HDR has a much higher ceiling for HDR quality than SK, but it requires completely different settings in every game and it takes too long to test it out and get it just right), it's just unreliable.. Like, SK works all the time and it's easy to configure.

RTX HDR for sure has a much higher ceiling for quality. Once you get it perfect for a game it nearly rivals even the best implementations.. but the caveats are huge. A) RTX HDR is a wholly unreliable dog sh!t mess because of its stupid af overlay method of application (like, really Nvidia? Ditch the stupid af overlay already). If it doesn't glitch out and crash, it has issues applying reliably in so many games.

B) The performance cost of RTX HDR with the appropriate must-use shaders (RTX Vibrance and RTX Details) is quite literally insane (yes, Details is actually a shader meant for RTX HDR/Vibrance; the guy who now makes ReShade shaders, who used to work for them, spilled the beans. Details was basically a shader built for RTX HDR/Vibrance.. they had RTX HDR/Vibrance literally 7 years ago, but it was never released because it never worked properly).

I've had properly set up RTX HDR with the shaders look drop-dead gorgeous, better than some of the best native implementations, while at the same time costing basically the same as fuqqing ray tracing (like, what the actual fuq is that).. No joke, in one game I had it looking so good for nearly a 38% GPU cost on a 3080 Ti. Like, what even is that? That's just not okay. It needs serious efficiency fixes because that's terrible, and the number really varies from game to game. Some games it only costs 3-6%, and some games it costs nearly 40% of your GPU.. that is so messed up I can't even..

So yeah, they originally designed the Details shader for RTX HDR, but it was then changed to work with everything because it was a solid shader and could help with Auto HDR in games as well. So if you're wondering why it's now an unusable mess all of a sudden.. that's why. They changed it back so that it syncs with RTX HDR/Vibrance again LOL.

Lastly, C) RTX HDR doesn't work in a lot of games, whereas with tweaking there isn't a single game I haven't got Special K to work in. There are SOOO many games that RTX HDR either refuses to register in, or if it does, it refuses to register RTX Vibrance and RTX Details.. and if you can't use RTX Vibrance/Details you might as well not even use it, because RTX HDR on its own doesn't hold a candle to Special K HDR. And yeah, the biggest seller is obviously the fact that Special K is basically not far behind a well-done native HDR implementation, with the performance hit of Auto HDR (which is in most cases sh!t).
Last edited by Xavvy; 18 Aug 2024, 2:47
Originally posted by Xavvy:
Originally posted by Abdulah:
B) The performance cost of RTX HDR with the appropriate must-use shaders (RTX Vibrance and RTX Details) is quite literally insane (yes, Details is actually a shader meant for RTX HDR/Vibrance; the guy who now makes ReShade shaders, who used to work for them, spilled the beans. Details was basically a shader built for RTX HDR/Vibrance.. they had RTX HDR/Vibrance literally 7 years ago, but it was never released because it never worked properly).

Wow, I had no clue about this; I thought we only use RTX HDR. RTX HDR on its own is a decent perf hit, so I imagine adding RTX Vibrance and Details is even more.
That's why I refunded. Nearly 1s of latency in Helldivers 2 for me; unplayable using LS.
Originally posted by Sevazinho☄:
That's why I refunded. Nearly 1s of latency in Helldivers 2 for me; unplayable using LS.
H*ly sh*t, how? What was your base FPS and interpolation factor? I get nowhere near that; I doubt I could if I tried.
Last edited by Spook; 18 Aug 2024, 13:21
Originally posted by Spook:
The results in the video confirm my suspicion that there is no difference between x2, x3 and x4. There wouldn't be, you need the same data for x2, x3 and x4, namely frame A and frame B.

Hm. Actually x4 should have the least latency, and x2 the most. With the same base FPS, x4 generates more intermediate frames and as a result the first of those intermediate frames needs to be shown sooner compared to x2 or x3, resulting in more immediate visual feedback. So less input lag.

And this is how it actually feels to me. I use a 240Hz display, and with a 58 FPS cap (to stay well within G-SYNC range at 4x), 2x feels laggy, 3x feels slightly less laggy but it's difficult to tell, and 4x definitely feels less laggy.

A 116 FPS base with 2x to get 232 FPS output feels even better, though. I would imagine that if I had a 480Hz monitor, 120 base with 4x would probably be even better.
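
For what it's worth, here's a minimal sketch of that reasoning, assuming LSFG spaces its output frames evenly across the base frame interval so the first generated frame lands one output interval after the new real frame. That pacing is my assumption rather than documented behaviour, and it ignores the interpolation compute time.

```python
# Toy pacing model (an assumption, not LSFG's documented behaviour):
# output frames are spaced at base_interval / factor, so the first
# generated frame shows up one output interval after the new real frame.
def first_generated_frame_ms(base_fps, factor):
    base_interval = 1000.0 / base_fps  # ms between real frames
    return base_interval / factor      # delay to the first generated frame

for factor in (2, 3, 4):
    delay = first_generated_frame_ms(58, factor)
    print(f"58 FPS base, x{factor}: first generated frame ~{delay:.2f} ms after the real frame")
# -> ~8.62 ms (x2), ~5.75 ms (x3), ~4.31 ms (x4)
```
Under that assumption the spread between x2 and x4 at a 58 FPS base is only about 4 ms.
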
Originally posted by I Am Not Amused:
Originally posted by Spook:
The results in the video confirm my suspicion that there is no difference between x2, x3 and x4. There wouldn't be, you need the same data for x2, x3 and x4, namely frame A and frame B.

Hm. Actually x4 should have the least latency, and x2 the most. With the same base FPS, x4 generates more intermediate frames and as a result the first of those intermediate frames needs to be shown sooner compared to x2 or x3, resulting in more immediate visual feedback. So less input lag.

I've since realised the same thing.

And this is how it actually feels to me. I use a 240Hz display, and with a 58 FPS cap (to stay well within G-SYNC range at 4x), 2x feels laggy, 3x feels slightly less laggy but it's difficult to tell, and 4x definitely feels less laggy.

I do wonder how much of that can be chalked up to our brain associating a higher-FPS visual with lower latency. The difference between 60x2 and 60x4 should theoretically be a mere 4.17ms, or 1 frame at 240Hz, if my thinking on this is correct. That is probably close to impossible to identify in a blind test.

Do you agree with the 4.17ms?
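
For reference, the figure does fall out of the simple even-pacing assumption used above (the first generated frame landing one output interval after the real frame): at a 60 FPS base the frame interval is 1000 / 60 ≈ 16.67 ms, so the first generated frame would appear ~16.67 / 2 ≈ 8.33 ms after the real frame with x2 and ~16.67 / 4 ≈ 4.17 ms with x4, a gap of ~4.17 ms, i.e. one refresh at 240Hz. Whether LSFG actually paces frames this way is an assumption, and any interpolation compute time would add equally to both figures.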
Last edited by Spook; 21 Aug 2024, 5:29
It's a shame it has so much latency. I tried to use it for MP games but you can immediately feel your mouse dragging.

Posted: 13 Aug 2024, 4:28
Comments: 18