Lossless Scaling

Kota Jun 19, 2024 @ 6:44am
Does it put more strain on the CPU or less?
title
Gizzmoe Jun 19, 2024 @ 7:05am 
Way less. Here's a good example:
https://steamcommunity.com/app/993090/discussions/0/4338735599617229062/#c4338735599617677680

"I can cap all games at 55fps and EVERY game will run at 165fps now while making more use of my GPU instead of my CPU which has decreased my thermals massively.

Trying to run all games at native 165fps had my 4080 laptop running at 80c pretty much nonstop, even though I was using turbo limits and a solid af undervolt. Now, running any game at max settings/165fps, the laptop sits at 60c (CPU) and 55c (GPU).. so yeah. HUGE bloody difference. I think if you're on a mobile RTX 3000/4000 laptop this app is a must have for better thermals."
alumlovescake Jun 19, 2024 @ 10:53am 
More because using a lower resolution is making you more cpu limited.
Xavvy Jun 19, 2024 @ 5:55pm 
Originally posted by alumlovescake:
More because using a lower resolution is making you more cpu limited.

This isn't at all true when we're talking about frame gen. Interpolated frames affect CPU usage in a very minor way compared to raw frame output. Your experience would only become more CPU limited if you were pushing raw (non-interpolated) frame rates to the max of what your GPU/CPU can potentially output. For example, if your CPU is only capable of pushing 120fps in a game regardless of settings, that is your max frame threshold (and bottleneck calculators are terrible at actually calculating this).

A good example is Darktide, a fairly CPU limited game. An RTX 4080 with a 7800X3D is only capable of getting just shy of 120fps in this game regardless of settings. If you lower settings all the way down, even with max upscaling and proprietary frame gen, you will only ever see maybe, at best, a stable 90-120fps, sacrificing all visual fidelity for only a minor improvement in frame pacing stability. However, if you maxed all settings you would still see frame rates in the same stable 90-120 range, most often close to 120fps, with a few more dips than at lowered settings, but not to any drastic degree that would make it worth it.

So lower settings netted you no real tangible difference. This is because you are CPU limited and not GPU limited; all you did by lowering graphics settings is reduce GPU usage, but your "performance wall", as they call it, is really about 90fps, the frame rate you can reliably hit across all situations no matter the settings used. Now, in the case of frame gen like LSFG, this changes things. Since your base frame rate stays well below that performance wall, you are in fact using less CPU and more GPU. You will see a small increase in CPU usage, because whenever the GPU is used the CPU is too (any call goes through the CPU first), but that increase is minor.
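
To put rough numbers on that "performance wall" idea, here's a minimal Python sketch (the fps figures are made up purely for illustration, not measurements from any game):

def achievable_fps(cpu_cap_fps: float, gpu_cap_fps: float) -> float:
    # Whichever side is slower sets the real frame rate.
    return min(cpu_cap_fps, gpu_cap_fps)

cpu_wall = 90.0  # CPU can only prepare ~90 frames/s in this game, regardless of settings

# Lowering graphics settings only raises the GPU's ceiling, never the CPU's:
print(achievable_fps(cpu_wall, gpu_cap_fps=240.0))  # low settings -> 90.0
print(achievable_fps(cpu_wall, gpu_cap_fps=110.0))  # max settings -> 90.0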

In my tests on my 13900HX / RTX 4080 laptop (as you can see in the example Gizzmoe linked), my CPU usage went right down, because interpolated frames use FAR more GPU than CPU. My CPU usage went up a total of about 6% with interpolation, versus 62% trying to run those frames natively. Lowering resolution made a negligible difference; I could run anything from 1080p to 6K DLDSR and it wouldn't matter much, because until my base 55fps cap becomes my "performance wall", nothing changes.

In so many games, 165fps was not achievable using proprietary methods like DLSS 3 FG or FSR 3 FG. LSFG was the only way to hit the 165fps target in 100% of scenarios. I'm not a mathematician, but 24% CPU usage (LSFG 2.1, 55fps base, 165fps output) versus 78% (trying for native 165fps, which couldn't even be reached; the frame rate capped at 147 in the particular game I was testing, Borderlands 3) means the CPU is used a LOT less in the case of LSFG.
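
If you want the arithmetic spelled out, here's a small sketch. The 3x multiplier is inferred from the 55fps base and 165fps output above, and the CPU percentages are the figures quoted there, not something you can derive from first principles:

TARGET_FPS = 165     # monitor refresh / desired output
LSFG_MULTIPLIER = 3  # X3 mode: two interpolated frames per rendered frame

base_fps_needed = TARGET_FPS / LSFG_MULTIPLIER
print(base_fps_needed)  # 55.0 rendered frames per second

# The CPU only has to feed those 55 rendered frames per second; the other
# 110 frames/s are interpolated on the GPU. Hence roughly 24% CPU with LSFG
# versus 78% CPU when chasing 165fps natively, per the numbers above.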

To sum things up: the only way this uses more CPU to the degree that it threatens your targets is if your base FPS is at or beyond your performance wall. Otherwise, using this will reduce CPU usage, not increase it, versus running the frame rate natively. This is a verifiable fact. Not sure why you're saying this will increase CPU usage vs the native rate; it flat out will not.
Last edited by Xavvy; Jun 19, 2024 @ 6:04pm
Rain Sep 13, 2024 @ 7:47pm 
Man do I appreciate comments like Xavvy's. Taking that much time so people like me can understand better.
Goby Nov 21, 2024 @ 4:01am 
Yeah man, I logged in just to say thank you to Xavvy too. Stop spreading misinformation, alumlovescake. Thanks Xavvy
DMZ Nov 28, 2024 @ 6:43am 
Originally posted by alumlovescake:
More because using a lower resolution is making you more cpu limited.
The CPU has to work harder _relative_ to the GPU at lower res, but the overall per-frame workload still drastically decreases (hence why you get higher fps at lower res and settings).

So this is partially true, but Xavvy provides the other 70% of the explanation: it is limiting in the sense that you force the GPU to do more work at the same level of CPU usage, thus "shifting" the bottleneck away from the CPU to some extent. But it doesn't change anything on the CPU side aside from some additional processing needed for controlling LSFG.
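
To illustrate that shift in frame-time terms (a sketch with invented numbers; only the max() relationship matters):

def fps(cpu_ms: float, gpu_ms: float) -> float:
    # Each frame is gated by whichever component takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0                      # per-frame CPU cost, roughly resolution-independent
print(fps(cpu_ms, gpu_ms=16.0))   # high res: GPU-bound, ~62 fps
print(fps(cpu_ms, gpu_ms=6.0))    # low res:  CPU is now the limit, ~125 fps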