Lossless Scaling

What even is this black magic?
Seriously, I've spent the last couple of hours trying to understand how, if I run a game at 30 fps and then have the app double the frames, I get 60 fps. I understand that it inserts fake frames, but how does that actually make things faster??? Do you know what I mean? If the game runs at 30 fps you'd expect 30 fps game speed, yet with this you can apparently double your performance???

Also, the resolution scaling is just straight-up black magic, this is actually amazing. I had a lot of trouble running most games at 60 fps due to having a pretty old GPU, but now all my problems are gone with this. Again... ???
Showing 1-15 of 21 comments
I dunno, but my first impression of the tool is very bad. So bad that I deemed it unusable for the game I'm playing (Planet Crafter): lots of artifacts at the edges of the screen and on textures with a fine grid-like pattern, like base-building structures. So FG was a no-go.

Then I tried just the lossless scaling option, but that too resulted in issues like inconsistent frame pacing and microstuttering (with 40% GPU/CPU headroom), so I just went back to using exclusive fullscreen mode.

I'll give it the benefit of the doubt since it's a Unity game, and those usually run like crap to begin with.

edit: looks like restarting my PC somehow fixed the weird mouse lag I had in menus, so I may have spoken too soon
Last edited by GrandTickler; Jan 10 @ 2:12pm
The thing is, frame generation doesn't boost performance, it just improves perceived fluidity.
60 native FPS is distinct from 30 FPS boosted to 60 even before you get into artifacts; the game logic is still running at whatever base framerate it's running at, and input latency is still the same as it would be at 30 FPS.

The former point can be advantageous if the game doesn't allow for uncapping the framerate or if doing so breaks things, but the latter means it's still better to have 60 base FPS if you can.

Not to say I'm knocking enjoying frame gen; whatever floats your boat, just to encourage you to understand what it is.
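A back-of-envelope sketch of that latency point (the numbers are assumptions for illustration, not measurements of Lossless Scaling itself): interpolation has to hold back at least one real frame before it can blend, so input lag lands at or above the base framerate's frame time, never at the target framerate's.

```python
# Toy latency model (assumed, for illustration only): an interpolated
# frame needs the *next* real frame before it can be shown, so frame-gen
# holds back roughly one extra base frame of latency.

def frame_time_ms(fps: float) -> float:
    """Time one frame stays on screen, in milliseconds."""
    return 1000.0 / fps

native_60 = frame_time_ms(60)                    # ~16.7 ms
base_30 = frame_time_ms(30)                      # ~33.3 ms
framegen_30_to_60 = base_30 + frame_time_ms(30)  # hold-back: ~66.7 ms

print(f"native 60 fps:   ~{native_60:.1f} ms per frame")
print(f"native 30 fps:   ~{base_30:.1f} ms per frame")
print(f"30->60 framegen: ~{framegen_30_to_60:.1f} ms before a real frame shows")
```

Exact figures vary with the implementation and queueing, but the ordering (native 60 below native 30 below 30-to-60 framegen) is the point being made above.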
Last edited by Space Detective; Jan 10 @ 2:48pm
Sora Jan 10 @ 2:56pm 
voodoo magic
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

honestly really getting tired of these people. I hope they also make sure to tweak every game's INI file to completely disable LODs and occlusion culling and distant shadow settings, etc, etc, etc. You're not going to fall for Fake Performance and Fake Detail and Fake Shadows like you're some kind of fool, right?

Like... if they're wrong in a different way but at least consistent about it, fine, I can even respect that. But these "oh, every other performance-improving trick we've been using for 20+ years is fair game, but this new one is icky because a contrarian millionaire influencer told me so" people... I hate being reminded that half the people on earth are dumber than average every time I wade into a GPU discussion these last few years.

2+2=4
1+3=4
3+1=4.
Last edited by Polysorbate; Jan 10 @ 3:08pm
Sora Jan 10 @ 3:05pm 
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

damn chill bro xD
Gizzmoe Jan 10 @ 3:13pm 
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

honestly really getting tired of these people.

He just posted a fact, get over it.
Originally posted by Polysorbate:
honestly really getting tired of these people. I hope they also make sure to tweak every game's INI file to completely disable LODs and occlusion culling and distant shadow settings, etc, etc, etc.
Maybe I do, if my system is sufficiently overkill for a game. I know I go out of my way to disable depth-of-field effects, because I hate them in most implementations.

Also if you genuinely can't notice any latency difference between 30 and 60 FPS (or even lower than 30 FPS), then all I can say is some people would envy you.
Last edited by Space Detective; Jan 10 @ 3:21pm
kalirion Jan 10 @ 3:22pm 
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

Visual artifacts and increased input lag (more input lag than without framegen, and far more than the game would have if the "final" framerate were real native frames) keep it from being identical. That's why everyone is making fun of NVIDIA's claim that "the 5070 has the performance of a 4090".
Last edited by kalirion; Jan 10 @ 3:22pm
Christinson Jan 10 @ 9:51pm 
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

Yes, because framerate is more than just looks. The point of a higher framerate is to get more information per second, so you have more chances to perceive and input each second.

Fake frames don't give you more information, or any relevant information at all. They just make it look better, and actually make it feel worse because of input delay (although how much this matters depends on the fps, the kind of game you're playing, etc.).

People need to know the difference, because as an 'extreme' example, 120 fps generated from 60 feels VERY different to play than native 120, or even native 60 for that matter, in games that require semi-precise inputs. I like frame generation, but it's not "free fps", because it feels awful to use (at least for me) in any action title.

To sum it up, it's fine whether you love or hate FG, but it's very important people know the difference; otherwise you'll have people expecting to get 240 fps from 60 and for it to feel 100% perfect.
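The "no new information" point can be sketched with a toy timeline (the numbers are assumptions for illustration, not taken from any particular implementation): with 60-to-120 framegen, a frame appears every ~8.3 ms, but the game only samples input every ~16.7 ms, so half the displayed frames can't reflect anything you just did.

```python
# Toy timeline comparing when frames are displayed vs. when the game
# actually reads input. Timestamps are in microseconds so the set
# membership test stays exact (no float comparison).
base_fps, display_fps = 60, 120
input_samples = {i * 1_000_000 // base_fps for i in range(4)}  # game reads input
displayed = [i * 1_000_000 // display_fps for i in range(8)]   # frames hit screen

# Only the frames that line up with a real input sample carry new info;
# the rest are interpolated in between.
new_info = [t for t in displayed if t in input_samples]
print(f"frames shown: {len(displayed)}, frames carrying fresh input: {len(new_info)}")
```

Every second displayed frame is "fresh"; the others only smooth the motion between inputs the game already processed.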
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

Having it look like a higher frame rate is good, but if you're still at 30 fps it's going to feel horrible no matter how many fake frames you insert, because of the horrible latency.
Originally posted by Zorelnam:
Seriously, i have spent the last couple of hours attempting to understand how if i run a game at 30 fps but then make the app double the frames i get 60 fps? Like i understand that you insert fake frames, but how does it actually make stuff faster??? Like do you know what i mean??? If the game runs at 30 fps you'd expect 30 fps game speed, but you can actually double your performance with this like ????????????????????

As people said, this app doesn't improve performance like magic. It just generates one frame in between to make things look smoother, producing a single 2D image between the real frames. So instead of making the game render the scene in 3D and project it to the screen, the GPU is told to process two images and make something in between (which is much lighter work). The problem is that it's not real data; the game itself still runs at the base fps (the left number if you turn the FPS counter on). In fact, sometimes the base fps drops because the GPU has to process the "fake frame" too. That can lower the base fps from, say, 50-60 to 40-50, which then gets a frame inserted in the middle so it looks like 80-100 fps, but in reality your game is running at 40-50 fps.

It also adds some input lag, since the base FPS gets lower. Sure, it looks nicer, and some games don't mind a bit of input lag (especially at x2 with LSFG 3.0, it's not that bad), but it's not black magic that just doubles your fps.

It works well in games that don't need quick and precise input, but poorly in games that do.
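As a rough illustration of the "fake frame" idea (a toy model only, not Lossless Scaling's actual algorithm, which also does motion estimation): the generated image is derived entirely from two frames the game already rendered, so it can't contain any new game-state or input information.

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames pixel-by-pixel. Real interpolators also warp
    along motion vectors, but the key property is the same: the output
    is derived only from frames the game already rendered."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

real_0 = [0.0, 0.0, 0.0]   # frame N (three grayscale pixels)
real_1 = [1.0, 1.0, 1.0]   # frame N+1
fake = interpolate(real_0, real_1)
print(fake)  # [0.5, 0.5, 0.5] -- plausible-looking, but invented by blending
```

Note the generated frame can only be shown after frame N+1 exists, which is where the extra hold-back latency comes from.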
Originally posted by GrandTickler:
i dunno but my first impression of the tool is very bad. so bad that i deemed it unusable for the game im playing (planet crafter) lots of artifacts on the edges of screen and textures that have a fine grid like pattern like base building structures. so FG was a no go.

You are probably using it wrong.
Originally posted by Gone Guru In The Subaru:
Originally posted by GrandTickler:
i dunno but my first impression of the tool is very bad. so bad that i deemed it unusable for the game im playing (planet crafter) lots of artifacts on the edges of screen and textures that have a fine grid like pattern like base building structures. so FG was a no go.

You are probably using it wrong.
how so?
Originally posted by Polysorbate:
"it doesn't improve performance, it just makes it look identical to if performance had been improved and that's totally different"

honestly really getting tired of these people.

If your framerate is already decent and you use frame gen to push it higher, I agree that it works. But you aren't improving latency, so if your framerate is too low you aren't going to make the input-to-output lag go away; in fact you'll make it slightly worse. That affects different people differently, and I'd argue latency only matters in some games. Still, saying it ALWAYS gives a game "better performance" is wrong.
CrowRising Jan 11 @ 11:31pm 
Personally, I've found the best use case for framegen is games in which the physics are tied to the framerate, for example, games designed to run at 60 fps that experience severe physics-related issues when you go above that. You can achieve the appearance of smooth 120 fps without borking the physics. It might introduce some input delay, but at least on my current specs it seems pretty negligible, and I've not encountered a situation in which it actually matters, even in things like quick time events.

However, another use case I've found is games where physics aren't an issue but I still don't quite have the specs to reach true 120 fps. Introducing a 60 fps frame lock and then going up to 120 through framegen often works just fine, although it depends on how demanding the game is, what settings are being used, etc. LSFG3 has broadened the number of games this works for thanks to its optimized performance compared to LSFG2, so at least there's that.

The final practical use case I've found for it is to boost up the apparent fps of games that don't have an option for higher framerates. A lot of older games especially don't have options to go higher than 60 fps, so I can use framegen to double this up to 120. The original Sonic Generations for example works quite nicely with this. Running at 1440p at 60fps with framegen up to 120 almost feels like a remastered version in and of itself without any need for the SXSG remaster.

But yes, it's true that frame generation doesn't actually function like true framerate. It can sometimes feel like your input latency has improved even when it really hasn't, because of the placebo of seeing smooth video on screen; in reality you aren't gaining any advantage whatsoever. It's just about getting the visual appearance of a higher framerate. How much benefit you get out of it depends on how much you care about the visuals appearing smooth, your hardware power, and how high a framerate your monitor supports. If you want the full benefit of high fps with regard to input latency, you'll have to reach those framerates natively rather than faking it with framegen.

Date Posted: Jan 10 @ 1:03pm
Posts: 21