Lossless Scaling

Dual-GPU Guide/Overview
By IvanVladimir04
https://www.youtube.com/watch?v=gH359ZNxvNk

Or check out the written guide on the LS Discord:
https://discord.com/channels/1042475930217631784/1323391346039455756/1323391346039455756

And the "Secondary GPU Max LSFG Capability Chart": the numbers you see there are the maximum possible total fps with x2 frame generation at 100% resolution scale/flow scale.
https://docs.google.com/spreadsheets/d/17MIWgCOcvIbezflIzTVX0yfMiPA_nQtHroeXB1eXEfI/edit?gid=1980287470#gid=1980287470
Last edited by Gizzmoe; Mar 8 @ 4:35am
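To read the chart: with frame generation, the final framerate is limited both by your base framerate times the LSFG multiplier and by the secondary GPU's maximum total output (the chart's number). A quick sketch of that arithmetic, with purely illustrative numbers rather than figures from the chart:

```python
def achievable_fps(base_fps: float, multiplier: float, lsfg_cap: float) -> float:
    """Final fps with LSFG on a secondary GPU.

    base_fps   : real frames/s the render GPU produces
    multiplier : LSFG mode, e.g. 2.0 for x2
    lsfg_cap   : max total fps the secondary GPU can output
                 (the chart's number, at 100% flow scale)
    """
    return min(base_fps * multiplier, lsfg_cap)

# Illustrative: 60 fps base, x2 mode, secondary GPU capped at 150 fps total
print(achievable_fps(60, 2.0, 150))  # -> 120.0
# If the secondary GPU can only do 90 fps total, it becomes the bottleneck:
print(achievable_fps(60, 2.0, 90))   # -> 90
```

So a chart entry only tells you the ceiling of the secondary card; your base framerate still has to be high enough to feed it.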
Showing 16-30 of 33 comments
Originally posted by 人称Flame:
Originally posted by Moontrap:
Will this work with an external GPU (Nvidia, for example) and the GPU built into the CPU? That way we could use the extra GPU for extra frame generation.

edit: it reminds me (just a bit) of old technology from Lucid Virtu MVP. There was a mode called Virtual V-Sync & HyperFormance; it claimed to reduce input lag and improve frame pacing by combining GPU workloads (though results were mixed). The technology is now abandoned, and some of its solutions live on in Nvidia Optimus (switching GPUs to save power when needed, but not combining the power of two different GPUs).

Yes, but the iGPU has to be powerful enough to run LSFG, something at minimum in the ballpark of an Iris Xe or a Radeon 680M.
OK... I have a question too. I have two cards, a 4090 and a 2080 Ti. I'm still waiting for extra cables and a new PSU so I can try running them, but in the meantime I just want to ask which is better: using the 4090 for frame generation, or the 2080 Ti? What configuration would you use? Thanks for the answer.
Gizzmoe Apr 2 @ 10:38am 
Originally posted by Λ-Amko-Λ:
which is better: using the 4090 for frame generation, or the 2080 Ti

The 2080 Ti for frame generation.
•u• Apr 4 @ 5:33pm 
I have a 7900 XTX. What would be the cheapest good GPU to pair with it to get me a consistent 4K 120 fps? Currently eyeballing the RX 6500 XT.

Rest of my build, just so you know the limits:
1000 W Seasonic fully modular PSU
32 GB of dual-channel DDR5-5200
Ryzen 9 7950X
Last edited by •u•; Apr 4 @ 5:34pm
Hi, I followed that guide, but it somehow doesn't seem to work.

I'm using a GTX 970 in an x16 PCIe 3.0 slot as my main GPU
and an R9 270 2GB as the frame generation GPU in an x4 PCIe 3.0 slot.

Somehow, when I use frame generation I actually lose framerate (when I use the GTX 970 for frame generation).
When I choose the R9 270 for frame generation, I don't lose frames, but the picture still appears laggy and choppy.

Both options make the picture worse than without. (I'm playing Assetto Corsa. I have a 60 fps frame cap with vsync, and I sometimes dip into the low 50s.)

That's why I wanted to use frame gen. I was running x1.5 and x2.0 to see if that would balance out the dips into the 50s and give me a rock-solid 60 fps all the time.

I also tried to put my VGA adapter into the R9 270, but then I somehow can't use my GTX 970 as my main GPU for gaming. I followed the registry guide, but I still can't choose the 970 as my main GPU.
Originally posted by maxofprogress:
Hi, I followed that guide, but it somehow doesn't seem to work.

I'm using a GTX 970 in an x16 PCIe 3.0 slot as my main GPU
and an R9 270 2GB as the frame generation GPU in an x4 PCIe 3.0 slot.

Somehow, when I use frame generation I actually lose framerate (when I use the GTX 970 for frame generation).
When I choose the R9 270 for frame generation, I don't lose frames, but the picture still appears laggy and choppy.

Both options make the picture worse than without. (I'm playing Assetto Corsa. I have a 60 fps frame cap with vsync, and I sometimes dip into the low 50s.)

That's why I wanted to use frame gen. I was running x1.5 and x2.0 to see if that would balance out the dips into the 50s and give me a rock-solid 60 fps all the time.

I also tried to put my VGA adapter into the R9 270, but then I somehow can't use my GTX 970 as my main GPU for gaming. I followed the registry guide, but I still can't choose the 970 as my main GPU.

The R9 270 is too weak.
Jae Li Apr 5 @ 3:06pm 
OK, so if I want to run 165 fps at 1440p ultrawide, does the primary or the secondary GPU need to have 16 GB of VRAM?

Which GPU should have more VRAM, the primary or the secondary? Can I use a 3060 as the primary and an RX 6500 as the secondary for upscaling?
Moontrap Apr 5 @ 11:58pm 
Originally posted by 人称Flame:
Originally posted by maxofprogress:
Hi, I followed that guide, but it somehow doesn't seem to work.

I'm using a GTX 970 in an x16 PCIe 3.0 slot as my main GPU
and an R9 270 2GB as the frame generation GPU in an x4 PCIe 3.0 slot.

Somehow, when I use frame generation I actually lose framerate (when I use the GTX 970 for frame generation).
When I choose the R9 270 for frame generation, I don't lose frames, but the picture still appears laggy and choppy.

Both options make the picture worse than without. (I'm playing Assetto Corsa. I have a 60 fps frame cap with vsync, and I sometimes dip into the low 50s.)

That's why I wanted to use frame gen. I was running x1.5 and x2.0 to see if that would balance out the dips into the 50s and give me a rock-solid 60 fps all the time.

I also tried to put my VGA adapter into the R9 270, but then I somehow can't use my GTX 970 as my main GPU for gaming. I followed the registry guide, but I still can't choose the 970 as my main GPU.

The R9 270 is too weak.

The R9 270 is too weak, but the Radeon 680M built into a CPU is enough? Connecting to my previous question: would it work to pair the 780M from the CPU with an RTX 3090? (Sorry if this is a stupid question, but I'm very interested in this topic.)
Last edited by Moontrap; Apr 5 @ 11:59pm
Are there any problems or hiccups I should be aware of? I'm considering purchasing a 6700 XT to pair with my 9070 XT to hit 160+ fps on a 5k2k monitor. Does this affect my other two monitors?
Originally posted by •u•:
I have a 7900 XTX. What would be the cheapest good GPU to pair with it to get me a consistent 4K 120 fps? Currently eyeballing the RX 6500 XT.

Rest of my build, just so you know the limits:
1000 W Seasonic fully modular PSU
32 GB of dual-channel DDR5-5200
Ryzen 9 7950X

Trying to figure this out. I'm targeting 160 fps at 5k2k. Very scarce data available. Worried about buying a second card and it just not working how I expect it to.
arb0 Apr 7 @ 3:36pm 
Originally posted by Aggressivepillow:
Originally posted by •u•:
I have a 7900 XTX. What would be the cheapest good GPU to pair with it to get me a consistent 4K 120 fps? Currently eyeballing the RX 6500 XT.

Rest of my build, just so you know the limits:
1000 W Seasonic fully modular PSU
32 GB of dual-channel DDR5-5200
Ryzen 9 7950X

Trying to figure this out. I'm targeting 160 fps at 5k2k. Very scarce data available. Worried about buying a second card and it just not working how I expect it to.

I hesitated (like actual angst) for a bit on integrating the second GPU, because it's seemingly easy to overlook important details.

1) Look up your motherboard manual and double-check that the PCIe lanes are fast enough when running dual devices; some motherboards will drop their fastest slot from x16 to x8 if a second device is connected in another slot. The second slot/GPU needs to be rated x4 bare minimum. x2? No dice.

2) Verify your power supply can deliver enough power to run both, with a little overhead.

3) Make sure the space inside your case can accommodate the second GPU.

4) Go over this guide: https://steamcommunity.com/sharedfiles/filedetails/?id=3347817209. I followed it and came out OK.

Those are some of the showstoppers that can be easily avoided.
Last edited by arb0; Apr 7 @ 3:40pm
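For step 2 in the checklist above, a rough power-budget sketch. The wattage figures below are illustrative placeholders, not real measurements; check the actual board power of your cards:

```python
def psu_ok(psu_watts: float, component_watts: list[float], headroom: float = 0.2) -> bool:
    """True if the PSU covers the summed component power plus a safety margin."""
    draw = sum(component_watts)
    return psu_watts >= draw * (1 + headroom)

# Illustrative numbers only: render GPU, frame-gen GPU, CPU, rest of system
parts = [355, 110, 170, 75]  # watts, hypothetical
print(psu_ok(1000, parts))  # 710 W * 1.2 = 852 W, fits a 1000 W unit -> True
print(psu_ok(650, parts))   # 852 W needed > 650 W -> False
```

Transient spikes on modern GPUs can exceed the rated board power, which is why the headroom factor is there; 20% is a common rule of thumb, not a guarantee.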
Originally posted by arb0:
Originally posted by Aggressivepillow:

Trying to figure this out. I'm targeting 160 fps at 5k2k. Very scarce data available. Worried about buying a second card and it just not working how I expect it to.

I hesitated (like actual angst) for a bit on integrating the second GPU, because it's seemingly easy to overlook important details.

1) Look up your motherboard manual and double-check that the PCIe lanes are fast enough when running dual devices; some motherboards will drop their fastest slot from x16 to x8 if a second device is connected in another slot. The second slot/GPU needs to be rated x4 bare minimum. x2? No dice.

2) Verify your power supply can deliver enough power to run both, with a little overhead.

3) Make sure the space inside your case can accommodate the second GPU.

4) Go over this guide: https://steamcommunity.com/sharedfiles/filedetails/?id=3347817209. I followed it and came out OK.

Those are some of the showstoppers that can be easily avoided.

Appreciate it! Ended up going with a 6700 XT for my frame gen card. Fingers crossed.
Mechanical Apr 11 @ 10:17am 
Originally posted by Attis:
I tried it with two 3080s and it works. However, latency increases drastically if GPU 2, the one that calculates the real images, goes above 90% load. That said, it's clear you can gain 10-20 FPS of real frames with it. But high load on the GPU still means high latency; that's not the fault of LSFG, it's a general issue.

I was thinking about this exactly for the last few hours, so I tried simulating the outcomes with my single-GPU setup.

If I turn on DLSS with Frame Gen in MH Wilds, my latency goes through the roof. I can't even see the real number anymore; the fact seems to be that DLSS is not optimal for dual frame gen.

If I have an NVIDIA card using FSR (god forbid), my latency does the exact opposite: it drops like crazy.

It doesn't look as crisp as with DLSS, but the result is still very good, plus a very low latency.

Now combine that low latency and good rasterization with standard frame generation, and you get a stable picture that your GPU produced on low resources, since upscaling reduces the real render resolution.

Now apply the same results to a setup that uses a second GPU on the same driver pipeline, just to interpolate the already interpolated frames in the same flow of information: lossless, latency stays linear, the GPUs can work more efficiently, and both can stay at medium power draw while still delivering an extremely refreshing result, even on some used or older hardware.
Which basically happens here, because Lossless allows it, even though Big N and AMD probably don't want that to be the case in the future.

Edit:
The memory lanes on your mobo, or the mobo in general, are probably the real key component to performance here, since PCIe lanes are shared?
Last edited by Mechanical; Apr 11 @ 10:18am
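On the PCIe-lanes point: you can get a back-of-the-envelope feel for whether a slot can carry the rendered frames to the second GPU. The sketch below assumes uncompressed 8-bit RGBA frames and quotes approximate usable per-lane throughput after encoding overhead; real driver behavior and copy scheduling will differ:

```python
# Approximate usable throughput per PCIe lane in GB/s (after 128b/130b encoding)
PCIE_LANE_GBPS = {3.0: 0.985, 4.0: 1.969}

def link_check(width: int, gen: float, horiz: int, vert: int,
               base_fps: float, bytes_per_px: int = 4):
    """Rough comparison of frame traffic vs. slot bandwidth.

    Returns (traffic GB/s, capacity GB/s, fits?).
    """
    traffic = horiz * vert * bytes_per_px * base_fps / 1e9
    capacity = PCIE_LANE_GBPS[gen] * width
    return traffic, capacity, traffic < capacity

# 4K frames at a 60 fps base rate into a PCIe 3.0 x4 slot
traffic, cap, ok = link_check(4, 3.0, 3840, 2160, 60)
print(f"{traffic:.2f} GB/s of frames vs {cap:.2f} GB/s link -> {'ok' if ok else 'too slow'}")
```

By this estimate, 4K/60 needs roughly 2 GB/s, which fits in a PCIe 3.0 x4 link's ~3.9 GB/s, while an x2 or PCIe 2.0 slot gets tight fast. It's only a lower bound: generated frames also have to reach the display, so actual traffic is higher.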
Mechanical Apr 11 @ 10:28am 
Originally posted by maxofprogress:
Hi, I followed that guide, but it somehow doesn't seem to work.

I'm using a GTX 970 in an x16 PCIe 3.0 slot as my main GPU
and an R9 270 2GB as the frame generation GPU in an x4 PCIe 3.0 slot.

Somehow, when I use frame generation I actually lose framerate (when I use the GTX 970 for frame generation).
When I choose the R9 270 for frame generation, I don't lose frames, but the picture still appears laggy and choppy.

Both options make the picture worse than without. (I'm playing Assetto Corsa. I have a 60 fps frame cap with vsync, and I sometimes dip into the low 50s.)

That's why I wanted to use frame gen. I was running x1.5 and x2.0 to see if that would balance out the dips into the 50s and give me a rock-solid 60 fps all the time.

I also tried to put my VGA adapter into the R9 270, but then I somehow can't use my GTX 970 as my main GPU for gaming. I followed the registry guide, but I still can't choose the 970 as my main GPU.

I also used the same GPUs as you for a year or two, and let me tell you, these cards have not aged well. So maybe don't have too-high hopes for them. Let one card do one job only, especially if they are older.
I've noticed Radeons tend to have higher FP16 performance. I've also noticed I don't need to run the display off the dedicated frame-gen card. I've tested this repeatedly. The best performance actually comes from putting the monitor on the render card and selecting the second, frame-gen card in the Lossless Scaling app. This seems to go against everything I've seen about dual-GPU setups for frame gen. I've tested with anywhere from one GPU in the system up to three. The best performance is by far the render card to the monitor with the frame-gen card selected in the app. If this weren't true, the performance would be identical with only the one card, which it is not. The second card is undoubtedly handling the FP16 workload.
Originally posted by ᑎOIᔕE:
I've noticed Radeons tend to have higher FP16 performance. I've also noticed I don't need to run the display off the dedicated frame-gen card. I've tested this repeatedly. The best performance actually comes from putting the monitor on the render card and selecting the second, frame-gen card in the Lossless Scaling app. This seems to go against everything I've seen about dual-GPU setups for frame gen. I've tested with anywhere from one GPU in the system up to three. The best performance is by far the render card to the monitor with the frame-gen card selected in the app. If this weren't true, the performance would be identical with only the one card, which it is not. The second card is undoubtedly handling the FP16 workload.

It could be your specific GPU combo/setup. For one, I can't even run LSFG on my 2nd GPU when the monitor is connected to the main GPU. For two, I've heard input lag is much worse when you connect the monitor to the main GPU, because that adds an extra trip between the two GPUs (main GPU renders frames -> sends them to 2nd GPU -> 2nd GPU generates frames -> sends them back to main GPU to display), versus when the monitor is connected to the 2nd GPU (main GPU renders frames -> sends them to 2nd GPU -> 2nd GPU generates frames and displays them).
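The two wiring options differ by one inter-GPU copy per displayed frame. A tiny sketch of the hop counting, as a simplified model that ignores copy/compute overlap and driver specifics:

```python
def frame_path(monitor_on_render_gpu: bool) -> list[str]:
    """Per-frame path in a dual-GPU LSFG setup, per the description above."""
    path = [
        "render GPU draws frame",
        "copy frame to frame-gen GPU",   # inter-GPU PCIe copy #1
        "frame-gen GPU interpolates",
    ]
    if monitor_on_render_gpu:
        path.append("copy back to render GPU")  # the extra inter-GPU copy
    path.append("scan out to monitor")
    return path

# Monitor on the render GPU: one extra step (and one extra PCIe copy)
print(len(frame_path(True)))   # -> 5
print(len(frame_path(False)))  # -> 4
```

That extra copy is the usual argument for plugging the monitor into the frame-gen card, though as the posts above show, real results vary by GPU combo.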