Lossless Scaling

For those who use a dual setup, please help me :(
I already posted this on Reddit, but I'm posting here too in hopes of more help :(

This post is for users familiar with dual GPU setups, where LSFG runs on a separate GPU from the main render GPU. This configuration should maintain your baseline FPS when LSFG is enabled, letting LSFG multiply that number instead of the performance hit you would take on a single GPU. For example, at 4K resolution you should hold 60 FPS after enabling LSFG and get 2x/3x/4x multiplication, versus dropping to 45 FPS on a single GPU setup.
My current setup includes:
4070 Ti paired with Intel Arc A750
AMD 7800X3D CPU
ASUS ROG Strix B650E-F motherboard (runs PCIe 4.0 x16 and 4.0 x4 when both GPU slots are populated; rough bandwidth check below)
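
As a sanity check on that x4 slot, here is a rough back-of-the-envelope sketch of the PCIe traffic LSFG's capture generates. The numbers are my own assumptions (4K BGRA capture at 4 bytes per pixel, roughly 1.97 GB/s usable per PCIe 4.0 lane), not anything measured on this system:

# Back-of-the-envelope: can a PCIe 4.0 x4 slot carry the LSFG frame traffic?
# Assumptions: 4K BGRA8 capture (4 bytes/pixel), ~1.97 GB/s usable per
# PCIe 4.0 lane after encoding overhead. Illustrative only.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4
LANE_GBPS = 1.97   # usable GB/s per PCIe 4.0 lane
LANES = 4          # secondary slot on this board

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6   # ~33.2 MB per frame
slot_gbps = LANE_GBPS * LANES                       # ~7.9 GB/s

for base_fps in (60, 100, 140):
    traffic = frame_mb * base_fps / 1e3             # GB/s, one direction
    print(f"{base_fps:>3} fps capture -> {traffic:4.1f} GB/s "
          f"({traffic / slot_gbps:4.0%} of a x4 slot, one way)")

One-way capture traffic fits at 140 FPS, but if frames also have to travel back to the render GPU for display, the traffic roughly doubles and the slot can saturate.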

I followed Ravenger's Discord guide and this Steam guide (which are essentially the same): https://steamcommunity.com/sharedfiles/filedetails/?id=3347817209

I've configured everything according to the guides: selected my render GPU in the Windows 11 graphics settings and in the NVIDIA OpenGL setting, connected my display to the LSFG GPU, and selected the LSFG GPU under "Preferred GPU" in the LSFG settings. However, I'm still experiencing issues.
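
(Side note: before trusting the per-app GPU settings, I also verified that Windows enumerates both cards at all. A minimal sketch of that check, assuming a stock Windows 11 PowerShell:)

# Quick check that Windows actually sees both adapters before fiddling
# with per-app GPU preferences.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object Name, DriverVersion | Format-Table -AutoSize"],
    capture_output=True, text=True,
)
print(result.stdout)  # should list both the 4070 Ti and the Arc A750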

Currently, when I have 140 FPS and enable LSFG, it drops to 80/80.

Another weird thing is before enabling LSFG, my render GPU usage is around 99% with LSFG GPU at 80%. After enabling LSFG, the render GPU drops to 48% while the LSFG GPU increases to 99%. This seems incorrect, as the render GPU should maintain 99% usage since it's still rendering the game.
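
(For anyone who wants to reproduce the usage numbers above: this is roughly how I log the NVIDIA side over time while toggling LSFG. A minimal sketch, assuming nvidia-smi is on PATH; it only reports NVIDIA cards, so the A750 has to be watched in Task Manager or Intel's tools. The output file name is just a placeholder.)

# Poll the NVIDIA render GPU's utilization once a second with nvidia-smi
# and append it to a CSV, so the drop when LSFG is toggled shows up in
# the log. Stop with Ctrl+C.
import subprocess, time

LOG = "render_gpu_util.csv"  # placeholder output file name

with open(LOG, "a") as f:
    while True:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=timestamp,utilization.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        f.write(out + "\n")
        f.flush()
        time.sleep(1)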

I've tried several troubleshooting steps:
1. Disabled all VRR
2. Tested different configurations (e.g., connecting display to render GPU and running LSFG on A750)
3. Updated both NVIDIA and Intel drivers (it's a fresh install anyway)
4. Performed a fresh Windows 11 installation
5. Monitored temperatures (both GPUs remain below 75°C)

One potential issue might be power delivery: the A750 requires two 8-pin GPU power connectors, but I'm currently using a single cable that pigtails into dual 6+2 connectors. Would using two separate 8-pin cables help?
Another potential issue is that I may need to install drivers from my motherboard vendor: since it's a fresh install, I never installed any chipset drivers or the like. I highly doubt that's the cause, though, as chipset drivers are more CPU-related, and my CPU runs at normal temperatures and scores as expected in Cinebench R23.

My last resort is testing with a different GPU (possibly a 4070 from my family's PC) to determine if the A750 is bottlenecking the setup.

I'm looking for suggestions, particularly from users who've encountered similar issues with dual setups or those familiar with this configuration. (I understand this is a relatively uncommon setup.)

Pic of my settings: https://imgur.com/a/wu7UQeD
The author of this thread has indicated that this post answers the original topic.
Gustav Jan 9 @ 11:51am 
Run Lossless Scaling's renderer/interpolation on the same GPU as the program you're using; using several GPUs is going to create latency and stuttering. The reason is that every frame rendered by the GPU running the game must be sent to the other GPU for scaling/interpolation, then sent back again so the GPU rendering the game can send the frames to the monitor. It adds latency and unnecessary workload. Alternatively, you could render the frames from Lossless Scaling on the same GPU the monitor is connected to, but that will still create latency and most likely stuttering.
Gizzmoe Jan 9 @ 11:53am 
Originally posted by HabibiFresh:
Currently, when I have 140 FPS and enable LSFG, it drops to 80/80.

Make sure you haven't set an 80 fps cap on LS somewhere.

Another weird thing is before enabling LSFG, my render GPU usage is around 99% with LSFG GPU at 80%. After enabling LSFG, the render GPU drops to 48% while the LSFG GPU increases to 99%. This seems incorrect, as the render GPU should maintain 99% usage since it's still rendering the game.

The render GPU is bottlenecked by the LS GPU, which is running at 99%.
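
To put rough numbers on that: the whole chain runs at the speed of its slowest stage. A toy model, with illustrative numbers rather than anything measured from your setup:

# Toy model of the dual-GPU chain: the pipeline settles at the rate of
# its slowest stage, so a saturated LS GPU drags the base framerate down.

def lsfg_rates(render_fps_max, ls_output_fps_max, multiplier):
    """Return (base_fps, output_fps) once the pipeline settles.

    ls_output_fps_max: how many output frames per second the second GPU
    can produce in total (capture + interpolation + scanout).
    """
    # The LS GPU emits `multiplier` output frames per captured frame,
    # so it can only consume base frames at this rate:
    ls_base_capacity = ls_output_fps_max / multiplier
    base = min(render_fps_max, ls_base_capacity)
    return base, base * multiplier

# If the game can do 140 fps but the second GPU tops out around 160
# output frames/s at 4K in X2 mode, the base gets pulled down to 80:
print(lsfg_rates(140, 160, 2))  # -> (80.0, 160.0)

That said, even a saturated LS GPU should still multiply whatever base it manages, so an 80/80 reading suggests something beyond raw throughput was going wrong too.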
Originally posted by Gustav:
Run Lossless Scaling's renderer/interpolation on the same GPU as the program you're using; using several GPUs is going to create latency and stuttering. [...]

Yeah, I just swapped in an RTX 4070 to run LSFG instead of the A750 and finally got it working at 70/140, but it felt really bad latency-wise, and even more so the stuttering you mentioned; frame pacing felt horrible compared to running everything on one GPU.
Originally posted by Gizzmoe:
Make sure you haven't set an 80 fps cap on LS somewhere. [...] The render GPU is bottlenecked by the LS GPU, which is running at 99%.

Yeah, no LSFG cap, I promise!

I just swapped the A750 for an RTX 4070, and now LSFG "works", maybe? My baseline FPS was 120 with the 4070 Ti rendering and outputting to the 4070, and after running LSFG on the 4070 it became 70/140. So at least it works now instead of 80/80, but the experience felt much worse than a single card, to be honest: latency is bad, with frame pacing issues and stutters out the wazoo. I'll just go back to a single card. It was a fun try, but pointless.
Works for me using a 7900 XTX and a 3070 Ti, and my latency isn't bad at all either. I use 4x frame gen in BO6 and I'm commonly a top-3 player on my team. Maybe try out the other capture APIs, and also make sure you render your game on your secondary monitor: the game will display on both monitors, but with the Lossless Scaling output on your main monitor. Before I got used to it, I'd just shut off my second monitor while using Lossless. I'm thinking of buying a small 1440p screen to render my games to instead, so it's not such a distraction. Probably repeating what others have said, but I hope it helps.
Last edited by AmericanMadeMan; Jan 10 @ 1:21am
Luca9519 Jan 10 @ 4:02am 
Can I use my GTX 970 with my 3080 Ti?
Originally posted by Gustav:
Run Lossless Scaling's renderer/interpolation on the same GPU as the program you're using; using several GPUs is going to create latency and stuttering. [...]
Dude, did you even read the guide he linked? You have no idea what you're talking about.
Originally posted by HabibiFresh:
This post is for users familiar with dual GPU setups, where LSFG runs on a separate GPU from the main render GPU. [...] My last resort is testing with a different GPU (possibly a 4070 from my family's PC) to determine if the A750 is bottlenecking the setup.
This seems to be an issue with your A750's drivers. Did you try running the A750 in the main slot? That's how Ravenger got his B570 to work.

Date Posted: Jan 9 @ 7:35am
Posts: 8