Lossless Scaling

FN Jan 12 @ 3:19am
Use APU for Frame Gen?
I have an APU+GPU setup with ryzen 5600g and rtx 3050 8GB,
Currently multi monitor mode is disabled in the bios which disables the apu for the meantime. I wonder if others here have offloaded framegen through their APU's and how is it so far? Thanks
Last edited by FN; Jan 12 @ 3:20am
Showing 16-29 of 29 comments
FN Jan 14 @ 3:32am 
I tested it out. I was able to go from 40 fps to 120 fps (3x) in Skyrim and it was stable, but the issue now is that I am not able to record with OBS using this method:

THIS is how to RECORD Lossless Scaling Frame generation with OBS!!

I am not able to record using my dGPU unless I set OBS to use the APU in Windows' advanced graphics preferences. So the caveat is that I have to make OBS use my APU while it is also handling frame gen. In simple terms:

APU (framegen) + APU (OBS) = can record interpolated frames
APU (framegen) + dGPU (OBS) = can't record (black screen)
dGPU (framegen) + APU (OBS) = can record interpolated frames
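For reference, the Windows advanced graphics preferences step can be scripted: Windows stores per-app GPU choices under the HKCU\Software\Microsoft\DirectX\UserGpuPreferences registry key. A minimal sketch (the OBS install path below is an assumption; adjust for your system):

```python
# Sketch of scripting the per-app GPU preference that the Windows
# "Graphics settings" page writes. Key/value format follows the documented
# UserGpuPreferences scheme; the OBS install path is an assumed example.
import sys

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
OBS_EXE = r"C:\Program Files\obs-studio\bin\64bit\obs64.exe"  # assumed path

def gpu_preference_value(preference: int) -> str:
    """1 = power saving (usually the iGPU/APU), 2 = high performance (dGPU)."""
    return f"GpuPreference={preference};"

def set_gpu_preference(exe_path: str, preference: int) -> None:
    """Write the per-app preference, named by the app's full exe path."""
    import winreg  # Windows-only stdlib module
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          gpu_preference_value(preference))

if sys.platform == "win32":
    set_gpu_preference(OBS_EXE, 1)  # pin OBS to the APU so it can capture LSFG output
```

Restart OBS afterwards so the preference takes effect; it's the same setting you'd pick by hand under Settings > System > Display > Graphics.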

Guess I'll have to stick to using my dGPU for now, hoping there is a workaround for this in the near future.
Last edited by FN; Jan 14 @ 4:04am
Spook Jan 14 @ 3:38am 
Originally posted by FN:
I tested it out. I was able to go from 40 fps to 120 fps (3x) in Skyrim and it was stable, but [...]
Awesome! What resolution are you feeding into LSFG, and what resolution/framerate (120?) are you outputting to your monitor? Also, how are you connected to your monitor: via dGPU or iGPU?

No suggestions on OBS, sorry.
Last edited by Spook; Jan 14 @ 3:39am
FN Jan 14 @ 3:41am 
Originally posted by Spook:
Originally posted by FN:
I tested it out. I was able to go from 40 fps to 120 fps (3x) in Skyrim and it was stable, but [...]
Awesome! What resolution are you feeding into LSFG, and what resolution/framerate (120?) are you outputting to your monitor? Also, how are you connected to your monitor: via dGPU or iGPU?

No suggestions on OBS, sorry.
1080p output. I capped the game at 40 fps, and all of this was output to my monitor connected to the dGPU via HDMI. Single-monitor setup.
Spook Jan 14 @ 3:48am 
Originally posted by FN:
1080p output. I capped the game at 40 fps, and all of this was output to my monitor connected to the dGPU via HDMI. Single-monitor setup.
Interesting. How is the latency compared to running the game + LS on the dGPU, since you're now shuffling data to and from the iGPU?

Would you mind testing with your monitor connected to your MOBO? This would change a variable for the OBS testing as well.
Last edited by Spook; Jan 14 @ 3:50am
FN Jan 14 @ 3:52am 
Originally posted by Spook:
Originally posted by FN:
1080p output. I capped the game at 40 fps, and all of this was output to my monitor connected to the dGPU via HDMI. Single-monitor setup.
Interesting. How is the latency compared to doing it all on the dGPU, since you're now shuffling data to and from the iGPU?

Would you mind testing with your monitor connected to your MOBO? This would change a variable for the OBS testing as well.
Latency-wise, it felt the same as using the dGPU for frame gen. Sorry, I didn't gather detailed data on this; the OSD statistics in Skyrim conflict with the ENB binaries.

Regarding connecting to the MOBO, wouldn't this prevent me from using my dedicated GPU for games/apps?

Based on Google/Reddit, plugging into the MOBO will not utilize the dGPU.
Last edited by FN; Jan 14 @ 3:57am
Spook Jan 14 @ 3:56am 
Originally posted by FN:
Latency-wise, it felt the same as using the dGPU for frame gen. Sorry, I didn't gather detailed data on this [...]
Interesting, and no problem.

Regarding connecting to the MOBO, wouldn't this prevent me from using my dedicated GPU for games/apps?
Very unlikely; your dGPU's frames would just be routed through your iGPU, which in the past could introduce latency and a possible throughput bottleneck.

But with your dGPU>iGPU>dGPU setup working, I feel quite confident.
FN Jan 14 @ 4:01am 
Originally posted by Spook:
Originally posted by FN:
Latency-wise, it felt the same as using the dGPU for frame gen. Sorry, I didn't gather detailed data on this [...]
Interesting, and no problem.

Regarding connecting to the MOBO, wouldn't this prevent me from using my dedicated GPU for games/apps?
Very unlikely; your dGPU's frames would just be routed through your iGPU, which in the past could introduce latency and a possible throughput bottleneck.

But with your dGPU>iGPU>dGPU setup working, I feel quite confident.
Yeah, I think this would also require special methods or applications to route the dGPU output to the APU, which will likely increase latency and power draw.
Spook Jan 14 @ 4:09am 
Originally posted by FN:
Yeah, I think this would also require special methods or applications to route the dGPU output to the APU, which will likely increase latency and power draw.
In theory it should reduce the load on your PC (PCIe subsystem) by only having to move each frame once (dGPU>iGPU) instead of twice. The setup you had working before should have had a worse performance impact.
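Rough back-of-envelope arithmetic for that point, assuming uncompressed 4-byte RGBA frames at FN's 1080p/120 output (real drivers may transfer frames differently):

```python
# Back-of-envelope sketch of one-copy vs two-copy frame traffic over PCIe,
# assuming uncompressed 4-byte RGBA frames (an assumption; drivers may differ).
def frame_traffic_gb_s(width: int, height: int, fps: int, copies: int,
                       bytes_per_pixel: int = 4) -> float:
    """Approximate PCIe traffic in GB/s for moving finished frames."""
    return width * height * bytes_per_pixel * fps * copies / 1e9

# FN's scenario: 1080p at 120 fps output.
one_copy = frame_traffic_gb_s(1920, 1080, 120, copies=1)    # dGPU > iGPU
two_copies = frame_traffic_gb_s(1920, 1080, 120, copies=2)  # dGPU > iGPU > dGPU
print(f"{one_copy:.2f} GB/s vs {two_copies:.2f} GB/s")  # prints "1.00 GB/s vs 1.99 GB/s"
```

Either way this is a small fraction of even PCIe 3.0 x4 bandwidth at 1080p, which fits the observation that latency felt about the same.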
FN Jan 14 @ 4:38am 
I hope there will be a reliable way of recording interpolated frames using OBS.
Last edited by FN; Jan 14 @ 4:38am
FN Jan 14 @ 8:51pm 
Originally posted by Spook:
Originally posted by FN:
1080p output. I capped the game at 40 fps, and all of this was output to my monitor connected to the dGPU via HDMI. Single-monitor setup.
Interesting. How is the latency compared to running the game + LS on the dGPU, since you're now shuffling data to and from the iGPU?

Would you mind testing with your monitor connected to your MOBO? This would change a variable for the OBS testing as well.
I just tested this. I connected my HDMI to the MOBO and it worked fine (dGPU>APU); it was stable and smooth compared to (dGPU>APU>dGPU).

The previous issue that I had can also be solved by ticking a box in OBS to record with SLI/Crossfire mode, but it's slow and causes fps drops while recording.
Last edited by FN; Jan 14 @ 9:18pm
Spook Jan 15 @ 1:04am 
Originally posted by FN:
Originally posted by Spook:
Interesting, how is [...] testing as well.
I just tested this. I connected my HDMI to the MOBO and it worked fine (dGPU>APU); it was stable and smooth compared to (dGPU>APU>dGPU).

The previous issue that I had can also be solved by ticking a box in OBS to record with SLI/Crossfire mode, but it's slow and causes fps drops while recording.
Would you say HDMI>MOBO was more stable and smooth than HDMI>dGPU? Or no difference?

Thank you for reporting your findings btw.
FN Jan 15 @ 2:50am 
Originally posted by Spook:
Originally posted by FN:
I just tested this. I connected my HDMI to the MOBO and it worked fine (dGPU>APU); it was stable and smooth compared to (dGPU>APU>dGPU).

The previous issue that I had can also be solved by ticking a box in OBS to record with SLI/Crossfire mode, but it's slow and causes fps drops while recording.
Would you say HDMI>MOBO was more stable and smooth than HDMI>dGPU? Or no difference?

Thank you for reporting your findings btw.
Yes; when recording, my fps stutters a lot when my primary display is on the dGPU. It could also be down to OBS's implementation of SLI/Crossfire mode, but I will test recording again with the dGPU driving my display.
With some testing I found that LS3 frame gen can give some good results with an iGPU+dGPU setup, a Ryzen 5700G and RTX 4060 in my case.

My PC has 2 displays: a 4K 120Hz TV and a 1080p 120Hz monitor. First, the output display NEEDS to be connected through the iGPU; if not, the frames will be really weird and LS says it outputs "60/40" (I cap the game at 60 fps max in the Nvidia drivers so it outputs 120 with frame gen).

The really unfortunate thing is that the iGPU is too weak to generate 4K 120Hz using LS3, even with the low-setting slider in the LS app. The iGPU sits at around 20-26% at idle when displaying 4K 120Hz and goes to almost 100% when using LS3 frame gen, so it has problems pushing all the frames out.

NOT ALL IS BAD: that was at 4K 120Hz, but on my other 1080p monitor (and I believe even a 1440p monitor would benefit), the iGPU's frame gen work is nowhere near as heavy as when outputting 4K 120Hz; the cost is really low and it works perfectly.

For GPUs weaker than the RTX 4060 that is amazing, but the 4060 already reaches 1080p 120Hz easily in most games. For me, I really wish I could do 4K 120Hz in games at high settings by combining DLSS Performance (native Nvidia frame gen is still not as widely available as DLSS 2) + LS3 frame gen, but unfortunately LS3 2x frame gen at 4K 120Hz is too much for a stock Ryzen 5700G iGPU.
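Rough arithmetic for why the 1080p output is so much cheaper for the iGPU than the 4K one, assuming the interpolation cost scales roughly with output pixel rate:

```python
# Frame gen work scales roughly with output pixel rate; compare the two displays.
def pixel_rate(width: int, height: int, fps: int) -> int:
    """Pixels per second the iGPU must produce at a given output mode."""
    return width * height * fps

uhd = pixel_rate(3840, 2160, 120)  # 4K 120Hz TV
fhd = pixel_rate(1920, 1080, 120)  # 1080p 120Hz monitor
print(uhd / fhd)  # prints 4.0 -> the iGPU does ~4x less work at 1080p
```

So an iGPU pinned near 100% at 4K 120Hz would sit around a quarter of that load at 1080p 120Hz, which matches the "really low cost" observed on the 1080p monitor.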
Last edited by XPvsmanX; Feb 6 @ 5:43am
Originally posted by Rogal-117:
Really not worth it unless you're already rocking a dual-GPU setup to begin with (e.g. for rendering).

Tested it myself, and I kinda got it working, but I noticed that you really need at least a decent entry/mid-range GPU to get decent enough performance out of it. Fully saturated PCIe lanes also have some effect on performance on both GPUs.

Test system I tried it on:
CPU - Ryzen 9 5900X (has 24 PCIe lanes)
Main GPU - RTX 3060, RX 6600 & RX 6800
Sub GPU (used only for Lossless Scaling) - GTX 950, GTX 1060 3GB, RX 580 8GB, Quadro P400
RAM - 32GB

Combinations I tried:
* RTX 3060 + GTX 950 - Works, but has terrible LSFG performance; will sometimes throw a DX12 error due to the GTX 950 not supporting DX12.
* RTX 3060 + GTX 1060 3GB - Works way better than the GTX 950; no DX12 errors, but still has some random performance drops here and there in more demanding games/settings compared to just using LSFG on the main GPU.
* RTX 3060 + RX 580 - Works; only a slightly better result than the GTX 1060 3GB combo.

* RX 6600 + GTX 950 - Same result as the RTX 3060 + GTX 950 test.
* RX 6600 + GTX 1060 3GB - Same result as the RTX 3060 + GTX 1060 3GB test.
* RX 6600 + RX 580 - Same result as the RTX 3060 + RX 580 test.

* RX 6800 + RX 6600 - Serviceable, but it's just too much of a waste to use an RX 6600 as a sub GPU only to render fake frames in Lossless.
* RX 6800 + RX 580 - Works the same as with the 1060, but is most definitely not worth the ridiculous extra heat + overall power draw (the RX 6800 is 200-250 watts, plus the RX 580 8GB which is 195 watts).
* RX 6800 + GTX 1060 3GB - Same result as the RTX 3060 + GTX 1060 3GB combo in terms of Lossless performance.
* RX 6800 + GTX 950 - Same result as the RTX 3060 + GTX 950 test.

Finally, I tested a Quadro P400 just for the hell of it, to see if it was even possible:
* RX 6800 + Quadro P400 - Lossless runs like ♥♥♥♥; only useful for running apps that require CUDA.
* RTX 3060 + Quadro P400 - Also runs like ♥♥♥♥; better to just use the main GPU for Lossless.

Overall, I find that it just was not worth the extra heat and total system power draw of putting a second dGPU in your system dedicated just to Lossless Scaling.

Which resolution and monitor frequency did you use?

Date Posted: Jan 12 @ 3:19am
Posts: 29