Lossless Scaling

Anyone else tried using it with 2 GPUs?
I have a 5700 XT, and since the base FPS hit is substantial when using FG, I decided to try offloading it to an RX 590 I had lying around.

It works, but the FPS hit is actually worse when FG is offloaded to the 590 compared to doing everything on the 5700 XT.

A pretty unexpected result, imo.
Last edited by Kapteeni Moukku; Aug 15, 2024 @ 8:14am
Spook Aug 15, 2024 @ 8:30am 
There are several reports of people getting this to work, though I've experienced the same result you have, on both a 6900 XT/5500 XT combo and a 3070 Ti/4060 combo. The slave cards in both combos I've tested can do FG independently, but no luck in combo. The most interesting thing I noticed when testing was an increase in bus load. I will be trying this again in the future on a PCIe 5.0/6.0 board with dual x8 bifurcated slots. What kind of mobo and CPU are you running?

*I tried this on a 5700G/B550 for the 6900 XT/5500 XT and on a 5800X3D/X570 for the 3070 Ti/4060.
Last edited by Spook; Aug 15, 2024 @ 8:34am
Kapteeni Moukku Aug 15, 2024 @ 9:00am 
Originally posted by Spook:
The most interesting thing I noticed when testing was an increase in bus load. I will be trying this again in the future on a PCIe 5.0/6.0 board with dual x8 bifurcated slots. What kind of mobo and CPU are you running?

*I tried this on a 5700G/B550 for the 6900 XT/5500 XT and on a 5800X3D/X570 for the 3070 Ti/4060.
I have a 5800X3D/X570.
The RX 590 had pretty high usage based on the usage graph, and it got pretty warm too. It's pretty interesting how the performance can get worse. I have no idea how much data is being moved with FG; maybe the bus or something else is the bottleneck.
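A rough back-of-the-envelope number for how much frame data would have to cross the bus, assuming the captured frames travel between the cards as uncompressed 32-bit-per-pixel surfaces (an assumption; Lossless Scaling's actual internal path and formats aren't documented here), sketched in Python:

    # Back-of-the-envelope estimate of the data that has to cross the PCIe bus
    # when frame generation runs on a second GPU. Assumes uncompressed
    # 4-byte-per-pixel surfaces and one copy per frame in each direction;
    # the app's real internal path may differ.
    width, height = 2560, 1440   # example game resolution
    base_fps = 60                # real rendered frames per second
    output_fps = 120             # presented frames after 2x frame generation

    frame_bytes = width * height * 4
    to_fg_gpu = frame_bytes * base_fps         # rendered frames sent to the FG card
    back_to_output = frame_bytes * output_fps  # only if the render card drives the display

    print(f"to FG GPU:  {to_fg_gpu / 1e9:.2f} GB/s")
    print(f"back again: {back_to_output / 1e9:.2f} GB/s (zero if the FG card drives the monitor)")

Even the pessimistic total at 1440p/120 (roughly 2.7 GB/s) is well under what a PCIe 3.0 x8 link can move (about 7.9 GB/s usable), which fits the observation above that bus usage stayed well short of 100%; the latency of the round trip is a separate question.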
xXDeiviDXx Aug 15, 2024 @ 9:09am 
I don't know much about desktop PCs, but it's much easier to set this up on gaming laptops because the monitor is generally connected to the integrated graphics. Maybe it's something about LS only working on the GPU that's outputting the display to the screen, while the other GPU only has to render the game in the first place?
Last edited by xXDeiviDXx; Aug 15, 2024 @ 12:31pm
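For what it's worth, on Windows you can at least see which adapter is actually attached to the desktop and which one is primary. A minimal Python sketch using the Win32 EnumDisplayDevices call (the index here is its own ordering, not Task Manager's GPU 0/1/2 numbering):

    import ctypes
    from ctypes import wintypes

    # List display adapters as Windows sees them and flag which entries are
    # attached to the desktop / primary. Each entry is one output path, so a
    # card with several monitors shows up more than once.
    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    ATTACHED_TO_DESKTOP = 0x1
    PRIMARY_DEVICE = 0x4

    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    i = 0
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        flags = []
        if dev.StateFlags & ATTACHED_TO_DESKTOP:
            flags.append("attached")
        if dev.StateFlags & PRIMARY_DEVICE:
            flags.append("primary")
        print(i, dev.DeviceString, ",".join(flags))
        i += 1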
Spook Aug 15, 2024 @ 9:20am 
Originally posted by Kapteeni Moukku:
I have a 5800X3D/X570.
The RX 590 had pretty high usage based on the usage graph, and it got pretty warm too. It's pretty interesting how the performance can get worse. I have no idea how much data is being moved with FG; maybe the bus or something else is the bottleneck.
Yep, happened in my case too: the slave cards were loaded up. And the bus usage, while higher than typical, was nowhere near 100%.

Originally posted by xXDeiviDXx:
I don't know much about desktop PCs, but it's much easier to set this up on gaming laptops because the monitor is generally connected to the integrated graphics. Maybe it's something about LS only working on the GPU that's outputting the display to the screen, while the other GPU only has to render the game in the first place?
I believe that plays a big part, yes. There's no way around shuffling large amounts of data around when doing this. I remember trying the output on either card, but no luck. Though people have had success on desktops.
Xavvy Aug 15, 2024 @ 12:04pm 
It's honestly not a reliable solution. An eGPU is a basically pointless method, since the two cards' rendering isn't very well synced and the result is tremendous input latency and really f'ing bad frame pacing. Two dGPUs is better, but it really boils down to the quality of the mobo I think (not 100% positive).. and I hate to say it, but most don't pass muster imo.

I have one PC where it works perfectly.. even though I have many extra GPUs lying around and I always overkill my PSUs. The only rig that works properly is the one with an actual render-rig mobo that's designed for many GPUs running simultaneously, where it's built to run the slave GPU in proper sync at max bandwidth.. on paper it shouldn't need this, in reality it very much does.

The problem I found, for some damned reason, is conflicting memory interfaces. I believe this is what it's tied to. When I used a 384-bit card with a 192-bit card, it was all just goofy; even though it was completely overkill for that 192-bit card, it still didn't work. However, when I used two cards with 256-bit memory interfaces, it worked perfectly as long as the mobo could handle it.

I mean, I'm probably wrong and not understanding something here, but it's just what I noticed in my testing. Also, if your iGPU can handle it on a laptop, it works tremendously well. My RTX 4080 laptop with the UHD 770 handles a fair amount of LSFG. I can use it to comfortably upscale + frame-gen from 40 FPS to 120 FPS @ 1080p upscaled.. not really any more than that without pushing it beyond 85% usage.
Last edited by Xavvy; Aug 15, 2024 @ 12:08pm
Spook Aug 15, 2024 @ 12:52pm 
Originally posted by Xavvy:
The only rig that works properly is the one with an actual render-rig mobo that's designed for many GPUs running simultaneously.
This probably helps a lot. Is it running a workstation CPU?

Also, if your iGPU can handle it on a laptop, it works tremendously well. My RTX 4080 laptop with the UHD 770 handles a fair amount of LSFG. I can use it to comfortably upscale + frame-gen from 40 FPS to 120 FPS @ 1080p upscaled.. not really any more than that without pushing it beyond 85% usage.
This probably also has something to do with it. If I recall correctly, at least Nvidia doesn't allow in its public desktop drivers what it does allow in its laptop drivers, namely choosing which GPU to run a program on at the OS level. If I recall my short research correctly, someone found that the Optimus stuff is actually in the desktop drivers, but it's not exposed. Though Linux users seemingly are able to make use of it.
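On the OS-level point: since Windows 10 1803 there is a per-app graphics preference that sits above the vendor drivers; it is the same setting as Settings > System > Display > Graphics and it lives in the registry. A minimal Python sketch, with the Lossless Scaling install path only as an example (adjust it to the real location):

    import winreg

    # Per-app GPU preference, the same value the Settings > Graphics page writes.
    # 0 = let Windows decide, 1 = power-saving GPU, 2 = high-performance GPU.
    # The exe path below is an example; point it at the actual install.
    APP = r"C:\Program Files (x86)\Steam\steamapps\common\Lossless Scaling\LosslessScaling.exe"

    key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                           r"Software\Microsoft\DirectX\UserGpuPreferences")
    winreg.SetValueEx(key, APP, 0, winreg.REG_SZ, "GpuPreference=2;")
    winreg.CloseKey(key)

On a desktop with two discrete cards this only distinguishes the "power saving" from the "high performance" GPU, so it may not let you choose between two similar dGPUs; newer Windows 11 builds also expose a specific-GPU choice on that same Settings page.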
Xavvy Aug 15, 2024 @ 5:01pm 
Originally posted by Spook:
Originally posted by Xavvy:
The only rig that works properly is the one with an actual render-rig mobo that's designed for many GPUs running simultaneously.
This probably helps a lot. Is it running a workstation CPU?

Also, if your iGPU can handle it on a laptop, it works tremendously well. My RTX 4080 laptop with the UHD 770 handles a fair amount of LSFG. I can use it to comfortably upscale + frame-gen from 40 FPS to 120 FPS @ 1080p upscaled.. not really any more than that without pushing it beyond 85% usage.
This probably also has something to do with it. If I recall correctly, at least Nvidia doesn't allow in its public desktop drivers what it does allow in its laptop drivers, namely choosing which GPU to run a program on at the OS level. If I recall my short research correctly, someone found that the Optimus stuff is actually in the desktop drivers, but it's not exposed. Though Linux users seemingly are able to make use of it.

Naww. It's a creation pro mobo that has full support for 3x GPUs and really high-wattage PSUs. It's a 14900KS with 64 GB of DDR5 at its stock OC and a 4090. I just tried to use a second GPU with it, but frame gen wouldn't work right until I slotted two 256-bit GPUs, so I took out the 4090 and slotted my 4080 and 4070 Ti Super, and then it worked great. I originally tried a 4090 with a 7700 XT as the slave, but it worked like crap for frame gen.

I know the 7950X/X3D has better thermals and slightly better gaming performance, but the 14900KS (yes, it's patched and not exploding.. runs great and actually has better thermals than my 7950X3D rig, but only because it's got a bigger AIO lol - 360mm) does have a pretty substantial lead in content creation, which is what I mainly do. Gaming is a side thing and never takes precedence in my builds.. Next go-around I will be going entry-level Threadripper, but for now a $2500 workstation CPU isn't quite necessary.. YET.. lol..

If AMD had released a 24-core CPU in the 7000 series, I would definitely have gone with red team for content creation.
Spook Aug 16, 2024 @ 3:27am 
Originally posted by Xavvy:
I just tried to use a second GPU with it, but frame gen wouldn't work right until I slotted two 256-bit GPUs, so I took out the 4090 and slotted my 4080 and 4070 Ti Super, and then it worked great.
Odd that this apparently matters. ReBar is the only thing I can think of that it might be down to.

If AMD had released a 24-core CPU in the 7000 series, I would definitely have gone with red team for content creation.
A 20/24-core 9950X3D with one V-cache CCD and one Zen 5c cluster would probably have sold a lot better than the 9950X3D that's likely coming. It would likely have made all dual-CCD parts more useful, especially the 9900X(3D) parts. Something about AMD never missing an opportunity..
Xavvy Aug 16, 2024 @ 12:03pm 
Originally posted by Spook:
Originally posted by Xavvy:
I just tried to use a second GPU with it, but frame gen wouldn't work right until I slotted two 256-bit GPUs, so I took out the 4090 and slotted my 4080 and 4070 Ti Super, and then it worked great.
Odd that this apparently matters. ReBar is the only thing I can think of that it might be down to.

If AMD had released a 24-core CPU in the 7000 series, I would definitely have gone with red team for content creation.
A 20/24-core 9950X3D with one V-cache CCD and one Zen 5c cluster would probably have sold a lot better than the 9950X3D that's likely coming. It would likely have made all dual-CCD parts more useful, especially the 9900X(3D) parts. Something about AMD never missing an opportunity..

I just wish Intel would improve. I've always been blue team first, but I've been pretty let down with 14th gen tbh. Yeah, it beats the 7000 series in productivity, which is nice for me, but I also game a lot, so I like to have my cake and eat it too. It would have been nice if 14th gen had matched the 7950X3D in gaming performance.. I mean it's close, but 10% is a pretty good push ahead. And the 9000 series releasing will most likely put it to shame, and they aren't released that far apart. I doubt 15th gen Intel will be much better than 14th.

For next gen I really hope Intel makes a chip with a bit better thermals/efficiency, and I would really like them to double down even more on making their chips good for productivity.. This is where they are clearly still winning. The iGPU isn't for light gaming, but it does content creation really well, boosting app speeds considerably, and the core counts they offer really help in this department.. So I would like to see a middle-of-the-road chip.

I can handle the next high-end i9 or Core X being 10% less effective in gaming vs a dedicated gaming chip like the X3Ds if the productivity is bonkers good. Like, it would be nice to see a 15th gen i9 with 32 cores (8 P-cores and 48/64 threads or something) so I don't have to opt for a $2500 content-creation CPU that I will only use for one thing. It would be nice to have everything I need in two PCs, with a third lighter build for multitasking, instead of having to have four gdamn builds to make up the difference.
Last edited by Xavvy; Aug 16, 2024 @ 12:04pm
Spook Aug 16, 2024 @ 12:22pm 
Originally posted by Xavvy:
Originally posted by Spook:
Odd that [...] an opportunity..

I just wish Intel would improve. I've always been blue team first, but I've been pretty let down with 14th gen tbh. Yeah, it beats the 7000 series in productivity, which is nice for me, but I also game a lot, so I like to have my cake and eat it too. It would have been nice if 14th gen had matched the 7950X3D in gaming performance.. I mean it's close, but 10% is a pretty good push ahead.
I think they were a bit surprised by the rapid iteration that Zen's modular nature allowed for, and by the performance that AMD has been able to get out of, up to then, relatively delicate parts by using intricate boost algorithms. And by X3D as well it seems, though they are now developing their own equivalent with Intel Adamantine, which is/was scheduled to release in one of the next gens of CPUs if I recall correctly.

For next gen I really hope Intel makes a chip with a bit better thermals/efficiency, and I would really like them to double down even more on making their chips good for productivity.. This is where they are clearly still winning.
I wouldn't put too much hope in Intel mesmerising anyone for the next few years. They plan on/are moving more of their chip production to their own fabs, which is obviously the smart thing to do in the long run, but for now they apparently have quite some catching up to do to even match TSMC.

Like, it would be nice to see a 15th gen i9 with 32 cores (8 P-cores and 48/64 threads or something) so I don't have to opt for a $2500 content-creation CPU that I will only use for one thing.
That would of course be nice, but it would also be quite a niche part that I doubt could be implemented on a consumer socket. Or recover its R&D costs.
Last edited by Spook; Aug 16, 2024 @ 12:22pm
󠀡󠀡 Aug 16, 2024 @ 4:27pm 
That's a good point. My R9 7950X3D apparently shows up as GPU 2 in Task Manager; I wonder which GPU the app is using...
Last edited by 󠀡󠀡; Aug 16, 2024 @ 4:28pm
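One way to check which GPU a given app is actually using, assuming the "GPU Engine" performance counters that back Task Manager's per-process GPU column are available: filter them by the app's process ID. The counter instance names embed the adapter LUID, so whichever adapter shows non-zero utilization is the one doing the work. A rough Python sketch (the PID is a placeholder):

    import subprocess

    # Placeholder PID - replace with the PID of LosslessScaling.exe (or the game)
    # from Task Manager's Details tab.
    PID = 12345

    # Instance names look like "pid_12345_luid_0x00000000_0x0000XXXX_..._engtype_3D";
    # the luid part identifies the adapter the process is running on.
    subprocess.run([
        "powershell", "-NoProfile", "-Command",
        f"(Get-Counter '\\GPU Engine(pid_{PID}*)\\Utilization Percentage')"
        ".CounterSamples | Where-Object CookedValue -gt 0 "
        "| Select-Object InstanceName, CookedValue",
    ])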
Tsumugu777 Aug 16, 2024 @ 5:55pm 
Originally posted by xXDeiviDXx:
I don't know much about desktop PCs, but it's much easier to set this up on gaming laptops because the monitor is generally connected to the integrated graphics. Maybe it's something about LS only working on the GPU that's outputting the display to the screen, while the other GPU only has to render the game in the first place?

On the topic: depending on the gaming laptop, the iGPU might be too weak to handle FG. If you have an Intel iGPU, then most likely tough luck, gotta stick with the dGPU. If it's an AMD iGPU though, those are generally faster; they may handle the workload better.
xXDeiviDXx Aug 16, 2024 @ 7:06pm 
Originally posted by Tsumugu777:
Originally posted by xXDeiviDXx:
I don't know much about desktop PCs, but it's much easier to set this up on gaming laptops because the monitor is generally connected to the integrated graphics. Maybe it's something about LS only working on the GPU that's outputting the display to the screen, while the other GPU only has to render the game in the first place?

On the topic: depending on the gaming laptop, the iGPU might be too weak to handle FG. If you have an Intel iGPU, then most likely tough luck, gotta stick with the dGPU. If it's an AMD iGPU though, those are generally faster; they may handle the workload better.
My Intel iGPU can handle frame gen, but only in performance mode. That's useful because it leaves my dGPU free to render whatever game I'm playing, without having to worry about the extra overhead that LS requires.
Spook Aug 17, 2024 @ 12:29am 
Originally posted by xXDeiviDXx:
My Intel iGPU can handle frame gen, but only in performance mode. That's useful because it leaves my dGPU free to render whatever game I'm playing, without having to worry about the extra overhead that LS requires.
What model laptop, CPU and GPU?
xXDeiviDXx Aug 17, 2024 @ 7:49am 
Originally posted by Spook:
Originally posted by xXDeiviDXx:
My Intel iGPU can handle frame gen, but only in performance mode. That's useful because it leaves my dGPU free to render whatever game I'm playing, without having to worry about the extra overhead that LS requires.
What model laptop, CPU and GPU?
Asus TUF Dash F15:
-Intel i7-11370H
-RTX 3050 4 GB
-24 GB of RAM

Although I wonder if those AMD iGPUs can handle the full quality version of LS frame gen...

Date Posted: Aug 15, 2024 @ 8:09am
Posts: 20