*I tried this on a 5700g/b550 for the 6900xt/5500xt and on a 5800x3d/x570 for the 3070ti/4060.
The RX 590 had pretty high usage based on the usage graph, and it got pretty warm too. It's pretty interesting how the performance can get worse. I have no idea how much data is being moved with FG; maybe the bus or something else is a bottleneck.
I believe that plays a big part, yes. There's no way around shuffling large amounts of data around when doing this. I remember trying the output on either card, but no luck. Though people have had success on desktops.
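For a rough sense of how much data that "shuffling" is, here is a back-of-the-envelope sketch under my own assumptions (uncompressed 8-bit RGBA frames, no driver overhead, none of Lossless Scaling's internal buffering), comparing the raw one-way frame traffic against common PCIe link speeds:

```python
# Back-of-the-envelope estimate of the frame traffic a second "frame-gen"
# GPU has to receive over PCIe. Assumptions, not measurements: uncompressed
# 8-bit RGBA frames (4 bytes/pixel) and no driver/copy overhead.

def frame_traffic_gbs(width: int, height: int, base_fps: int,
                      bytes_per_pixel: int = 4) -> float:
    """GB/s of raw frame data copied from the render GPU per second."""
    return width * height * bytes_per_pixel * base_fps / 1e9

scenarios = {
    "1080p @ 60 base fps": (1920, 1080, 60),
    "1440p @ 60 base fps": (2560, 1440, 60),
    "4K    @ 60 base fps": (3840, 2160, 60),
}

for name, (w, h, fps) in scenarios.items():
    print(f"{name}: ~{frame_traffic_gbs(w, h, fps):.2f} GB/s one way")

# For reference: PCIe 3.0 x4 is roughly 4 GB/s, PCIe 4.0 x4 roughly 8 GB/s,
# and PCIe 3.0 x16 roughly 16 GB/s of theoretical bandwidth.
```

Even 4K only needs a couple of GB/s one way in this naive model, but real traffic is burstier and shares the link with everything else hanging off the chipset, which may be part of why a second slot running at x4 through the chipset behaves worse than the raw numbers suggest.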
I have one PC where it works perfectly.. even though I have many extra GPUs lying around and I always overkill my PSUs. The only rig that works properly is the one with an actual render-rig mobo that's designed for many GPUs running simultaneously, where it's designed to run the slave GPU in proper sync at max bandwidth.. on paper it shouldn't need this, in reality it very much does.
The problem I found, for some damned reason, is conflicting memory interfaces. I believe this is what it's tied to. When I used a 384-bit card with a 192-bit card it was all just goofy; even though the 384-bit card was completely overkill for that 192-bit card, it still didn't work. However, when I used two 256-bit memory interface cards it worked perfectly, as long as the mobo could handle it.
I mean, I'm probably wrong and not understanding something here, but it's just what I noticed in my testing. Also, if your iGPU can handle it on a laptop, it works tremendously well. My RTX 4080 laptop with the UHD 770 handles a fair amount of LSFG. I can use it to comfortably upscale+framegen from 40fps to 120fps @ 1080p upscaled.. not really any more than that without pushing it beyond 85% usage.
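On the memory-interface widths mentioned a couple of posts up: bus width alone doesn't set a card's bandwidth; the usual formula is bus width divided by 8, times the effective data rate. A small sketch using nominal spec values quoted from memory, so treat the exact Gbps figures as approximate rather than authoritative:

```python
# Theoretical VRAM bandwidth from bus width and effective data rate.
# Formula: (bus_width_bits / 8) * data_rate_Gbps = GB/s.
# Per-card data rates are nominal specs from memory -- approximate only.

def vram_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 4090 (384-bit, ~21 Gbps GDDR6X)":    (384, 21.0),
    "RTX 4080 (256-bit, ~22.4 Gbps GDDR6X)":  (256, 22.4),
    "RTX 4070 Ti Super (256-bit, ~21 Gbps)":  (256, 21.0),
    "RX 7700 XT (192-bit, ~18 Gbps GDDR6)":   (192, 18.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: ~{vram_bandwidth_gbs(width, rate):.0f} GB/s")
```

Whatever the pairing issue is, it probably isn't raw VRAM bandwidth: every one of these cards has hundreds of GB/s on its own memory bus, while the copies between cards travel over PCIe, which is an order of magnitude slower than any of them.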
This probably also has something to do with it. If I recall correctly, Nvidia at least doesn't allow its public release desktop drivers to do what its laptop drivers allow, namely choosing which GPU to run a program on at the OS level. If I recall my short research correctly, someone found that the Optimus stuff actually is in the desktop drivers, it's just not exposed. Though Linux users seemingly are able to make use of it.
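On the "choosing which GPU to run a program on at the OS level" point: separately from the driver-level Optimus routing described above, Windows 10/11 exposes a per-app GPU preference under Settings > System > Display > Graphics. A hedged sketch of setting it programmatically, assuming the registry location and value format I remember are correct (verify before relying on it):

```python
# Sketch of the Windows per-app GPU preference (what the Graphics settings
# page writes). Registry path and value format are my understanding of the
# mechanism, not documentation quoted from Microsoft.
import winreg

def set_gpu_preference(exe_path: str, preference: int = 2) -> None:
    """preference: 0 = let Windows decide, 1 = power saving, 2 = high performance."""
    key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, key_path) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

if __name__ == "__main__":
    # Hypothetical path; point it at the game exe or at Lossless Scaling itself.
    set_gpu_preference(r"C:\Games\SomeGame\game.exe", preference=2)
```

This only controls which adapter Windows hands the app; whether the frame-gen output path then behaves across two cards is, as the posts here show, a separate question.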
Naww. It's a creation pro mobo that has full support for 3x GPUs and really high-watt PSUs. It's a 14900KS with 64GB of DDR5 at stock OC'd speeds and a 4090. I just tried to use a second GPU for it, but frame gen wouldn't work right until I slotted two 256-bit GPUs, so I took out the 4090 and slotted my 4080 and 4070 Ti Super and then it worked great. I originally tried the 4090 with a 7700 XT as the slave, but it worked like crap for frame gen.
I know the 7950X/X3D has better thermals and slightly better gaming performance, but the 14900KS (yes, it's patched and not exploding.. runs great and actually has better thermals than my 7950X3D rig, but only because it's got a bigger AIO lol - 360mm) does have a pretty substantial lead in content creation, which is what I mainly do. Gaming is a side thing and never takes precedence in my builds.. My next go-around I will be going entry-level Threadripper, but for now a $2500 workstation CPU isn't quite necessary.. YET.. lol..
If AMD had released a 24-core CPU in the 7000 series I would definitely have gone with red team for content creation.
A 20/24-core 9950X3D with one V-cache CCD and one Zen 5c cluster would probably have sold a lot better than the 9950X3D that's likely coming. It would likely have made all dual-CCD parts more useful, especially the 9900X(3D) parts. Something about AMD never missing an opportunity..
I just wish Intel would improve. I've always been blue team first, but I've been pretty let down with 14th gen tbh. Yeah, it beats the 7000 series in productivity, which is nice for me, but I also game a lot, so I like to have my cake and eat it too. It would have been nice if 14th gen had matched the 7950X3D in gaming performance.. I mean it's close, but 10% is a pretty good push ahead. And the 9000 series release most likely just puts it to shame, and they aren't released that far apart. I doubt 15th gen Intel will be much better than 14th.
For next gen I really hope Intel makes a chip with a bit better thermals/efficiency, and I would really like for them to double down even more on making their chips good for productivity.. This is where they are clearly still winning. The iGPU isn't for light gaming, but it does content creation really well, boosting the speed of apps considerably, and the core count they have really helps in this department.. So I would like to see a middle-of-the-road chip.
I can handle the next high-end i9 or Core X being 10% less effective in gaming vs a dedicated gaming chip like the X3Ds if the productivity is bonkers good. Like, it would be nice to see a 15th gen i9 with 32 cores (8 P-cores and 48/64 threads or something) so I don't have to opt for a $2500 content creation CPU that I will only use for one thing. It would be nice to have everything I need in two PCs, with a third lighter build for multi-tasking, instead of having to have four goddamn builds to make up the difference.
I wouldn't put too much hope in Intel dazzling anyone for the next few years. They plan on/are moving more of their chip production to their own fabs. Which is obviously the smart thing to do in the long run, but for now they apparently have quite some catching up to do to even match TSMC.
That would of course be nice, but it would also be quite a niche part that I doubt could be implemented on a consumer socket. Or recover its R&D costs.
On the topic: on some gaming laptops the iGPU might be too weak to handle FG. If you have an Intel iGPU, then most likely tough luck, gotta stick with the dGPU. If it's an AMD iGPU, though, those are generally faster and may handle the workload better.
- Intel i7-11370H
- RTX 3050 4GB
- 24GB of RAM
Although I wonder if those AMD iGPUs can handle the full quality version of LS frame gen...