I tried Baldur’s Gate 3 with this setup, and I can confirm that the game was definitely using the RTX 3060 for rendering, while the GTX 1660 was handling Lossless Scaling. Task Manager clearly showed both GPUs in use as expected.
I had a similar problem to yours, but I was recently able to fix it.
To get straight to the conclusion: I solved the issue by swapping my GPUs - putting the secondary GPU in the PCIe x16 slot 1 and the main GPU in the x4 slot 2 - and outputting the display through the secondary GPU, which restored normal operation.
My PC configuration is: RTX 4080 as the main GPU and RTX 2060 as the secondary GPU.
When I tried using the RTX 2060 for frame generation, the frame rate not only failed to increase but actually decreased significantly at WQHD and 4K resolutions. Looking at GPU utilization, the 4080 was normal, but the 2060 was reaching 100% utilization.
At first, I thought it was due to the RTX 2060's insufficient performance, but I noticed the power consumption was low despite the 100% utilization.
Upon closer examination, I found that VRAM and "3D" were barely being used, while "Bus Interface" and "Copy" were at 100%.
It seems the x4 lanes in slot 2 didn't provide enough bandwidth. Additionally, connecting the display to the secondary GPU reduced the "Copy" utilization.
Of course, I was concerned that putting the main GPU in the x4 slot 2 would reduce the base frame rate, but the decrease was only about 10%.
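If you want to verify which link each card actually negotiated before pulling anything apart, nvidia-smi can report it (NVIDIA cards only). A minimal sketch, assuming nvidia-smi is on your PATH:

import subprocess

# Query the PCIe link each GPU actually negotiated plus current load.
fields = "name,pcie.link.gen.current,pcie.link.width.current,utilization.gpu"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout
for row in out.strip().splitlines():
    name, gen, width, util = [part.strip() for part in row.split(",")]
    print(f"{name}: PCIe Gen{gen} x{width}, GPU load {util}")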
I'm not sure if simply rearranging your GPUs will fix your issue, but I hope this solution works for you as well.
I had been running at 4K 120fps with HDR enabled, but when I turned HDR off, usage and power consumption on both the main and secondary GPUs returned to almost normal, and the frame rate no longer dropped.
Although turning off HDR does reduce bandwidth, I think it is very strange that it would make such a significant difference.
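For what it's worth, the difference may be less strange once you count the bytes. A rough sketch with assumed pixel formats (SDR as 8-bit RGBA at 4 bytes/pixel, HDR as FP16 at 8 bytes/pixel, one copy per frame; the real formats and number of copies can differ):

# Per-second frame-copy traffic for 4K at 120 fps, under the assumptions above.
width, height, fps = 3840, 2160, 120

def copy_rate_gb_s(bytes_per_pixel):
    return width * height * bytes_per_pixel * fps / 1e9

print(f"SDR (4 B/px): {copy_rate_gb_s(4):.1f} GB/s")  # ~4.0 GB/s
print(f"HDR (8 B/px): {copy_rate_gb_s(8):.1f} GB/s")  # ~8.0 GB/s
# Usable bandwidth is roughly 3.9 GB/s on PCIe 3.0 x4 and 7.9 GB/s on 4.0 x4,
# so the HDR stream alone can saturate an x4 link before anything else moves.

If those assumptions are close, HDR roughly doubles the traffic over the bus, which would line up with the frame drops going away when it's disabled.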
It's a bit disappointing, but I'll play games with HDR turned off from now on.
*Make sure that it's also enabled in the BIOS settings.
My secondary GPU is an RX 6400 (PCI-E 3.0 x4).
Display 1 - RTX 3060 Ti
Display 2 - RTX 3060 Ti
Display 1 (again) - RX 6400
Note: Monitors are 1080p, 120Hz
Displays (DP and HDMI) connected to RTX 3060 Ti are on 2880x1620 (via DLDSR)
Display 1 (HDMI) connected to RX 6400 is on 3072x1728 (via VSR)
When I want LS FG, I switch Display 1 over to the RX 6400 connection and run the game on Display 2 through the RTX 3060 Ti.
I render the game at 2880x1620 with DLSS or another upscaler on the RTX 3060 Ti, and run LS FGx3 (with LS1 upscaling, sharpness 1) on the RX 6400 output. That takes 40 FPS up to 120 Hz with FG.
Works fine for me with this setup and these resolution configurations:
- Kingdom Come Deliverance 2 (DLSS)
- Cyberpunk (Path Tracing) (DLSS Perf) 25FPS to FGx2 for 50Hz or even FGx3
- Expedition 33 (DLSS Quality) 20-40FPS to FGx3 60-120Hz
All of the above at game render of 2880x1620
Note: the DLDSR upscale doesn't have any effect on the FG output beyond running the game at that resolution, because the final FG image is output through the AMD GPU with VSR. I mainly use it to subtly improve image quality when I'm using the displays on the RTX GPU, so you can omit that part. I mentioned it for reference in case people ask "does it still work with multiple bells and whistles?"
Note 2: It should run fine on a single monitor too. But again, I mentioned my weird multi-display setup in case someone asks about "other combinations".