Unreal Games using secondary GPU
Hi everyone!

Well, as the title says, I have a setup with two monitors and two GPUs (a 3070 connected to monitor 1 and a 1070 connected to monitor 2). Recently I started to notice that some games were running kind of weird and the DLSS option wasn't available, so I started to freak out thinking my beloved 3070 was dying. Luckily, the problem only happens in certain games, and after digging a little I found that the culprit is Unreal Engine, which for some reason decided to use my 1070 (on the monitor connected to the 3070, WTF!) in all the games that use that engine.

I managed to get this log from the last game:

LogD3D12RHI: Found D3D12 adapter 0: NVIDIA GeForce RTX 3070 (VendorId: 10de, DeviceId: 2484, SubSysId: 37513842, Revision: 00a1)
LogD3D12RHI: Max supported Feature Level 12_2, shader model 6.7, binding tier 3, wave ops supported, atomic64 supported
LogD3D12RHI: Adapter has 8018MB of dedicated video memory, 0MB of dedicated system memory, and 16334MB of shared system memory, 1 output[s]
LogD3D12RHI: Driver Version: 565.90 (internal:32.0.15.6590, unified:565.90)
LogD3D12RHI: Driver Date: 9-26-2024
LogD3D12RHI: Found D3D12 adapter 1: NVIDIA GeForce GTX 1070 (VendorId: 10de, DeviceId: 1b81, SubSysId: 61733842, Revision: 00a1)
LogD3D12RHI: Max supported Feature Level 12_1, shader model 6.7, binding tier 3, wave ops supported, atomic64 supported
LogD3D12RHI: Adapter has 8067MB of dedicated video memory, 0MB of dedicated system memory, and 16334MB of shared system memory, 1 output[s]
LogD3D12RHI: Driver Version: 565.90 (internal:32.0.15.6590, unified:565.90)
LogD3D12RHI: Driver Date: 9-26-2024
LogD3D12RHI: Found D3D12 adapter 2: Microsoft Basic Render Driver (VendorId: 1414, DeviceId: 008c, SubSysId: 0000, Revision: 0000)
LogD3D12RHI: Max supported Feature Level 12_1, shader model 6.2, binding tier 3, wave ops supported, atomic64 unsupported
LogD3D12RHI: Adapter has 0MB of dedicated video memory, 0MB of dedicated system memory, and 16334MB of shared system memory, 0 output[s]
LogD3D12RHI: DirectX Agility SDK runtime found.
LogD3D12RHI: Chosen D3D12 Adapter Id = 1

And as you can see, Unreal Engine decided that the 1070 is the way to go...

I tried several "fixes" (setting the 3070 as the OpenGL GPU in the NVIDIA Control Panel, and in the Windows graphics settings too; nothing worked). The only thing that "works" is disabling the 1070 in Device Manager, but that is nonetheless not optimal. I also tried manually editing Engine.ini/GameUserSettings.ini, and it works... but only in one of the games I tried.

Has anyone encountered this problem and found a way to solve it without disabling the secondary GPU? Any ideas to try out?

Thanks in advance!
Showing 1-10 of 10 comments
Screamin' Doctor 24 OCT 2024 at 8:57
In some games it's possible to force which GPU is used; you just have to manually edit the Engine.ini of each game, located in its related folder:
%localappdata%\**(GameName)**\Saved\Config\Windows
Just add the following lines to Engine.ini:
[/script/engine.renderersettings]
r.GraphicsAdapter=* (where * is your desired GPU adapter ID NUMBER: 0, 1, etc.) Don't forget to make a copy of Engine.ini BEFORE you change anything, in case you make a mistake.
ATM this has only worked for Dark and Darker.
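If you end up doing this for many games, the edit can be scripted. A minimal sketch in Python, assuming the game folder name is known (the path below uses a placeholder, not a real game) and that the ini parses cleanly:

```python
# Sketch: force a D3D12 adapter ID in a UE game's Engine.ini.
# "MyUEGame" below is a PLACEHOLDER folder name, not a real game.
import configparser
import os

def force_graphics_adapter(engine_ini_path, adapter_id):
    """Add or update r.GraphicsAdapter in [/script/engine.renderersettings]."""
    section = "/script/engine.renderersettings"
    cfg = configparser.ConfigParser(strict=False)  # tolerate duplicate keys
    cfg.optionxform = str  # preserve key capitalization
    cfg.read(engine_ini_path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    cfg.set(section, "r.GraphicsAdapter", str(adapter_id))
    with open(engine_ini_path, "w") as f:
        cfg.write(f)

# Usage (back up the file first!):
# path = os.path.expandvars(
#     r"%LOCALAPPDATA%\MyUEGame\Saved\Config\Windows\Engine.ini")
# force_graphics_adapter(path, 0)  # 0 = the RTX 3070 in the log above
```

Note that configparser rewrites the whole file, so any comments in the ini are lost; for a one-off, editing by hand as described above is safer.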

Another thing to try:
Right-click on the game's .exe and select Render OpenGL on > Select target GPU.
This worked for me in Satisfactory (though it makes the game run in DX11...)

Some UE games work fine without any troubleshooting, like Conan Exiles.
Last edited by Screamin' Doctor; 24 OCT 2024 at 12:46
Iceira 24 OCT 2024 at 9:46
Most games use the default primary graphics card, and the same goes for the monitor, so few games actually offer an option to run on an alternative GPU.

Maybe you should check the Windows settings; NVIDIA has the NVIDIA Control Panel.
AMD has a semi-similar way to control which card is the default.


For example, the Bethesda launcher gives you the option not to use the main card.


PS:
Don't forget that PCs were originally built for one monitor and one GPU, and we still suffer from that when we want multiple choices.
Last edited by Iceira; 24 OCT 2024 at 10:58
Screamin' Doctor 24 OCT 2024 at 10:28
Originally posted by Iceira:
Most games use the default primary graphics card, and the same goes for the monitor, so few games actually offer an option to run on an alternative GPU.

Maybe you should check the Windows settings; NVIDIA has the NVIDIA Control Panel.
AMD has a semi-similar way to control which card is the default.


For example, the Bethesda launcher gives you the option not to use the main card.

I already checked and double-checked the default GPU and monitor (the games run on my primary monitor even though they are being rendered with my secondary GPU, which is not connected to that monitor).

Weirdly enough, this only started happening with more recent UE games, so I'm inclined to think there is some new method of detecting which GPU to use in modern UE versions (I suspect the engine just checks which GPU has more VRAM and chooses that as the best option).
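That suspicion can be illustrated with a toy sketch. This is NOT Unreal Engine's actual selection code, just the guessed "most dedicated VRAM wins" heuristic, using the memory figures from the log above:

```python
# Toy model of the SUSPECTED adapter-selection heuristic (not UE source code).
# VRAM and feature-level figures are taken from the LogD3D12RHI output above.
adapters = [
    {"id": 0, "name": "NVIDIA GeForce RTX 3070", "feature_level": (12, 2), "vram_mb": 8018},
    {"id": 1, "name": "NVIDIA GeForce GTX 1070", "feature_level": (12, 1), "vram_mb": 8067},
    {"id": 2, "name": "Microsoft Basic Render Driver", "feature_level": (12, 1), "vram_mb": 0},
]

def pick_adapter(adapters, min_feature_level=(12, 0)):
    """Among adapters meeting the feature level, pick the most dedicated VRAM."""
    capable = [a for a in adapters if a["feature_level"] >= min_feature_level]
    return max(capable, key=lambda a: a["vram_mb"])

chosen = pick_adapter(adapters)
print(chosen["id"], chosen["name"])  # → 1 NVIDIA GeForce GTX 1070
```

Under this heuristic the 1070 edges out the 3070 by about 49 MB of reported dedicated memory, which would match the "Chosen D3D12 Adapter Id = 1" line, even though the 3070 supports a higher feature level.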
Iceira 24 OCT 2024 at 11:00
You can try asking on the game's forum or its support, but at least now you know why I asked what the default Windows settings point at. I can google a Unity game I have and check, but I bet I don't have that issue.

Give me two seconds, let me have dinner first, and I'll try to locate a game from them.

I already tested this two days ago; the game engine isn't the problem (Car Sim 2015).
Last edited by Iceira; 24 OCT 2024 at 11:05
Screamin' Doctor 24 OCT 2024 at 11:07
Originally posted by Iceira:
"a Unity game"

I mean Unreal Engine, sorry about the confusion. Unity games run perfectly (well, so does any other game engine, really).
Iceira 24 OCT 2024 at 11:13
Try asking the game's own support or that game's forum; maybe others there know something.
Iceira 24 OCT 2024 at 11:19
Make sure it's not Nvidia Experience doing an "optimize all" that applies DSR resolution and then picks the other card; maybe it's that. Don't forget that I and others have seen Nvidia apply DSR to Diablo 3, and they did this back when the card could barely handle it.

I just thought of this.
Last edited by Iceira; 24 OCT 2024 at 11:20
76561198407601200 24 OCT 2024 at 11:24
Bit of a stretch, but does your BIOS have an option to select the primary GPU?
Screamin' Doctor 24 OCT 2024 at 12:23
Originally posted by Iceira:
Make sure it's not Nvidia Experience doing an "optimize all" that applies DSR resolution and then picks the other card; maybe it's that. Don't forget that I and others have seen Nvidia apply DSR to Diablo 3, and they did this back when the card could barely handle it.

I just thought of this.

Nvidia Experience optimization is on manual mode.

Originally posted by 76561198407601200:
Bit of a stretch, but does your BIOS have an option to select the primary GPU?

For a moment I thought you were onto something, because I had forgotten to check the BIOS before.
Checking the options, the primary GPU was set to Auto (the iGPU is disabled, if anyone is wondering), so I tried forcing the 3070 as my main GPU from the BIOS. Sadly, UE is obtuse about it and keeps pushing my 1070 to render games... :Sploder100:

Thank you both for the suggestions!
Iceira 24 OCT 2024 at 12:42
Don't forget GPU card support (game devs can't fix what may be a GPU or driver issue).
Last edited by Iceira; 24 OCT 2024 at 12:44

Posted on: 16 OCT 2024 at 9:32
Messages: 10