Tom Clancy's Rainbow Six Siege

[HELP] R6 not using my dedicated AMD Gpu
I literally don't know where else to post about this.
What I have tried so far:

> Set "GPUAdapter 1"; that made it even worse: it didn't even use the Intel GPU, instead it used the "Microsoft Basic Render Driver", which had 0 MB of memory and ran at about 1 FPS in the menu.

> Tried renaming GameSettings.ini to GameSetting.ini; that didn't work, the game just generated another "GameSettings.ini"

I'm updating drivers right now; any help would be appreciated

EDIT: Tried disabling the integrated GPU; it just made the game use the Microsoft Basic Render Driver again.

Also, setting it to launch as admin means it never actually launches
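
For reference, here is roughly what the lines I was editing in GameSettings.ini look like (I'm writing the section and key names from memory, so they may not match your file exactly):

```ini
; GameSettings.ini (in the game's profile folder under Documents)
[DISPLAY_SETTINGS]
GPUAdapter=1   ; 0 = first adapter; setting this to 1 is what gave me the Basic Render Driver
```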
Last edited by i have crippling depression; Feb 20, 2017 @ 4:43pm
Showing 1-15 of 22 comments
YellowFlashKid Feb 20, 2017 @ 4:53pm 
Here is what you do...

1. Right-click on your Desktop.
2. A small menu will open.
3. Find AMD Catalyst/Radeon Center (should be the first one).
4. Go to Switchable Graphics Application Settings.
5. Find your RainbowSixSiege.exe, or just add it.
6. Set the application to High Performance.
7. Save it.
8. You're done!

Hope this helps!

Edit: Go to GameSettings.ini and set GPURender to 0. That way the game will be able to start.
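
Something like this, assuming your file uses the same key name as mine (going from memory, so double-check the exact spelling):

```ini
; in GameSettings.ini
GPURender=0   ; 0 = let the game pick the default adapter, so it can start
```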
Last edited by YellowFlashKid; Feb 20, 2017 @ 4:54pm
Originally posted by MetalHead_1989:
Here is what you do...

1. Right-click on your Desktop.
2. A small menu will open.
3. Find AMD Catalyst/Radeon Center (should be the first one).
4. Go to Switchable Graphics Application Settings.
5. Find your RainbowSixSiege.exe, or just add it.
6. Set the application to High Performance.
7. Save it.
8. You're done!

Hope this helps!

Edit: Go to GameSettings.ini and set GPURender to 0. That way the game will be able to start.
Already added it to High Performance; it still uses the Intel GPU.
What's GPURender 0? Am I supposed to add that in? I don't have it in the .ini as far as I know
qwerty Feb 20, 2017 @ 7:25pm 
You could go into your BIOS and just deactivate your onboard GPU altogether. That should force all applications to use your dedicated GPU
Originally posted by qwerty:
You could go into your BIOS and just deactivate your onboard GPU altogether. That should force all applications to use your dedicated GPU
If I do, I can turn it back on after that if need be, right?
It's just that I get the feeling it would use the Microsoft Render Driver again if I do
YellowFlashKid Feb 21, 2017 @ 2:52am 
Originally posted by I have crippling depression:
Originally posted by qwerty:
You could go into your BIOS and just deactivate your onboard GPU altogether. That should force all applications to use your dedicated GPU
If I do, I can turn it back on after that if need be, right?
It's just that I get the feeling it would use the Microsoft Render Driver again if I do
OK, so when you go into the game and open the graphics menu, does it say Intel Graphics...?


Originally posted by MetalHead_1989:
Originally posted by I have crippling depression:
If I do, I can turn it back on after that if need be, right?
It's just that I get the feeling it would use the Microsoft Render Driver again if I do
OK, so when you go into the game and open the graphics menu, does it say Intel Graphics...?
If I just play normally, then yes. If I turn off the Intel GPU through Device Manager, it uses the Microsoft Render Driver
YellowFlashKid Feb 21, 2017 @ 7:18am 
Originally posted by I have crippling depression:
Originally posted by MetalHead_1989:
OK, so when you go into the game and open the graphics menu, does it say Intel Graphics...?
If I just play normally, then yes. If I turn off the Intel GPU through Device Manager, it uses the Microsoft Render Driver
Don't turn it off. Mine says Intel HD Graphics too, but the game uses my AMD card. Just tune the graphics settings and you should get a stable 50 FPS.

archaic Feb 21, 2017 @ 7:48am 
Why do people use AMD anyways? Like really.. I understand it's economy-friendly but, it's bad. Just get yourself a GTX 1050 or something.
YellowFlashKid Feb 21, 2017 @ 8:00am 
Originally posted by .:RΞDMiST:. #Inconsistent ::
Why do people use AMD anyways? Like really.. I understand it's economy-friendly but, it's bad. Just get yourself a GTX 1050 or something.
Why do people use Nvidia? Same ♥♥♥♥.
archaic Feb 21, 2017 @ 8:12am 
No, not "Same ♥♥♥♥."

Nvidia is much better than AMD.
YellowFlashKid Feb 21, 2017 @ 8:30am 
Originally posted by .:RΞDMiST:. #Inconsistent ::
No, not "Same ♥♥♥♥."

Nvidia is much better than AMD.
It comes down to budget, my friend. I don't think you realize that not everyone in the world can afford a GTX 1070.
archaic Feb 21, 2017 @ 8:31am 
Originally posted by .:RΞDMiST:. #Inconsistent ::
I understand it's economy-friendly

._. broe
Originally posted by .:RΞDMiST:. #Inconsistent ::
Why do people use AMD anyways? Like really.. I understand it's economy-friendly but, it's bad. Just get yourself a GTX 1050 or something.
Since I was going for a "gaming" laptop, the one I picked only had AMD as an option.
Originally posted by MetalHead_1989:
Originally posted by I have crippling depression:
If I just play normally, then yes. If I turn off the Intel GPU through Device Manager, it uses the Microsoft Render Driver
Don't turn it off. Mine says Intel HD Graphics too, but the game uses my AMD card. Just tune the graphics settings and you should get a stable 50 FPS.
I'm pretty sure it's not using it, since I get 20-40 FPS in-game on the lowest settings.
Can anyone suggest anything else?

Date Posted: Feb 20, 2017 @ 4:42pm
Posts: 22