Call of Duty: Advanced Warfare

Pvt. Parts Nov 4, 2014 @ 2:52pm
Advanced Warfare Using Intel Graphics, Not GPU (Desktop, Not Laptop)
I've read a ton of forum threads, for other games and some for AW, on how to make the game run on the Nvidia card rather than the built-in Intel graphics, and I've yet to find a solution. Most of the cases I see are on laptops and come down to power settings/Optimus settings. I have a desktop, though, so that isn't the issue for me. I've tried modifying the config files, setting the game to use the GPU in the Nvidia Control Panel, and so on.

I'm running three monitors (two on my Nvidia card, one on the Intel graphics through the motherboard ports), and when I run the game it automatically defaults to the monitor plugged into the motherboard DVI port. I can make the game windowed and drag it to my main monitor, but if I then try to make it full screen, it pops back over to the third monitor on the Intel graphics. If I unplug that monitor or disable the Intel graphics and try to run the game, I get a DirectX error at startup. I even tried disabling the Intel graphics and installing the game fresh, but it still throws a DirectX error and the config file still shows it set to use the Intel HD graphics.

It's still early for problems to surface with AW since it was only released yesterday/today, but I figure more people will run into this. Hopefully we can find a fix; let me know if anyone has suggestions. Thanks!
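For reference, the entry I'm talking about is the adapter line in the config, which ends up looking something like the line below (the exact device name will vary by chip; this is just an example):

seta r_adapter "Intel(R) HD Graphics"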
Last edited by Pvt. Parts; Nov 4, 2014 @ 3:00pm
Showing 1-4 of 4 comments
CyclopsOdin Nov 5, 2014 @ 10:37am 
Seconded! Very frustrating. Hope there is a fix soon.
Dante Nov 5, 2014 @ 11:02am 
Originally posted by Pvt. Parts:

Have you tried uninstalling the game, THEN disabling the onboard graphics card, THEN reinstalling?

If you do that, the game will detect your Nvidia card and set it as the default device, and it will install the required .NET Framework components for it. The DirectX error you're getting is most likely because those weren't installed for the Nvidia card during the original installation.
HeWhoMustEAT May 4, 2015 @ 3:43pm 
I have the exact same problem. Have you found any solution you can share, or are you still unable to play, like me?
Infinity7 May 4, 2015 @ 9:52pm 
Look in your multiplayer config file and see what it says for

seta r_adapter " "

and make sure it says something like "NVIDIA GeForce GTX 980" or whatever your card is.
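
The file should be config_mp.cfg, under the players2 folder in the game's install directory (that's where it lives for my Steam copy, at least). If you'd rather script the check and the fix than edit it by hand, a rough Python sketch like the one below should do it; it's untested, and the path and card name are just examples you'll need to adjust for your own setup:

import re
from pathlib import Path

# Assumed default Steam location; change this to wherever your copy is installed.
CONFIG_PATH = Path(r"C:\Program Files (x86)\Steam\steamapps\common"
                   r"\Call of Duty Advanced Warfare\players2\config_mp.cfg")
NEW_ADAPTER = "NVIDIA GeForce GTX 980"  # put your own card's name here

text = CONFIG_PATH.read_text(encoding="utf-8", errors="ignore")
match = re.search(r'seta r_adapter "([^"]*)"', text)
if match:
    print("Config currently points at:", match.group(1) or "(blank)")
    # Swap in the Nvidia card name and write the file back.
    text = re.sub(r'seta r_adapter "[^"]*"', f'seta r_adapter "{NEW_ADAPTER}"', text)
    CONFIG_PATH.write_text(text, encoding="utf-8")
    print("Rewrote it to:", NEW_ADAPTER)
else:
    print("No r_adapter line found; check the path and file name.")

All it does is swap whatever is between the quotes on that one line, so back the file up first in case the game rejects the value and you want to roll back.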

Date Posted: Nov 4, 2014 @ 2:52pm
Posts: 4