Fallout 4

Snoop Nov 11, 2015 @ 10:35pm
Fallout 4 fails to recognize my GPU
I'm on a laptop with an NVIDIA GeForce GT 730M, and Fallout 4 is the only game that will not use my GPU. Instead, it uses my CPU's integrated graphics, the Intel HD Graphics 4000.

I have tried setting the game and the launcher to use my GPU in the NVIDIA Control Panel.
I have tried editing the "sD3DDevice=" line in Fallout4Prefs.ini in the My Games > Fallout 4 directory, replacing "Intel(R) HD Graphics 4000" with "NVIDIA GeForce GT 730M". However, it just reverts upon launch.
I have tried installing previous drivers to see if it was just the recent "Game Ready" driver's fault, but to no avail.

I'm totally at a loss for what to do. I just want to play the game without dropping below 30 fps at the lowest settings, and I want it running on my actual graphics card. Can someone please help me out?
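For anyone trying the same edit, this is roughly what the line looks like in Fallout4Prefs.ini (the section placement and the exact device string are assumptions; search the file for the key and match the name your system reports):

```ini
; Documents\My Games\Fallout4\Fallout4Prefs.ini
[Display]
; The launcher writes the detected adapter name here;
; on this laptop it reverts to the Intel value on each launch.
sD3DDevice="NVIDIA GeForce GT 730M"
```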
Showing 1-15 of 25 comments
Kaldaien Nov 11, 2015 @ 11:06pm 
Grab NVIDIA Inspector and go to town on the shim settings. I have a tool that does this for Tales of Zestiria (sets up Optimus correctly), but I'd have to rewrite some code to make it work here. I'm not ruling it out just yet, but I have bigger fish to fry right now with

http://steamcommunity.com/app/377160/discussions/0/496881136898870103/

Incidentally, part of what that DLL does is skip the integrated card and have the game go straight to the discrete. If you're on Windows 8.1 or newer you may want to give it a try.
Viper Nov 11, 2015 @ 11:07pm 
It only identifies your GPU to preset suggested video settings. However, with a 730M you're way below the minimum recommended GPU.
Last edited by Viper; Nov 11, 2015 @ 11:09pm
Wrayday Nov 11, 2015 @ 11:09pm 
You need to set your NVIDIA 3D application profile for Fallout 4 to "Use the NVIDIA GPU" instead of "Use default".

If a profile isn't already there for the game, you need to click Add Program, then find your Fallout4.exe and create one.
Viper Nov 11, 2015 @ 11:12pm 
However, if you don't want it to modify the .ini file, simply right-click the file, select Properties, and check the Read-only box at the bottom. That will make it so the file cannot be modified.
Last edited by Viper; Nov 11, 2015 @ 11:13pm
Wrayday Nov 11, 2015 @ 11:17pm 
I would try adding the game profile in the NVIDIA settings first, since it's utilizing Optimus.
Snoop Nov 11, 2015 @ 11:42pm 
Originally posted by Wrayday:
You need to set your NVIDIA 3D application profile for Fallout 4 to "Use the NVIDIA GPU" instead of "Use default".

If a profile isn't already there for the game, you need to click Add Program, then find your Fallout4.exe and create one.


Originally posted by Wrayday:
I would try adding the game profile in the NVIDIA settings first, since it's utilizing Optimus.

I've already done this.
robotec2000 Nov 11, 2015 @ 11:56pm 
I run an 860M with 2048 MB of VRAM... that is more than enough to play on high, and I have his same problem, but it won't go to full screen.
Kaldaien Nov 15, 2015 @ 3:06am 
OP: Set iAdapter=1 in Fallout4.ini
Last edited by Kaldaien; Nov 15, 2015 @ 3:07am
Snoop Nov 15, 2015 @ 7:24pm 
Originally posted by Kaldaien:
OP: Set iAdapter=1 in Fallout4.ini
Under Display, right?
scott88008 Nov 15, 2015 @ 7:34pm 
Awaiting new drivers from NVIDIA or an update from Bethesda (the game plays fine on my mid-range gaming desktop). I uninstalled the NVIDIA drivers on my Lenovo Y50-70 in safe mode using a display driver uninstaller, then reinstalled the drivers using GeForce Experience. In the GeForce console I selected the GTX 860M for the 3D and PhysX settings and on the desktop tab. The game now loads with the GTX 860M but still crashes on high settings, and crashes intermittently on low settings.
Kaldaien Nov 15, 2015 @ 7:41pm 
Originally posted by Snoop:
Originally posted by Kaldaien:
OP: Set iAdapter=1 in Fallout4.ini
Under Display, right?

Wherever that is. It's set to iAdapter=0 by default, but on your system thanks to Optimus not working correctly, adapter 0 is your integrated one.

This is generally a cosmetic change. If your driver's not completely screwed up, it uses something called shim rendering and will immediately transition from the integrated to discrete GPU. Setting iAdapter=1 in this case is basically just going to make sure that the name of the GPU is correct in the INI file ;)
Last edited by Kaldaien; Nov 15, 2015 @ 7:42pm
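Putting Kaldaien's suggestion in context, the edit looks something like this. The section placement and the meaning of the indices are assumptions based on this thread (on an Optimus laptop where the driver enumerates the integrated GPU first, index 0 is the Intel chip and index 1 the GeForce):

```ini
; Kaldaien says Fallout4.ini; check Fallout4Prefs.ini as well.
[Display]
; iAdapter picks the display adapter by enumeration index:
;   0 = first adapter (here the integrated Intel HD Graphics 4000)
;   1 = second adapter (the discrete GeForce GT 730M)
iAdapter=1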
Snoop Nov 15, 2015 @ 10:19pm 
Originally posted by Kaldaien:
Originally posted by Snoop:
Under Display, right?

Wherever that is. It's set to iAdapter=0 by default, but on your system thanks to Optimus not working correctly, adapter 0 is your integrated one.

This is generally a cosmetic change. If your driver's not completely screwed up, it uses something called shim rendering and will immediately transition from the integrated to discrete GPU. Setting iAdapter=1 in this case is basically just going to make sure that the name of the GPU is correct in the INI file ;)
It was in Fallout4Prefs.ini.

It doesn't seem to have any effect on performance, though.
Craylord Dec 16, 2015 @ 5:23pm 
Originally posted by Snoop:
I'm on a laptop with an NVIDIA GeForce GT 730M, and Fallout 4 is the only game that will not use my GPU. Instead, it uses my CPU's integrated graphics, the Intel HD Graphics 4000.

I have tried setting the game and the launcher to use my GPU in the NVIDIA Control Panel.
I have tried editing the "sD3DDevice=" line in Fallout4Prefs.ini in the My Games > Fallout 4 directory, replacing "Intel(R) HD Graphics 4000" with "NVIDIA GeForce GT 730M". However, it just reverts upon launch.
I have tried installing previous drivers to see if it was just the recent "Game Ready" driver's fault, but to no avail.

I'm totally at a loss for what to do. I just want to play the game without dropping below 30 fps at the lowest settings, and I want it running on my actual graphics card. Can someone please help me out?
Craylord Dec 16, 2015 @ 5:28pm 
I had a similar problem where it used the integrated graphics instead of my video card. I found that my HDMI cable was plugged into my motherboard instead of my video card's HDMI port, so it wouldn't recognize my video card and reverted to the integrated graphics. Check whether your monitor is connected to the motherboard or to the video card.
Craylord Dec 16, 2015 @ 5:36pm 
I was running a desktop FYI

Date Posted: Nov 11, 2015 @ 10:35pm
Posts: 25