Slime Rancher

Ashes Mar 27, 2018 @ 1:03pm
Slime Rancher using wrong GPU.
So, this is probably an uncommon problem and I don't believe it has a user-end fix. It occurs specifically in Unity games on older Nvidia Optimus systems.

The problem is that regardless of the driver version, Nvidia Control Panel settings, and Intel settings, the game just refuses to run on the proper GPU. No matter how many options I tweak or how many different approaches I try, it's useless.

Some Unity games don't have that problem (e.g. Rimworld). I assume that's because they use an older version of Unity.

So, to both the community and the devs: if you can think of any unorthodox fixes, please comment; so far I have not been able to solve this. I've tried everything from drivers, to custom launch options, to tweaking the Nvidia and Intel control panels.

Dedicated GPU is an Nvidia 635M 2GB
Integrated GPU is an Intel HD 4000
CPU is an Intel Core i7-3517U

My laptop is an Asus K56CM
Showing 1-7 of 7 comments
FDru Mar 27, 2018 @ 6:58pm 
I have a similar setup to yours, and setting the global Nvidia setting to always use the high-performance GPU works for every program. Are you certain it is using the Intel GPU (e.g. are you using the Nvidia GPU activity monitor)? If so, you could try disabling the Intel driver from Device Manager (just right-click, Disable device), which would make it literally impossible for any program to use it.
Last edited by FDru; Mar 27, 2018 @ 6:58pm
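(For reference, the global setting described above usually lives under Nvidia Control Panel > Manage 3D settings > Global Settings > Preferred graphics processor > "High-performance NVIDIA processor". On Optimus laptops it can also be worth adding a per-game entry under Program Settings pointing at the game's executable; the exact executable name, presumably SlimeRancher.exe, is only an assumption here.)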
Ashes Mar 28, 2018 @ 11:58am 
Originally posted by FDru:
I have a similar setup to yours, and setting the global Nvidia setting to always use the high-performance GPU works for every program. Are you certain it is using the Intel GPU (e.g. are you using the Nvidia GPU activity monitor)? If so, you could try disabling the Intel driver from Device Manager (just right-click, Disable device), which would make it literally impossible for any program to use it.

Right, I should've mentioned this: the Intel GPU is the one connected to the laptop screen and the external monitor port, so I can't disable it.
RadSpaghetti Mar 28, 2018 @ 5:07pm 
Have you tried setting the Nvidia GPU as the default display adapter in the BIOS?
Ashes Mar 28, 2018 @ 5:31pm 
Originally posted by RadSpaghetti:
Have you tried setting the Nvidia GPU as the default display adapter in the BIOS?

I don't have any GPU options in the BIOS; I tried updating it and still have no GPU options.
RadSpaghetti Mar 28, 2018 @ 5:38pm 
Originally posted by Ashes:
I don't have any GPU options in the BIOS; I tried updating it and still have no GPU options.

That's a bit weird... is there anything listed to disable the onboard graphics? Sometimes they name these things strangely, not sure why. At any rate, that's my best guess, but I'll keep an eye on this thread in case I can find any other solutions.
Ashes Mar 28, 2018 @ 5:43pm 
Originally posted by RadSpaghetti:
Originally posted by Ashes:
I don't have any GPU options in the BIOS; I tried updating it and still have no GPU options.

That's a bit weird... is there anything listed to disable the onboard graphics? Sometimes they name these things strangely, not sure why. At any rate, that's my best guess, but I'll keep an eye on this thread in case I can find any other solutions.

Thank you. As for why there aren't any options: the Intel GPU is connected to the external display port and the laptop monitor, so there's literally no way for me to disable it without my screen(s) turning off.
RadSpaghetti Mar 29, 2018 @ 10:36am 
I found some launch options for Unity here: https://docs.unity3d.com/Manual/CommandLineArguments.html

Under "Unity Standalone Player command line arguments", there is a launch option to set which adapter you want to use. I know you had tried various launch options before, but the full list felt worth mentioning. Don't have the opportunity to test it myself right now.
Last edited by RadSpaghetti; Mar 29, 2018 @ 10:38am
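(For anyone trying this: the relevant argument from that page is -adapter N, which selects a Direct3D display adapter, and it would go in Steam under right-click Slime Rancher > Properties > Set Launch Options, for example:

-adapter 1 -force-d3d11

The adapter index 1 is only an assumption and may differ per machine. Note also that on an Optimus laptop, where only the Intel GPU is wired to the displays, the Nvidia card may not appear as a separate display adapter at all, so this may not help.)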

Date Posted: Mar 27, 2018 @ 1:03pm
Posts: 7