Mass Effect™: Andromeda Deluxe Edition

Effective GPU wrecker 2024
RTX 4080S
Up to 300W+ power draw (150-300W, seemingly at random), up to 99% load (60-99%, also random), and 2700+ MHz core clock the whole time...
Before this I had a GTX 760 and never worried, but now this game is practically burning my GPU for no good reason. Lowering the monitor refresh rate to 85 Hz does not help (fps does drop to 85, but the GPU still burns like I'm playing some UE5 title), the max fps limit in the NVIDIA Control Panel does not work, and a user.cfg with GameTime.MaxVariableFps does not work.
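(For reference, this is the user.cfg I tried, dropped into the game's install folder next to the executable as the usual Frostbite guides suggest - the 60 is just the cap value I tested:

GameTime.MaxVariableFps 60

No effect.)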
Does anyone know a working way to limit GPU usage for this game? It's just absurd for a 2017 title. (2K resolution)
Control has some issues when you use RT there, but not on this scale.
Last edited by prostoe_bydlo; Oct 6, 2024 @ 10:44am
sdack Oct 6, 2024 @ 11:13am 
Lowering a monitor's refresh rate does not help when vsync is turned off. Did you check that it is actually on? And that it is not forced off in the driver, but either turned on or left at "Application default"?

That the driver's own fps limiter fails is telling, because driver settings overrule game settings. No game gets past the driver: games only talk to DirectX, which talks to the driver. So it is either a user error or a problem with the driver settings.
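To be clear, the limiter I mean is NVIDIA Control Panel > Manage 3D Settings > Program Settings > Max Frame Rate, set per-game (the executable should be MassEffectAndromeda.exe, though I am going from memory, so check your install):

Max Frame Rate: On, 60 fps

If even that is ignored, make sure the profile actually points at the right executable.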

Running it on an RTX 3060 Ti and a GTX 960, cool as a kitten, btw.
Last edited by sdack; Oct 6, 2024 @ 11:13am
prostoe_bydlo Oct 6, 2024 @ 11:30am 
Vsync is on all the time. Fps wouldn't drop with the monitor's refresh rate if vsync weren't on - but again, vsync is on all the time, and G-Sync is on too, btw. The problem is that the game ignores those restrictions and still burns the GPU (as I mentioned, I do see 85/120/160 fps in game via the Steam overlay, matching whatever refresh rate I set on the monitor, but load, power consumption, and GPU clock frequency barely change; I monitor them via AIDA64).
In almost half a year I have never seen any title make my GPU draw over 310W... Andromeda is truly a galaxy here.
Last edited by prostoe_bydlo; Oct 6, 2024 @ 11:59am
sdack Oct 6, 2024 @ 1:00pm 
Originally posted by prostoe_bydlo:
Problem that game ignore those restrictions and still burn gpu (as I mentioned I practically see 85/120/160fps (for lowered refresh rate on monitor) in game via steam overlay, but load, power consuption, gpu chip frequency do not change much, I monitor them via aida64)
As I said, it is literally not possible for a game to ignore driver settings. The driver always has the last word. So something is off with the driver settings.

I suggest you try the NVIDIA App, which is still in beta but will replace GeForce Experience. Use it to optimise the game's settings and see if that helps.

https://www.nvidia.com/en-gb/software/nvidia-app/
Deo Oct 7, 2024 @ 10:04am 
It's not the game, it's modern video card vendors.

If your GPU is overheating, it's the GPU manufacturer's fault, not the game's.

A good GPU never overheats, because it has a good cooling system. It's pretty obvious.

If your GPU has a bad cooling system, set an fps lock of 60 in the driver control panel:
Monitor Technology - Fixed Refresh.
Preferred Refresh Rate - Application Controlled.
Vertical Sync - Off.

Then in Documents\BioWare\Mass Effect Andromeda\Save, open ProfOpts_profile with Notepad and set GstRender.FullscreenRefreshRate to 60.000000.
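The edited line should end up looking roughly like this (the file is plain text; surrounding entries will differ per install, and the exact formatting here is from memory, so treat it as a sketch):

GstRender.FullscreenRefreshRate 60.000000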

My 3070 gives me a 57°C temperature delta at 60 fps on all-ultra settings in this 7-year-old game.
prostoe_bydlo Oct 7, 2024 @ 12:29pm 
Originally posted by Deo:
It's not the game, it's modern video card vendors.

If your GPU is overheating, it's the GPU manufacturer's fault, not the game's.

A good GPU never overheats, because it has a good cooling system. It's pretty obvious.

If your GPU has a bad cooling system, set an fps lock of 60 in the driver control panel:
Monitor Technology - Fixed Refresh.
Preferred Refresh Rate - Application Controlled.
Vertical Sync - Off.

Then in Documents\BioWare\Mass Effect Andromeda\Save, open ProfOpts_profile with Notepad and set GstRender.FullscreenRefreshRate to 60.000000.

My 3070 gives me a 57°C temperature delta at 60 fps on all-ultra settings in this 7-year-old game.
I have no idea how the cooling system relates to power consumption and GPU load - it doesn't, actually. It affects temperature, and through that performance. I don't have a performance problem. My problem is that a 2017 game is burning my GPU for no reason and I cannot limit how much of the GPU this game uses ('burning' does not mean very high temps; I mean the load sits at up to 99% with the chip at max frequency, which strains the GPU for no reason. Sorry if that was misleading). In practice I can limit the game's fps, but GPU usage at 85 fps is the same as at 180 fps, and that's too high for this old title (even something like Darktide has lower usage than ME:A). Also, the 4080 runs noticeably cooler than the 3080, for example; dunno about the 3070. And the MSI Gaming X has a good cooling system too.
GstRender.FullscreenRefreshRate 60.000000 - this line is already in the file; it was there the whole time.
Last edited by prostoe_bydlo; Oct 7, 2024 @ 1:00pm
Deo Oct 7, 2024 @ 12:59pm 
Originally posted by prostoe_bydlo:
I have problem that 2017 game are burning my gpu for no reason and I cannot limit usage of my gpu for this game
It's "burning" your GPU because the game utilizes your videocard well at high FPS.

Your video card is designed to run at 300W at 2700 MHz - those are vendor-set specifications. It doesn't matter if the game was made 7 years ago or 10 years ago: if it can utilize the video card fully, it will run your video card at 2700 MHz, which will consume 300W of power (or more).

You can limit such usage either by undervolting the GPU or by lowering the FPS cap (or both). At lower FPS your video card will consume less energy, because it needs less frequency from the chip to keep up with the work.
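As a rough rule of thumb (idealized numbers, not your card's exact behaviour): dynamic power scales as roughly P ~ C x V^2 x f. So dropping the voltage from, say, 1.05V to 0.95V at the same clock cuts power by about 1 - (0.95/1.05)^2 ≈ 18%, and if a lower fps cap also lets the chip run at a lower clock, that saving multiplies on top.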

If nothing else helps, try Guru3D's RTSS (RivaTuner Statistics Server); it can set an fps cap for games too.

Here's a guide on how to set an fps cap with this program:
https://www.youtube.com/watch?v=S0f9ePeBWjo
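In short (field names from memory, they may differ slightly between RTSS versions): add the game's executable as a profile - it should be MassEffectAndromeda.exe, but verify against your install - and set "Framerate limit" to 60.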

Also, the game should consume less power at 85 FPS than at 180 FPS, so maybe the global or game-specific power management mode in the driver control panel is set to "Prefer Maximum Performance" instead of "Normal".
prostoe_bydlo Oct 7, 2024 @ 1:09pm 
"Prefer Maximum Performance" instead of "Normal" - already was on max performance, and I tried to change it too, doesn't work.
Thanks for the tips I try to 'play' with setting a little more but I don't really have much faith. I'm not completely noob about hardware things but this gpu situation looks very weird for me too: why andromeda specifically, not some eu5 title or not some other bugged random title.
Last edited by prostoe_bydlo; Oct 7, 2024 @ 1:11pm
Deo Oct 7, 2024 @ 1:48pm 
Originally posted by prostoe_bydlo:
I'm not completely noob about hardware things but this gpu situation looks very weird for me
"Very weird" is when the game not utilizing videocard to 100%, it means either code or engine failure of the game on development stage.

When a game uses 100% of your video card, it's perfectly normal. It actually shows that the Frostbite engine can drive even modern video cards to their full potential.
HK-47 Nov 9, 2024 @ 1:02pm 
It's really simple.

Do not turn off V-SYNC in the game.

Even when you have a high-refresh G-SYNC compatible monitor, V-SYNC should stay ON. Otherwise your video card will waste power rendering frames that will never be shown.
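Rough numbers to illustrate (made up for the example, not measured): if the card renders 300 fps while a 120 Hz panel can only display 120, then 180 of every 300 frames - 60% of the GPU's work - are discarded, and the power spent rendering them is pure waste.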