That the driver's own fps limiter fails is ominous, because driver settings overrule game settings. No game gets past the driver. Games only interface with DirectX, which interfaces with the driver. So it is either a user problem or a problem with driver settings.
Running it on an RTX 3060 Ti and a GTX 960, both cool as a kitten, btw.
For almost half a year I never saw any title that made my GPU draw over 310 W... Andromeda is truly a galaxy here.
I suggest you try the NVIDIA App, which is still in beta but will replace GeForce Experience. Use it to optimise the game's settings and see if it helps.
https://www.nvidia.com/en-gb/software/nvidia-app/
If your GPU is overheating, it's the GPU manufacturer's fault - not the game's fault.
A good GPU never overheats, because it has a good cooling system. It's pretty obvious.
If your GPU has a bad cooling system, set an fps lock of 60 in the driver control panel.
Monitor Technology - Fixed Refresh.
Preferred Refresh Rate - Application Controlled.
Vertical Sync - Off.
In Documents\BioWare\Mass Effect Andromeda\Save, find ProfOpts_profile, open it with Notepad, and edit GstRender.FullscreenRefreshRate to 60.000000.
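If you'd rather script that edit than open Notepad every time, here is a rough Python sketch (mine, not from BioWare; it assumes ProfOpts_profile is plain text with one "Setting value" pair per line, as the post above implies, and that your Documents folder is in the default location):

import re
from pathlib import Path

# Path taken from the post above; adjust if your Documents folder lives elsewhere.
profile = Path.home() / "Documents" / "BioWare" / "Mass Effect Andromeda" / "Save" / "ProfOpts_profile"

text = profile.read_text(encoding="utf-8")
# Swap whatever value follows the key for 60.000000
# (assumes the plain-text "Setting value" layout described above).
new_text = re.sub(r"(GstRender\.FullscreenRefreshRate)\s+\S+", r"\1 60.000000", text)
profile.write_text(new_text, encoding="utf-8")
print("Updated" if new_text != text else "Nothing changed (key missing or already 60.000000)")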
My 3070 gives me a 57 °C temperature delta at 60 fps on all ultra settings in this 7-year-old game.
GstRender.FullscreenRefreshRate to 60.000000 - this is already in the file; the line was there the whole time.
Your video card is designed to work at 300 W at 2700 MHz - those are the vendor-set specifications. It doesn't matter if the game was made 7 years ago or 10 years ago - if it can utilize the video card fully, the card will run at 2700 MHz and consume 300 W of power (or more).
You can limit that usage by undervolting the GPU or lowering the FPS cap (or both). At a lower FPS your video card consumes less energy, because the chip needs less frequency (and voltage) to keep up with the work.
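A rough illustration of why the cap helps, in Python (my numbers for the example, not measurements; dynamic power scales roughly with clock times voltage squared):

# Rough dynamic-power scaling: P ~ f * V^2 (illustrative, hypothetical numbers).
def scaled_power(p_stock_w, f_stock_mhz, v_stock, f_new_mhz, v_new):
    return p_stock_w * (f_new_mhz / f_stock_mhz) * (v_new / v_stock) ** 2

# Hypothetical case: 300 W at 2700 MHz / 1.05 V, undervolted and fps-capped
# so the card sits near 2200 MHz / 0.90 V instead.
print(round(scaled_power(300, 2700, 1.05, 2200, 0.90)), "W")  # roughly 180 W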
Try Guru3D's RTSS (RivaTuner Statistics Server); it can set an fps cap for games too, if nothing else helps.
Here is a video showing how to set an fps cap with this program.
https://www.youtube.com/watch?v=S0f9ePeBWjo
Also, the game should consume less power at 85 FPS compared to 180 FPS, so maybe the global power management mode, or the game-specific power management mode in the driver control panel, was set to "Prefer Maximum Performance" instead of "Normal".
Thanks for the tips, I'll try to 'play' with the settings a little more, but I don't really have much faith. I'm not a complete noob about hardware things, but this GPU situation looks very weird to me too: why Andromeda specifically, and not some UE5 title or some other random bugged title?
When the game is using 100% of your video card, it's perfectly normal. It actually shows that the Frostbite engine can still push even modern video cards to their full potential.
Do not turn off V-SYNC in the game.
Even when you have a high-refresh-rate G-SYNC compatible monitor, V-SYNC should stay ON. Otherwise, your video card will waste power rendering frames that will never be shown.