Sounds like what you need to test is whether your fan curve gets applied properly when other games reach 70°C. Try running benchmarks. A GPU running at 70-80°C is normal and shouldn't cause any thermal throttling or reduce the lifespan of the card (I've been running Valheim at 80°C for years, and other games for decades). My 1070 runs at 70°C when FPS is not capped, and my 3060 in a laptop runs at 80-85°C.
If your GPU indeed kicks into 80%-90% fan speed when temps reach 70°C in other, non-Valheim stress tests, then that is peculiar, and I have no idea why it would be specific to Valheim, as the game has nothing to do with the driver controlling fan speeds. Will do a few tests of my own to see how my 1070 behaves on custom fan curves.
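If you want a second opinion on temps and fan speed outside of Afterburner while a benchmark runs, a quick logging sketch like this works on Nvidia cards (it assumes `nvidia-smi` is on your PATH; the parsing helper just splits the CSV lines it prints):

```python
import csv
import io
import subprocess
import time

# Fields nvidia-smi can report in --query-gpu mode.
QUERY = "temperature.gpu,fan.speed,utilization.gpu"

def parse_sample(csv_line):
    """Parse one nvidia-smi CSV line like '71, 85 %, 99 %' into ints."""
    row = next(csv.reader(io.StringIO(csv_line)))
    temp = int(row[0].strip())
    fan = int(row[1].strip().rstrip(" %"))
    util = int(row[2].strip().rstrip(" %"))
    return temp, fan, util

def poll(interval=2.0):
    """Print temp / fan / utilization every couple of seconds."""
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader"], text=True)
        temp, fan, util = parse_sample(out)
        print(f"{temp} C  fan {fan}%  util {util}%")
        time.sleep(interval)
```

Run `poll()` while the benchmark is going: if the logged fan % doesn't follow your curve as the temp climbs past 70°C, the curve isn't being applied.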
Valheim will generally try to utilize 100% of your GPU whenever the CPU has the room to feed it, which on some older cards can happen even in the main menu (even though it's capped at 60 FPS by default), but usually in oceans and empty areas in the game. This results in higher temps. You can see that even Nvidia themselves limit FPS for some games[nvidia.custhelp.com] for similar reasons on their own cloud gaming service.
EDIT: Just tested my custom fan curve for my 1070 via MSI Afterburner, and it gets applied properly. Try experimenting with the fan control settings and make sure it's set to user defined and not automatic.
I watched many videos explaining the benefits of manually increasing your fan curve when overclocking my specific card to avoid overheating, especially on RTX series cards. Didn't hear anything about shortening lifespan; I actually heard quite the opposite, since many suggested having the fans reach 100% when the GPU gets even close to 80 degrees. I've also confirmed that Afterburner is set to User Defined.
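For reference, the curve you drag around in Afterburner is just piecewise-linear interpolation between temperature/fan-speed points, so it's easy to reason about what speed a given temp should produce. A minimal sketch (the curve points below are made up for illustration, not your actual settings):

```python
# Hypothetical (temperature C, fan %) points, like the nodes you'd
# place in Afterburner's curve editor. Illustrative values only.
CURVE = [(30, 30), (60, 50), (70, 80), (80, 100)]

def fan_speed(temp, curve=CURVE):
    """Linearly interpolate the fan % for a temperature along the curve."""
    if temp <= curve[0][0]:
        return curve[0][1]          # below the curve: floor speed
    if temp >= curve[-1][0]:
        return curve[-1][1]         # above the curve: max speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp <= t1:
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)
```

With that curve, 70°C maps to 80% fan, so if Afterburner is really applying a similar curve in User Defined mode, the speeds you're seeing at 70°C are exactly what it was told to do.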
I personally don't believe that 60-65 degrees Celsius is unrealistic when I can stay around 55-58 degrees Celsius in RDR2 (max graphics). Who knows, I have been reading that Valheim is incredibly un-optimized.
On a different note, one thing that stood out to me: while I had Valheim open, as a test I tried manually changing the GPU fan speed to 70%, and Afterburner wouldn't register the change until I exited the game. I can't wrap my mind around this, since it shouldn't be possible, let alone happen out of thin air like this when everything ran fine earlier last year. I'll try some more testing later today. I appreciate your input.
GPU power draw in games is not measured by how games look. Check how often you see your GPU utilized at 100% in Valheim (the GPU wave simulation in empty oceans) versus any environment in RDR2, and you'll notice the difference.
Think you might need to reinstall your driver and MSI Afterburner to make it work. Let me know what helped, if you manage to fix it.
Interesting take on the fans, one of many opinions on that topic, and I appreciate yours. I changed my fan curve to be more forgiving, in line with the temps, so the fans won't be hitting 40-50% very often.
And I'm familiar with how GPU draw affects temperature fluctuation. What I'm saying is that a year ago I had 62-65°C when playing Valheim, and now I'm getting 70-71°C. I'm beginning to think it's due to the newest update adding more land and content. I reinstalled my GPU driver last night but haven't tested since, so I'll give it another shot. Thanks again.