I mean, I'm sort of simulating the effects of lowering settings or framerate by limiting the maximum voltage and clock speeds of my CPU. The only difference is, I'm still getting the same performance. The real issue is that the CPU is NOT being utilized very much, so the temps shouldn't be this high. Setting limits in BIOS or XTU fixes the problem 100%, but it's just not normal to have to limit voltage and clock speeds to play a game, and especially not normal that doing so has ZERO effect on the performance. Meaning it never had any reason to go that high in the first place.
IDK, maybe it's a hardware/system-level thing with my PC, but I don't have Race to Halt enabled or any settings like that, and it doesn't happen with every game. So the game is at least playing a PART in the issue.
Actually, I did notice while monitoring through the GeForce overlay that the game had a bit higher than average CPU usage, but limiting the CPU or reducing clocks/voltage doesn't change that usage. You'd think that essentially simulating having 50% of the CPU would make the game's usage level increase, but it doesn't. So it seems there is something weird going on.
Yeah, yeah...
Honestly thinking about going AMD next time, or going down to an i7 (or Core Ultra 7, or whatever the new naming scheme is). This thing can get so hot and power hungry; it's technically OKAY, but annoying AF.
Still, it's abnormally so in this game, and limiting the voltage and clock speeds doesn't have any effect on game performance, at least at a locked framerate. It DOES, however, MASSIVELY decrease temps and fan speeds. So it apparently never needed to be running at that power level and speed. And looking at CPU usage, gimping my CPU by like half, you'd think it would lead to a higher CPU usage percentage, but it also doesn't.
I mean, I can literally just set a specific CPU config and limits, but that's annoying, and I have to change them every time I start/finish playing.
Actually, hold up... Is the game stuck at 60 fps, regardless of your actual cap? Is THAT why it looks so choppy and janky, even when I tried raising the cap and it didn't fix it? I thought it was just jerky/weird camera movement, or that something was wrong with my system at first, but then I kinda got used to it. But now, after just seeing something about it, it DOES feel like it's just at 60 fps.
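Since re-applying the limits by hand every session is the annoying part, it can be scripted. A rough sketch using Windows' built-in `powercfg` tool (this caps the "maximum processor state" in the active power plan, which is a different lever than BIOS/XTU voltage limits; the wrapper function names are mine, and `dry_run` is just there so the command construction can be checked on any OS):

```python
# Sketch: toggle a CPU max-state cap around a play session via powercfg.
# PROCTHROTTLEMAX is powercfg's alias for "maximum processor state".
# Run as administrator on Windows; the helper names here are hypothetical.
import subprocess

def cpu_cap_commands(max_state_percent):
    """Build the two powercfg invocations for a given max-state cap."""
    pct = max(5, min(100, int(max_state_percent)))  # clamp to a sane range
    return [
        ["powercfg", "/setacvalueindex", "scheme_current",
         "sub_processor", "PROCTHROTTLEMAX", str(pct)],
        ["powercfg", "/setactive", "scheme_current"],  # re-apply the plan
    ]

def apply_cpu_cap(max_state_percent, dry_run=False):
    cmds = cpu_cap_commands(max_state_percent)
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=True)
    return cmds

# Before playing: apply_cpu_cap(50)   # roughly the "half CPU" setup
# After playing:  apply_cpu_cap(100)  # restore full speed
```

Two shortcuts bound to scripts like this (one at 50, one at 100) would replace the manual before/after fiddling.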
There is an issue where the game will not go above a 60 Hz refresh rate, which makes it a jittery mess for people like me on a 2K/165 Hz display. The only way to somewhat fix it is locking your FPS to 60 until the devs fix it.
Yeah, I did that. I don't like going so low with framerate, but in this case it literally made no difference. Regardless of how high my OSD was telling me my framerate was, it seems to be 60 no matter what, but thermal performance took a ♥♥♥♥ when it was locked to anything above 60, and I was apparently getting no benefits, either.
Reminds me of when Valheim launched. That game ran hot as hell too, but I don't remember it pushing my CPU so high in usage.
i7-3770 overclocked to 4.68 GHz.
While running the game, CPU utilization averages 70% while temps stay firm at 60 °C, air-cooled in an Antec P101.
There's too much overthinking here; solving your heat dissipation issues will benefit you longer than most of these games' lifespans.
The lighting especially is clearly subpar, and I mean incredibly bad.
You can have 90 fps, but once daytime starts and you look at the sun, it drops down to 20.
Open the map, zoom out, and it drops because of the fog effect: for me, from 100 fps to 27.
In general, shadows and light effects are badly done. They glitch out too; I get constantly flickering shadows and light sources/rays.
As of now, you gain the most performance from tuning shadow effects down, which the game has a lot of.
Sadly the shadows are also badly optimized.
Once they tweak the lighting and fog effects, it will become more playable.
My CPU does tend to run a bit warm, but that's just the nature of my setup, with a CPU that's overpowered for my needs. Solving it isn't that simple.
While my CPU is usually on the warmer side, it's not at the level this game gets it to, and it's not exactly my system that is the problem. Maybe it's just my CPU IN GENERAL, since it's rather power hungry, but repasting or something isn't going to solve it.
How I DID solve it was modifying the voltage and clock speeds of my CPU. Overall, I cut the power and performance of my CPU by about 50% in terms of voltage and clock speed limits. Literally cutting my CPU in half had absolutely ZERO effect on performance. My CPU utilization was never really over 15% on maxed settings at a 90 fps lock, so however the game was trying to utilize my CPU, it was really inefficient for no reason, since cutting the game off from utilizing my CPU like it was had no effect on performance.
It just wanted to push clock speeds to the maximum for some reason, when it has no business pushing a 24-core/32-thread CPU to 5.7 GHz while I'm not running super high frame rates at low graphical settings/resolution. I'm playing on maxed settings with the framerate locked to 90. That's not to say it constantly runs at this speed, and this is where my setup can play a part, but it spikes clock speeds quite often, so it warms up.
I don't think it's the game in general causing the issue; I think it's the way the engine utilizes CPU resources. It's just not very efficient, and it asks for far more resources than it actually needs or can use... or I don't know. I don't know how exactly their engine functions, but whatever it's doing, it shouldn't be, and it's unnecessary.
The fact that placing limits on the game's resource usage has no effect other than massively better thermals means something isn't right. I found that I can decrease clock speeds from a 5.7 GHz max to a 2.7 GHz max, and reduce power from a 175 W max down to a 30 W max, before I start noticing any performance impact. And now temps only increase like 12-14 °C above desktop temps, and I'm still only averaging around 15% to 20% CPU utilization at this level.
I mean, I know new i9 CPUs suck for efficiency and thermal performance out of the box without tuning, but this game really takes the cake: it's on the more simplistic side, yet it pushes my CPU as hard as the most cutting-edge simulation game. Just booting up the game causes temps to spike and fans to start screaming. Temps are under control now, but it's annoying that the game can run with such great efficiency and performance at a low spec, but only if I tune it to that level.
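The arithmetic behind why halving the CPU was "free" can be sketched. This is a crude aggregate model I'm assuming (treat demand as utilization times clock speed; real scheduling is per-core and burstier), using the numbers from my monitoring:

```python
# Back-of-envelope: "effective demand" = average utilization x max clock.
# The model is an assumption; the input numbers are from my own monitoring.
max_clock_ghz = 5.7
avg_utilization = 0.15            # observed at maxed settings, 90 fps lock

effective_demand_ghz = max_clock_ghz * avg_utilization
print(effective_demand_ghz)       # ~0.86 "GHz worth" of actual work

capped_clock_ghz = 2.7
# The cap still comfortably exceeds the demand, so no performance loss:
print(effective_demand_ghz < capped_clock_ghz)

# If the workload were a fixed pool of work, utilization should roughly
# double after capping:
print(round(effective_demand_ghz / capped_clock_ghz, 2))  # ~0.32, i.e. ~32%
```

Which makes it even stranger that reported utilization barely moved after the cap; whatever the engine is doing, it doesn't behave like a fixed pool of work.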
Just open the NVIDIA/Adrenalin control panel, look for the FPS limiter option for the game, turn it on... PROFIT!
It's like a 15-second fix.
My GPU usage will go from around 55-60% to literally 99% for like 10 seconds the moment it turns night/morning. I always know when it's about to happen, because I'll hear my fans spin up right before.
I also noticed that while the shadows (except global sun shadows) look really good, they also work strangely. Instead of shadows being the ABSENCE of light, they are the ADDITION of dark. By that I mean: if you go into an area that has partial darkness or already-cast shadows, and you wield a torch and get an object between you and the pre-existing shadow, then instead of that area staying the same darkness (but more lit where your torchlight lands), the shadow your torch casts adds additional darkness wherever the torchlight is being blocked. Like, if you put a fire near a tree and then walk near that tree with a torch, the fire's shadow will get darker if the tree casts a shadow from your torch onto it, too.
So I can see where that might cause a performance hit, if the game is essentially rendering shadows multiple times in the same spots when there are multiple light sources.
I did that as well. I had the game locked to 90 fps, but apparently the game only refreshed the image at 60 Hz, and having it not locked to 60 had a pretty big performance impact. That was for my GPU, though. My CPU was only running at about 10% utilization to begin with. Locking FPS had a SLIGHT impact, but CPU temps were still way higher than they should have been, and the game kept wanting to push my clock speeds to max and draw a good bit of wattage. I was able to lower my clock speeds from 5.7 GHz to 2.7 GHz, and wattage from 175 W to 30 W, before it started having any performance impact, and now my temps only rise a few degrees above desktop temps.
My point is that this game can still perform exactly the same when I cut my CPU's speed and performance in half. So why is it pushing my CPU so hard and inefficiently when it's totally unnecessary? It can't even utilize the performance it's asking for without set limits. I'm sure it also has to do with how my CPU is built to function, and it's just not very efficient out of the box, but it's not like this happens with every game. Even so, I found that I can pretty significantly limit performance and have lower temps in general for everything, with no negative performance impact. But rarely did anything generate this level of heat, even when the CPU was allowed to run at whatever capacity it wanted.
Imagine carrying around a $20 bill. Normally you'd just put it in your wallet/money clip/pocket, because that's all you need. Now imagine having a Brinks armored truck follow you around just to carry that $20. That's essentially what this game was doing to my CPU until I forced it to "use a wallet".