Also, you're saying that your _GPU_ is running hot, but then say that you want to keep your _CPU_ at under 70 degrees?
If you're talking about your GPU (not CPU), then the intended max temperature under load (as far as I'm aware, I haven't checked the details) is 88°C or 90°C for that generation of graphics cards.
But to answer your initial question:
The game currently does put more load on GPUs than many players expect. There's no official explanation as to why, but several possibilities exist:
- It's possible that the GPU handles parts of the simulation. That would make sense - GPUs are powerful hardware, and are increasingly used for tasks that have nothing to do with graphics. Cryptomining was the application that made this change apparent for the mass audience, but that's far from the only one. (And no, this game does not sneak a cryptomine on your machine, in case my example got you worried. ;) )
- It's possible that the game's graphics engine is currently written with a focus on making it work, and that optimization comes later. That would make sense for an Early Access product.
- It's possible that players underestimate how much graphics power a large, very detailed world with voxel graphics (houses with up to 21 floors, with every apartment having its own individual layout) requires. Voxel graphics may look clunky, but that doesn't mean they are necessarily easy to process in a world of that size.
All that said, however, I don't think it makes much sense to buy hardware that you don't want to fully utilize due to noise levels. I'd either buy a card with better cooling, or (if money is tight) save money by buying a cheaper card that I _can_ fully utilize. But it seems that this advice is coming years too late - might be worth making it a factor for your next purchase, though.
70°C is really cool for a CPU under load
One small thing about this: I think it's unlikely that the GPU handles world simulation, if only because the GPU would already be under load and wouldn't have enough spare memory for anything useful.
GPUs are generally very good at doing one task at a time, even if those tasks are incredibly intensive, whereas CPUs are better for several hundred simultaneous tasks (simulation)
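To illustrate that distinction, here's a rough sketch in Python/NumPy (not actual GPU code, just an analogy): a GPU-friendly workload applies one identical operation to millions of elements in lockstep, while simulation-style workloads involve many independent tasks with diverging control flow, which suits CPUs better. The `agent_step` function is a made-up stand-in for per-agent game logic.

```python
import numpy as np

# GPU-friendly workload: one uniform operation over a huge array
# (data parallelism). On a GPU this maps to thousands of cores
# executing the same instruction in lockstep.
positions = np.zeros(1_000_000, dtype=np.float64)
velocities = np.full(1_000_000, 0.5)
positions += velocities * 0.016  # identical update for every element

# Simulation-style workload: many independent agents, each branching
# on its own state. Divergent control flow breaks the lockstep model,
# so this pattern tends to favor CPUs.
def agent_step(state: int) -> int:
    # Hypothetical per-agent logic, just to show branching.
    if state % 2 == 0:
        return state // 2
    return 3 * state + 1

states = [agent_step(s) for s in range(8)]
```

This is a simplification, of course; real GPGPU simulation code can handle some divergence, but the general point about uniform versus heterogeneous work stands.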