edit: The only difference I can think of, and this is baseless speculation, is sudden boosting of a few cores causing a temp spike that wouldn't happen even under a 100% all-core stress load. And even then the temps are well within normal limits. Modern components throttle somewhere around 95-105 °C, and anything below that is basically okay, especially if it's not constant and prolonged.
Probably because it's loading the map and the shader cache needed for that map at the same time. Other games do their shader caching when you first launch the game, but this one doesn't, which tells me it's compiling and loading the needed shaders for each map during map loads and cutscenes. It's actually a smart way to do it.
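For illustration only, here's a minimal sketch of that per-map warming idea. Everything in it is made up (compile_pipeline, the manifest format); it's the general pattern, not how Clair Obscur or UE5 actually implements it:

```python
# Hypothetical sketch: warm only the shader pipelines a given map needs,
# overlapped with the rest of the level load. All names are invented.
from concurrent.futures import ThreadPoolExecutor

PIPELINE_CACHE = {}  # pipeline key -> compiled pipeline object

def compile_pipeline(key):
    """Stand-in for a real driver/engine pipeline (PSO) compile call."""
    return f"compiled:{key}"

def warm_shaders_for_map(map_manifest):
    """Compile just this map's missing pipelines during the load screen."""
    missing = [k for k in map_manifest["pipelines"] if k not in PIPELINE_CACHE]
    with ThreadPoolExecutor() as pool:
        # Compiles run in parallel while assets stream in elsewhere.
        for key, pso in zip(missing, pool.map(compile_pipeline, missing)):
            PIPELINE_CACHE[key] = pso

if __name__ == "__main__":
    manifest = {"pipelines": ["gbuffer_skin", "foliage_masked", "vfx_additive"]}
    warm_shaders_for_map(manifest)
    print(sorted(PIPELINE_CACHE))
```

The win is that a load screen or cutscene hides the compile cost, so you avoid both a huge up-front compile pass at first launch and mid-gameplay stutter.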
You are, in fact, not right. You are a complete clown spouting utter bullsh!t.
Having 90-degree temps at 35% CPU and GPU usage, when far more demanding games cause no such problems, is not a "COOLING ISSUE".
It's an issue with this cancerous UE5 joke being optimized like trash and running poorly on expensive hardware it has no business struggling on.
Do us all a favor and shut up for good.
Turns out it was a driver issue all along.
They had a hotfix driver update for the issue available on their support forums all the way back on April 15th, but it didn't make it to the Nvidia app or the main driver download pages until a few days ago.
And the devs did no real optimization pass on the Epic graphics mode. They optimized the game for consoles, so there is almost no visual difference between High and Epic. Set your graphics to High: your eyes won't be able to tell the difference unless you're a pixel-hunting dork, but your GPU will thank you.
That said, AMD says 90 °C is perfectly fine for their ASICs (both CPU and GPU), and by default they're set up to eat electricity and spit out clock speed until they hit that temperature. If that number bothers you and you're okay with giving up some diminishing returns in clock speed, set a lower temperature limit.
I have a 5800X limited to 100 W PPT and 75 °C with a -15 curve offset, and the real-world performance loss is negligible. Compiling stuff on Gentoo still pushes the CPU to ~4.6 GHz on all cores, same as stock, and games usually never touch those limits anyway.
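If you want to sanity-check limits like these on Linux, here's a minimal sketch that reads CPU temperatures from the k10temp hwmon driver. It assumes k10temp is loaded; the hwmonN index varies per machine, so it scans for the right one:

```python
# Minimal sketch: read AMD CPU temps (Tctl/Tdie/Tccd*) via k10temp hwmon.
# Standard Linux sysfs paths; values are reported in millidegrees Celsius.
from pathlib import Path

def k10temp_readings():
    readings = {}
    for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
        name_file = hwmon / "name"
        if name_file.exists() and name_file.read_text().strip() == "k10temp":
            for temp_input in sorted(hwmon.glob("temp*_input")):
                label_file = hwmon / temp_input.name.replace("_input", "_label")
                label = (label_file.read_text().strip()
                         if label_file.exists() else temp_input.name)
                readings[label] = int(temp_input.read_text()) / 1000.0
    return readings

if __name__ == "__main__":
    for label, celsius in k10temp_readings().items():
        print(f"{label}: {celsius:.1f} °C")
```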
Similarly, I limited my RX 7600 to 2.4 GHz because it would otherwise just run at 90-95 °C hotspot temps for only 1-2 extra fps in some of the more graphics-heavy scenes where fps drops below 60 at the same settings. It uses ~100 W to run at 2.4 GHz, but for an extra 200 MHz of boost it needs another ~60 W on average. That's roughly an 8% clock bump for 60% more power, which is a bit ludicrous.
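In case it helps anyone: on Linux the amdgpu driver exposes this kind of clock cap through the pp_od_clk_voltage sysfs file. A rough sketch follows; it assumes overdrive is enabled (the amdgpu.ppfeaturemask kernel parameter), that your GPU is card0, and that your card accepts the "s 1 <MHz>" max-clock syntax, so check the file's contents for your card's actual ranges first:

```python
# Sketch: cap an amdgpu card's maximum shader clock via sysfs overdrive.
# Requires root and amdgpu overdrive enabled; inspect supported ranges with:
#   cat /sys/class/drm/card0/device/pp_od_clk_voltage
from pathlib import Path

OD_FILE = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")  # adjust card index

def cap_max_sclk(mhz: int) -> None:
    OD_FILE.write_text(f"s 1 {mhz}\n")  # "s 1 <MHz>" sets the max shader clock
    OD_FILE.write_text("c\n")           # "c" commits the pending change

if __name__ == "__main__":
    cap_max_sclk(2400)  # e.g. pin the boost ceiling to 2.4 GHz
    print(OD_FILE.read_text())
```

Tools like CoreCtrl or LACT wrap this same interface in a GUI if you'd rather not poke sysfs by hand.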
Software can absolutely cause your system to heat up, because software is what gives your hardware instructions—and those instructions can demand way more power than usual.
Here’s the bottom line:
Stress test tools like Prime95, FurMark, AIDA64, OCCT, etc. exist specifically to max out your CPU, GPU, or both—and they heat your system way more than typical real-world games or apps.
A badly optimized game can hammer your hardware unnecessarily, especially with uncapped framerates, unthrottled physics threads, or inefficient draw calls.
Power viruses are a thing: workloads crafted specifically to push thermals and power draw to unsafe levels. Some software does this by accident, some on purpose.
So yes—software can make your computer dangerously hot. It's not just about the cooling—it’s about the demand being made of the hardware by the code.
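To make that concrete, here's a tiny, deliberately wasteful sketch: the exact same machine will draw far more power (and run far hotter) executing this loop on every core than it does sitting on the desktop. It's the same principle Prime95's torture test is built on, just a toy version, not anyone's actual code:

```python
# Illustrative "hot loop": saturate every logical core with dense
# floating-point busywork. Watch CPU temperature and package power climb.
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Busy-loop doing useless floating-point math for `seconds`."""
    deadline = time.monotonic() + seconds
    x = 1.0001
    while time.monotonic() < deadline:
        for _ in range(100_000):
            x = (x * 1.000001 + 1e-9) % 1e6  # bounded so it never overflows

if __name__ == "__main__":
    # One worker per logical core, running for 30 seconds.
    workers = [mp.Process(target=burn, args=(30.0,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("Done: every core was pinned at 100% doing nothing useful.")
```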
Go install Prime95, run the torture test, and then come back and say he's wrong. The simple fact is that the other games weren't utilizing your CPU at full load the way this game does. That's not a problem with the game; it's just that your cooler isn't cooling well. Mine peaks around 55-60 °C through an entire gaming session.
Not to mention the several other players in this thread who, like me, see these temps only in Clair Obscur. Go tell them they're wrong too.
Also, as I said a few comments ago, I already ran Prime95 in the past few days, and the temps I saw ranged from 55-78 °C. And that's a stress test, eons away from anything a game like Clair Obscur (which looks good but is nothing special compared to Kingdom Come 2 or Wukong) should ever demand.