1. His voltage is manually controlled, while you're running at stock, so your voltage is automatic, correct? This ties into #2.
2. His LLC is also probably set to a medium-range setting instead of left on auto, so his LLC isn't overshooting. When left on auto, LLC can overshoot even at stock and cause higher temperatures than when it's controlled. More on that here: https://www.youtube.com/watch?v=NMIh8dTdJwI&t=1s Controlling voltage and LLC can improve temperatures, but you have to find a voltage that isn't so low the CPU won't boost, yet not so high that it creates extra heat.
3. Different games produce different loads and aren't a good overall representation of temperature. You can't compare temperatures in one game to temperatures in another; that's not a fair comparison. If you want to compare temperatures under load, do it with the same games, or with programs like Prime95 stress testing, Cinebench R15/R20, etc.
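To make the point concrete, here's a minimal sketch of how you might compare temperature logs from two runs of the *same* workload. The sample numbers are made-up placeholders; in practice you'd record per-second readings with a monitoring tool while running Prime95 or Cinebench on each configuration.

```python
def summarize(samples):
    """Return (average, max) of a list of temperature samples in degrees C."""
    return sum(samples) / len(samples), max(samples)

# Hypothetical per-second readings from the same Cinebench R20 run:
run_a = [78, 81, 83, 84, 82]  # e.g. LLC left on auto
run_b = [72, 74, 76, 77, 75]  # e.g. LLC on a medium setting

avg_a, max_a = summarize(run_a)
avg_b, max_b = summarize(run_b)
print(f"Run A: avg {avg_a:.1f} C, max {max_a} C")
print(f"Run B: avg {avg_b:.1f} C, max {max_b} C")
```

Same workload, same duration, only one variable changed between runs; that's what makes the comparison fair.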
LLC should always be a medium setting, never a high or low one, to prevent undershooting and especially overshooting. If it overshoots too much, it can damage or even kill the CPU.
91 C is also toastier than people here would recommend, even in Prime95. Your goal should be in the 80s, so is it really worth an extra 100~200 MHz? (Rhetorical, it isn't worth it.)
My last CPU ran at ~97°C in Prime95 and never showed any degradation in the 5 years I had it. Sure it's toasty, but does it realistically matter as long as it's within spec? Why give up that 100-200 MHz when it'll stay below your arbitrary 80°C under any realistic load (besides the electricity cost, of course)?
Because anything above 1.35v for a long-term OC will degrade the CPU, you won't be able to sustain that clock for as long as you think, and every time it becomes unstable you'll have to increase the voltage to maintain the same clock. You should be going for whatever it can manage at 1.35v, for the sake of actually being able to maintain the OC for years.
The extra ~200 MHz isn't going to make a big difference in most games anyway; it's not worth it. People have a bad habit of running ridiculous overclocks for a mere few percent, not realising that they won't be able to sustain them. Push enough voltage through the CPU and it degrades.
That's pretty bad for a gaming temp, considering my 3900X doesn't go above the 50s while gaming with a 360mm Deepcool Castle V2, and that's a 12-core/24-thread behemoth. It just goes to show that Intel's power efficiency has gone down the crapper in the past few years: at stock, the 3900X and 9700K have nearly identical gaming performance, yet the 3900X actually uses less power despite having 4 more cores and 8 more threads.
Under heavier multi-threaded loads my 3900X hardly approaches the 80s.
Intel's 14nm is refined enough that their experimental 10nm was allegedly worse. It's basically 14nm++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++.