When something consumes double the power, it should be avoided! That's where the need for better cooling comes in - CPU fans, liquid cooling, extra case fans, extra bucks for everything, even the electricity bill. So, better to run silent.
Edit: I quit the thread because I don't want to accumulate anger. I said what I wanted. The Internet is there; anyone can check for themselves. Good luck!
AMD is already going head to head with Intel server chips and wiping the floor with Intel. My next system will be a WRX80 series system.
https://www.youtube.com/watch?v=e96mboM155g
The only reason the chips aren't dead is the FIT controller, which scales voltage based on the current the CPU draws from the VRM and on the CPU's temperature.
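To make that mechanism concrete, here's a toy sketch of the general idea - voltage headroom shrinking as current and temperature rise. The ceiling, floor, and derating slopes are all made-up illustrative numbers, not AMD's actual FIT tables or algorithm:

```python
# Toy illustration only -- NOT AMD's actual FIT algorithm.
# The idea: allowed voltage shrinks as current draw and temperature rise,
# so the silicon is never held at a damaging V/I/T combination.
def allowed_vcore(amps: float, temp_c: float,
                  v_max: float = 1.50,          # assumed cold/idle ceiling
                  v_per_amp: float = 0.0035,    # made-up derating slopes
                  v_per_deg: float = 0.004) -> float:
    derated = v_max - amps * v_per_amp - max(temp_c - 40.0, 0.0) * v_per_deg
    return max(derated, 0.9)  # assumed hard floor

print(allowed_vcore(amps=10, temp_c=35))  # light load, cool: ~1.47 V
print(allowed_vcore(amps=90, temp_c=80))  # heavy load, hot:  ~1.03 V
```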
Power consumption by the CPU has a very low impact on your power bill unless your CPU is running at high usage literally constantly. When both the 11900K and the 5800X run at 25-50% CPU usage for 8 hours per day, the running cost per year differs by only around $2-7. For a significant difference, you'd have to run both chips at 100% usage 24/7, which still only makes a difference of about $44 per year.
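For a sanity check on those numbers, here's a quick back-of-the-envelope script. The wattage deltas and the $0.13/kWh rate are assumptions chosen to roughly land in the post's ballpark, not measured figures:

```python
# Annual electricity cost of a sustained power-draw delta, in USD.
# All inputs below are illustrative assumptions, not measurements.
def annual_cost_usd(delta_watts: float, hours_per_day: float,
                    usd_per_kwh: float = 0.13) -> float:
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Assumed ~15 W average draw difference at 25-50% usage, 8 h/day:
print(f"${annual_cost_usd(15, 8):.2f}/yr")   # ~$5.69, in the $2-7 range
# Assumed ~40 W average difference at 100% usage, 24/7:
print(f"${annual_cost_usd(40, 24):.2f}/yr")  # ~$45.55, near the ~$44 figure
```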
You preach about something that doesn't make a difference, and the power draw gap shows up in intense loads, not even gaming loads. In typical gaming loads, Intel doesn't draw much more power than AMD; they're actually relatively close.
When comparing systems built around each CPU below paired with a 2080 Ti, these are the whole-system power draw results while gaming:
1. 3700X - 365W
2. 3900X - 385W
3. 3600 - 368W
4. 10600K - 383W
5. 10700K - 389W
6. 10900K - 395W
7. 10700K @ 5.1 GHz - 407W
The difference is negligible.
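A quick check of that claim: the worst-case spread in the list above is 42 W, and at an assumed 3 hours of gaming per day and a placeholder $0.13/kWh rate, that works out to a few dollars a year:

```python
# Whole-system gaming power draws quoted above, in watts.
systems = {
    "3700X": 365, "3900X": 385, "3600": 368,
    "10600K": 383, "10700K": 389, "10900K": 395,
    "10700K @ 5.1 GHz": 407,
}

spread_w = max(systems.values()) - min(systems.values())  # 407 - 365 = 42 W
# Assumed 3 h/day of gaming and a placeholder $0.13/kWh rate:
kwh_per_year = spread_w / 1000 * 3 * 365
print(f"worst-case spread: {spread_w} W -> ${kwh_per_year * 0.13:.2f}/yr")  # ~$5.98
```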
Exactly why Monk said earlier that you know little about the technology you're championing. You call us shills, yet I'm literally an AMD user and Monk only cares about the best bang for the buck, and right now, Intel 10th gen gives a better bang because Zen3 costs too much and Zen2 doesn't perform well enough for the prices. And where those CPUs are actually being used, the power draw doesn't make much of a difference at all. It makes more sense with CPUs that are being used for very heavy CPU loads extremely often, but for gamers? Doesn't make a freaking difference.
Nothing good from AMD's Ryzen lineup is priced below $300.
Stick with NVIDIA GPUs, which are just better all around and use a lot less power than Radeon. Then it won't matter what power your CPU draws.
As I've said, once 10th gen dries up, I'll be championing AMD as the best deal. Until then, for 99% of people the 10700K is the best gaming CPU for the money by a wide margin.
But if this is going to occur, I can guess at some reasoning why. The Ryzen 7 3700X, and even the Ryzen 7 5800X, seem to have been creeping down in price (~$280 to $310 in the case of the 3700X, and the 5800X can often be found for ~$40 to $50 below MSRP), to where the Ryzen 7 3700X and the Core i7 10700K are approaching parity in overall value. I wonder if Intel wants to put the squeeze on to keep its part the clearly better option.
If Intel drops the 10700K to that point, the 3700X would need to be around the very same $250 (or maybe even a bit lower) that I got mine for a year ago in order to make much sense. It would also make the 5800X a hard recommendation again, which is a shame, as you can pick one up for around what the 10700K originally retailed for. That's the other interesting thing; imagine picking one up for $409 and watching it drop to $269 in around a year.
Intel is sort of doing to the 5800X here with the 10700K what it had done to that very same 10700K by the 3700X a year ago. I know Intel switched to providing better value a while ago, but if such a further price drop occurs, their positions would ALMOST mirror where they stood a year ago, just reversed. Quite amusing.
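Since "overall value" here is really performance per dollar, a tiny sketch makes the parity argument explicit. The relative scores and street prices below are placeholders for illustration, not benchmark results:

```python
# Hypothetical relative gaming scores and street prices (USD); placeholders only.
chips = {
    "3700X":  (100, 280),
    "10700K": (112, 300),
    "5800X":  (125, 410),
}

for name, (score, price) in chips.items():
    print(f"{name}: {score / price:.3f} pts per dollar")
# If the 10700K's price drops further, its pts/$ pulls clearly ahead of both Ryzens.
```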
^ Take this. Disable Turbo and C1E, and set the base clock to around 4.8-5.2 GHz.
Enjoy the same performance as the $310 KF model for less money.
PS: the next interesting thing on the graphics market will be Intel's gamer GPUs. Then there will be AMD, Intel, and Nvidia on the graphics market (not just Nvidia and AMD). Just to mention, that is what I am looking forward to.
Goodbye!
As far as I know, Intel is done with 11th generation, so it's highly unlikely to get any better. From what I've read, 11th generation is a very marginal side-grade from 10th generation. I find the various published data on 10th gen CPUs very impressive.
Now Alder Lake has my attention, with a totally new CPU architecture. Here's one source for that info - I try to avoid stuff that hypes the product too much in advance. It actually tempts me to upgrade from Coffee Lake, and that generation ain't too long in the tooth, not yet anyway.
https://en.wikipedia.org/wiki/Alder_Lake_(microprocessor)
Current primary:
Core i9-10980XE
ROG Rampage VI Extreme
64GB 3600MHz
Dual 3090 (Suprim X)
4TB Evo 980 Pro
ROG Thor 1200 Platinum x2
Independent custom loops, one 360mm rad per unit (CPU, 2x GPU)
I was hoping the 11th gen might be a bit better, say if they brought out some huge processor like the 10th gen's XE.
You wouldn't need a K model unless you plan to put high-quality liquid cooling on it and OC it well beyond 5.2 GHz or so.
But usually the K (or KF) model is the 10700 people refer to, so I presumed that was the one you meant.
Surprised you went with such slow RAM and such mediocre cooling, tbh, if you are going to that extreme.