Then you got a bad board, not a bad design. Things like that happen. It also happens with Asus and ASRock and others (I have 3 dead Asus high-end boards in the house right now). Bad luck on your part.
You can claim they have bad VRMs all you want, but on the high-end boards their lineup has just as good if not better VRMs. Yes, at launch about 3 models of X570 sucked; aside from that they have a pretty decent track record historically. They are not always the coolest, but they *do* often offer some of the best-in-class OC potential, with their boards regularly outpacing Asus and others.
You can claim they suck, but I will take my MEG X570 Unify over any Asus board any day. Same for my Z97 MPower (another world-record OC'er) or my Z77A-GD65, all of which serve me wonderfully (and all of which run quite cool, might I add). The two Intel ones are great OC'ers too, pushing i7s to 4.8 daily / 5.0 bench (Z97) and 4.6 daily (Z77). I haven't pushed the X570 yet, as I'm on the Wraith and don't need more performance yet.
I have had too many Asus products go belly up to ever trust them again with a major purchase. Not one, many. They're the only major tech vendor that has had multiple failures on me over the course of the past 20 years. Every one of the major tech companies has failed me at least once, and a few twice, in that time, but Asus... every one of their products has eventually died (except my second-hand GTX 670 that's still kickin'...)
Also, specific to this last bit. What a joke.
Aside from MSI...
Gigabyte makes fine boards which also OC quite well.
ASRock and Asus *also* make a fair number of crappy boards (and some good ones).
lol
I'm literally sitting here with a Gigabyte Aorus that claims '4400+ MHz RAM OC' on the box, but won't even OC the RAM to 3300 MHz, as my OP already explains. The same RAM does 4000 MHz on Asus boards.
That's nowhere near fine. I doubt you have any experience with enthusiast-level products or overclocking in general; you probably couldn't even afford it.
I specifically stated 'Asus STRIX or better' and 'ASRock Fatal1ty' as they are the tried and proven boards for overclocking; I didn't state that they don't make lower-end products. Gigabyte Aorus doesn't overclock anything, and MSI Godlikes blow up in under 6 months.
A £450 motherboard from MSI or Gigabyte is the same as a £150 motherboard from Asus. Neither MSI nor Gigabyte even knows how to make something of the same tier as Asus Strix and upwards or ASRock Fatal1ty.
It was a widespread problem that at least 20% of their motherboards were having, and everyone was raging on their forums (same boot error codes across all their 'high end' motherboards), and MSI did nothing to investigate or explain the issue. Stop defending ♥♥♥♥ tier products.
Sounds like you should have gotten RAM known to work with your board. Asus has kits it doesn't like too. I have used multiple Gigabyte boards over the years across multiple generations of DDR standards, and all of them have run any RAM kit used, including high-performance low-latency stuff, with the one single exception being a no-name kit from China. That's just cheap RAM for you. You might not have had cheap RAM, but that doesn't mean the kit was going to play nice with your board. You should have read up on your board and bought a kit known to work well. As you should with any board. From any maker.
Also, lolz on the 'no experience with enthusiast parts'... FFS man, I run a 3900X in my main, and the motherboards in my top two rigs hold multiple (and more) overclocking records than anything you are using from Asus... But you keep talking smack there, bud. I will enjoy the fact that I sold *off* of Intel X299 to jump to X570, and revel in the *fact* that my buddy who bought the Asus X299 Prime from me, then spent $900 on a second-hand 12c i9, delidded it, put liquid metal on it, and OC'd it under a high-end AIO loop, STILL can't beat my stock 3900X in any CPU-bound benchmark, single- or multithreaded... lolz
People are saying it b/c it's now true. Perhaps you haven't paid attention in the past couple of weeks, but here, let me sum it up a bit for you...
Ryzen 3000 / Zen 2+ has been shown in *multiple* tests to be about equal to Intel parts at gaming in a *clock-for-clock* comparison, while trashing Intel in anything else besides gaming...
The *only* way Intel has held a lead in games is b/c of their high core speeds, and yes, in certain specific build cases, mainly in the lower-to-mid-range gamer builds, they still hold some marginal dominance with the likes of the 9700K... Barely. In most cases by sub-10%, while lagging in non-game areas by more.
But on the mobile front AMD now matches and beats Intel in every way, including games, with the R7/R9 4000-series chips. And production or real work is even more of a thrashing for Intel...
https://www.youtube.com/watch?v=ooz7ozw-lpo
https://www.youtube.com/watch?v=fs55aPczQos
In lower-end desktops, the new R3 3300X is such a beast that at stock it beats the 7700K and anything before it, and with good overclocking, thanks to the better clock-for-clock performance, it can match the 9700K in games despite being a chip with half the cores.
https://www.youtube.com/watch?v=5JE0JPeahK4
https://www.youtube.com/watch?v=Sq0OHhRQwA8
https://www.youtube.com/watch?v=vD8Yk7JrBL8
And on the high end AMD trashes Intel in all ways that would be important to someone wanting 10+ cores, even if you lose 3-5 FPS in games.
And in the *ultra* high end (read 28-21c minimum) AMD will hands down give you more for less at every turn, even up to and including Epyc vs Xeon.
https://www.youtube.com/watch?v=JwETLRMandY
(Remember, from the Hardware Unboxed testing above we know that clock for clock, watt for watt, and core for core the AMD offers more in every productivity suite app aside from PDF editing... So their gains are not just in video editing but in most real-work-style loads.)
And even in that one specific place mentioned above where Intel holds a specific lead in games (the lower-end i7 / upper-end i5), there are *plenty* of equally priced AMD options that will offer substantial benefit in non-gaming loads at a minor penalty to gaming, something many people would rather have than a weaker overall system for the benefit of games.
So yeah, that's why... Intel only has purpose in one small segment of the consumer space, and even there it is hotly contested whether it's worth sacrificing 10% in games for 20%+ in other areas...
Sounds like you should learn how to read. I already had the RAM; I required a board that works with it.
If you don't even know what Micron E-die is, which clearly you don't, then stop trying to act like you know anything. It's the second most overclockable DDR4 IC after Samsung B-die, on any motherboard.
Also, the reason Gigabyte boards don't overclock RAM has nothing to do with the RAM itself. They use 'T-Topology', which is an optimization for when all 4 RAM slots are filled, but this severely limits any high-frequency overclocking with just 2 sticks installed. Oddly, I have a board with only 2 RAM slots, but hardware monitoring software still detects it as having 4 DIMMs, because they never changed the hardware to be optimized for only 2 slots and it still features T-Topology. Asus OptiMem II, on the other hand, is fully focused on achieving maximum frequencies with one DIMM per channel, which is exactly what overclocking requires, and this is something that neither Gigabyte nor MSI has even the slightest clue how to achieve.
Gigabyte's T-Topology is known to severely hinder RAM overclocking, period.
Also, the rest of your post is all about mobile and not desktop chips, and your overt need to appeal to authority with all those videos shows that you yourself don't know anything. Literally every actual benchmark on the Internet shows, universally across the vast majority of games, that the Intel 9700K and up are still better. Using 'clock for clock' isn't a valid excuse when AMD chips cannot even match Intel's clocks in the first place; you actually have to underclock an Intel CPU to achieve that.
Keep being jealous that you can't afford a fast CPU, RAM or anything else, and that you have absolutely not the slightest clue what you keep on talking about. Feel free to carry on buying ♥♥♥♥ tier MSI or Gigabyte boards that cannot overclock anything, or blow up within 6 months of trying.
Stick to your AMD, because you don't know how to overclock or how to achieve overall greater performance from tweaking everything, not that you even could in the first place with a rubbish motherboard as a base.
I post vids because, unlike you, I'm not a troll and I back my claims up.
I posted vids about desktop parts too; you just chose to ignore them.
I posted vids about server parts too; you just chose to ignore them.
I clearly stated I use high-end parts, and each and every part I have from every generation is high end for its respective time, going back to my Pentium MMX build, yet you ignore that.
I personally run Samsung B-die in my secondary rig and Micron on my main rig.
You keep on thinking a 3900X w/ 3600MHz CL16 RAM on X570
OR
a 4790K w/ 2400MHz CL10 on Z97
are low end. Sure, the i7 is old, but both of those builds represent high end for their respective times, and one of them is in the time of NOW. lol
I'm calling it. You're a troll. Go home under your bridge or bring some real arguments, because most of your data is out of date, bud.
Intel is *solidly* in their FX era...
You look about as silly now as AMD fans did then when they said FX was some amazing i7 killer. I like AMD, I even like FX and still think it's usable, but it is not great and never was. I can admit the flaws that AMD had in their FX line, both then and now...
You seem to be unable to do the same now that Intel is on the bottom taking it so hard... lol
It's time to back yourself up, or pack it up. You talk trash, but don't source anything, then discount sources from others. Where are yours? You make such a big deal about your high-end parts and experience; let's see it. Where's your HWBot profile with your rigs and benches, bud? Where's that CPU-Z link to that 10980XE you are rocking already, since you are *so* high end?... Where are those CPU-Z validations on those multiple previous i9s?...
do you even bench bro?
lolol
I might not have access to $10K rigs. Never claimed I did. But I do have a rig that most on here would class as decently high end. Where's yours? What do *you* run that is sooo amazing?... Cuz even if it bests mine in one way or another, there are always bigger fish in the sea than you ;)
Oh, and some actual advice... If you *know* you will be running two sticks... And if you *know* you will want to run them at speeds that are stupid with no benefit outside of e-peen...
Maybe buy a board made for that to begin with, instead of ones made with 4-stick RAM configs in mind...
Many people (myself included) like to run 4 sticks, either immediately or in the end as EOL upgrades. To me, having a board that is stable with 4 sticks at good OC rates (for DDR4, to me, that's about 3600/3800; anything higher is a waste of time and money) is far more important than a board that runs 2 sticks at speeds that don't matter... Gigabyte boards are not bad just because they target a different stability and performance factor. To many consumers, maxing capacity at good speeds > maxing speeds at lower capacity. Both are equally high end. I would argue that 4x8GB 3600 CL16 is far more useful than 2x8GB 4000 CL16, especially on Ryzen where you hit a 2:1 IF ratio past 3600MHz (rough math on that below). Same in a 4x4GB 3600 vs 2x4GB 4000 comparison. Even on Intel, though, gains are marginal at best, if there are any gains at all.
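(If that 2:1 point isn't clear, here's the rough arithmetic I mean. Just a sketch, assuming the commonly quoted ~1800 MHz coupled-mode ceiling for Zen 2; the exact limit varies chip to chip.)

```python
# Rough sketch of the 2:1 ratio point above (Ryzen 3000 / Zen 2).
# The 1800 MHz ceiling is the commonly quoted figure, not measured here.
COUPLED_LIMIT = 1800  # MHz, ~max clock most Zen 2 memory controllers hold in 1:1 mode

def zen2_mem_mode(ddr_rate):
    """Return (mclk, uclk, mode) for a DDR4 transfer rate in MT/s."""
    mclk = ddr_rate / 2  # real memory clock is half the DDR transfer rate
    if mclk <= COUPLED_LIMIT:
        return mclk, mclk, "1:1 coupled (best latency)"
    # Past the ceiling the memory controller drops to half speed (2:1),
    # and the extra latency usually wipes out the raw bandwidth gain.
    return mclk, mclk / 2, "2:1 decoupled (latency penalty)"

for rate in (3200, 3600, 4000, 4400):
    print(rate, zen2_mem_mode(rate))
# DDR4-3600 -> MCLK 1800 / UCLK 1800, 1:1
# DDR4-4000 -> MCLK 2000 / UCLK 1000, 2:1  <- why >3600 is mostly e-peen on Ryzen
```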
LMAO. Troll confirmed.
Also 4790K @ 5.0 Bench, 4.8 24/7, and locked 4.0 @ 1.0v undervolt. (MSI Z97)
Also 2700K @ 4.8 Bench, 4.6 24/7 on air (MSI Z77)
Also (for intel)
Q6700 @ 3.6 w/ 1866MHz FSB (air) (Gigabyte P45)
Pentium 4 3.0E @ 3.9GHz (Asus i865PE)
Pentium D 2.53 @ ~3.5 (same Asus board)
Not mine personally, but additional experience mentoring and helping those I know on both 6700k and 7700k.
Also plenty of AMD side OCing from about every generation, now including Ryzen (limited tweaking so far).
But no... I don't know how to OC at all...
https://steamcommunity.com/sharedfiles/filedetails/?id=1648465904
(old result, before the 2400 CL10 Samsung B-die kit)
Either way, we are back on track...
10900K is a joke...
https://wccftech.com/intel-core-i9-10900k-10-core-cpu-hot-power-hungry-at-stock-benchmarks-reveal/
https://www.tomshardware.com/news/intel-core-i9-10900k-stress-test
Nobody runs stress tests 24/7, you absolute clown. That's like trying to run FurMark 24/7 on a high-end GPU and then complaining when it dies a week later.
No CPU or GPU is ever made with the intent to be run permanently at 100% load.
Funny that you post nothing but OCing results for CPUs when my issue was purely about OCing my Micron E-die RAM, which I know for a fact Gigabyte motherboards featuring T-Topology cannot and will not ever do. I don't need to provide links when you can simply Google just how bad T-Topology boards are for RAM overclocks and read 100+ links and experiences confirming this. Gigabyte boards cannot even get any DDR4 RAM to 3800 MHz, yet they continue to falsely advertise that their products are DDR4 4400 MHz+ capable.
Wait a second, why exactly is running at 100% a bad thing?
Run Intel Burn Test for a whole month and tell me if your CPU survives.
Because NOTHING outside of stress tests will ever do this.
The actual point of more cores is to REDUCE the strain on each core. This is why 4-core processors become more efficient in mobile hardware than 2-core ones, but if you were to run them at 100% load, your phone would literally melt or blow up.
When I got it I had nearly 1TB of 4K raw footage backlogged. It spent nearly every hour of every day of its first week alive stressed to 100%. And I didn't stop the encodes just to game; I simply took 4c/8t away from HandBrake and jumped into a game, so it literally spent upwards of 5 days loaded.
After that, and while it was still cool here, I put both my CPU and GPU to work folding for COVID, though I have stopped now that warm days are here, as I don't need ~500W dumping into my front room 24/7 anymore.
Just had the comp crunch through ~14 hrs (CPU time, not raw footage) of 4K the night before last, and had it render out another 5 hrs' worth today.
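(Side note, since someone will ask how you "take cores away" from an encode mid-run: one way is plain OS-level CPU affinity. A minimal sketch, assuming Python with the psutil package is available; the process name and core list are only examples, not my exact setup.)

```python
# Minimal sketch: restrict a running encode (e.g. HandBrake) to a subset of
# logical CPUs so a game can have the rest. Assumes the psutil package.
import psutil

ENCODER_NAME = "HandBrake"           # example process name, adjust as needed
ENCODER_CORES = list(range(0, 16))   # example: logical CPUs 0-15 for the encode

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if ENCODER_NAME.lower() in name.lower():
        # Limit which logical CPUs the encoder may be scheduled on.
        proc.cpu_affinity(ENCODER_CORES)
        print(f"Pinned PID {proc.pid} to cores {ENCODER_CORES}")
```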
So yeah, some people *do* stress their rigs out like that for days on end. I see no reason not to be able to. If every system I have ever had can do that on air, why can't the 10900K?...
Heck, even my 3900X holds a ~4.0 boost under full loads on all 24t, and that's on the included stock air cooler... How far can you get with the included Intel cooler on that 10900K?... Can you even hold stock BASE speeds, much less boost?... Oh... wait... You didn't even get one. Better get out that wallet again ;)
What do you mean by efficiency? Also, I don't see how running a 2-core mobile CPU at 100% load would melt the phone but running a 4-core mobile CPU at 100% load wouldn't. Or do you think 4-core mobile CPUs never reach full load?
Funny how you want to do nothing but bash on motherboards and talk about RAM OC'ing in a thread that is about CPUs and the 10900K...
Kinda where you would *want* to be looking at things like OC results, thermals, performance/watt, etc...