But in a vacuum, the top tiers are faster on Intel's side (more complicated on AMD's with the X3D CPUs). You asked which was the best (I took this to mean performance), and I was answering it factually.
Sick of this "AMD is bad" nonsense. It's an easy fix, and most Ryzens need a manual SoC voltage change anyway. On the first generation, the SoC voltage on some boards was set too low at 0.9 V, which caused random BSoDs because the Auto setting wasn't working properly.
With new chipsets you see many weird things, so don't count out AMD just yet; Intel has been stagnant with their i-series CPUs for quite some time (just ask their investors).
Personally, I am more concerned about the motherboard manufacturers.
The SoC voltage supplies parts of the chip besides the cores themselves, the memory controller being one of the prime ones, and possibly other components on the I/O die. I believe it is roughly analogous to the system agent (VCCSA) voltage on Intel.
But I don't think anyone said anything about the SoC voltage issue on AM5, or about AMD being bad, in this thread. OP was specifically asking about Intel. And yes, the AM5 SoC issue was rather overblown, and sometimes still is, because people didn't want to educate themselves on it. It was amusing how a particular couple of people on this forum were even calling the CPUs flawed and in need of complete recalls, simply because they apparently should have been able to take 1.4V+ to begin with like Intel supposedly can. Yet 1.4V+ on the VCCSA on some of Intel's CPUs is also ill advised, and has also been known to be enough to degrade its IMC over time.
Those 13th gen CPUs don't offer anything meaningful; all they give you is more E-cores, which I think is bizarre and a waste.
Intel's 15th gen Arrow Lake, which will be on a 3 nm process, and Zen 5. Both are expected to have huge IPC uplifts compared to the current generation and this year's refreshes.
Only if you don't mind 1% lows that are lower by a decent margin, and you forget to overclock the Intel chip.
A 360 AIO will cool a 13900K just fine, but the 14900K is due very soon, and it's basically going to be a 13900KS.
Since it's CS2, anything mid-range and up will reach your goal; a 12700K/F can be had very cheap (as low as $200!) and is probably the best-value fast chip.
My 13900K barely touches the mid-70°C range, but it is delidded with a custom copper IHS in a custom loop.
Changing to a Z790 and a 13700K brings the combo price up $150 to $549.
Of course they also run a ton of combo deals on AMD stuff, but Intel seems to be making a hard push on value with the 12th generation processors in particular right now, even if they aren't the fastest. I guess there might be a lot of them in stock, and with the 13th generation out, the 14th generation coming soon (and then a new platform after that), Intel seems to want to move them.
Meanwhile, 13900Ks can run at 5.8 to 6 GHz and make use of 8000 MT/s RAM with decent timings.
Out of the box with no tinkering, the AMD offerings are very good, especially the 5800X3D. But they do suffer from the "AMDip" and in most cases have notably worse lows, at least when compared to properly set-up systems. Alas, the majority of tech channels and sites can't tune a system to save their reputation.
I don't trust ANY YouTubers, and neither should any of you; it's the worst advice.
They are all blatantly paid for their PR. These guys do their YT stuff only for money, and the same goes for Twitch, Insta and all the other UNNECESSARY "social" brainwashing platforms that spread fake stuff for money, dressed up in high-quality production. Your eyes are tricked, and so your brain and mind are manipulated.
And then there is the random lucky one who becomes known and earns a pseudo-reputation, nonsense fame, and even more money by selling planned and controlled stuff, er, ads, er, "information". Many try their "career" there; some are lucky and become known because they are simply "upvoted" by their investors. People ALWAYS fall for this trick, and in an instant a streamer has 10k viewers, followers, or whatever.
I do not support this RIGGED BS at all.
Any YT links posted count as intrusive spam, an attempt to brainwash people. I NEVER click on any YT links; I treat them like spam mail: deleted and ignored. A plague!!
YouTube is the worst source of information. It brainwashes people, spreading misleading information, plausible fakes and attempts to sell stuff; I cannot repeat this enough! It is nothing but an advertiser selling products, nothing less than a marketing campaign on a giant brainwashing scale, fronted by baby-faced models. ANY ad, without exception, is brainwashing, especially for kids.
I would trust the daily news on TV more.
The best source of knowledge is to gather intel from many different sources and compare the stuff YOURSELF, to reach an average but satisfying result and reveal the real truth. You gain experience YOURSELF. The same goes for news.
Don't blindly listen to others; that is the cancer of society. It is how people are kept controlled and stupid, made even more stupid, so their money can be leeched for someone else's profit. Hundreds of years ago there were books to spread BS; today there are "social" platforms. Even the name "social platform" is brainwashing. Man, put in some effort yourself!!
It is like this: in our country there are smartphone ads only for iPhone, Samsung and even Xiaomi. I don't buy those, partly on principle and from experience. There is nothing about other brands, so people forget them. What about SONY, for example, or HTC? Nothing about them for many years now: no TV ads, no ads or (bundle) offers online or in newspapers. Pure controlling, brainwashing marketing. YOU need to seek them out and compare them YOURSELF. SONY is high quality, not the kind of "mainstream" cheap BS you have to replace every second year or repair every month.
Streaming games, music and media, as well as guides for various stuff on these platforms, is another story.
Good luck.
Here is a high-level comparison of two CPUs:
https://www.cpubenchmark.net/compare/5022vs5299/Intel-Core-i9-13900K-vs-AMD-Ryzen-7-7800X3D
The most important attribute of a CPU is single-thread speed: how fast one CPU core can execute a thread of code. It differs by application, and programs aren't single threads of code, but it's a good indicator.
Single Thread Rating 4667 versus 3767
That's a massive difference, and it's why the 13th gen are currently the best CPUs. Starfield CPU hierarchy:
https://i.imgur.com/4hKogzU.png
The chart does show fps scores, which I said aren't the best indicator of cpu capability. So you also need to look at the detailed stats.
Here is the chart again with the 12900K and 13700K. They look like the same thing, but look at the TDP and CPU rating. CPU performance also depends on the generation, as shown in the Starfield slide; 13th gen is more advanced than 12th gen.
Here are the detailed scores of the 13900k and 7800X3D -
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i9-13900K&id=5022
https://www.cpubenchmark.net/cpu.php?cpu=AMD+Ryzen+7+7800X3D&id=5299
Floating point math - 154 Gops versus 62. That's over twice the throughput.
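In case anyone wants to sanity-check those gaps, here is a minimal sketch in Python using only the snapshot figures quoted from the Passmark pages above (the dictionaries and variable names are mine, and the numbers drift as more samples get submitted):

```python
# Quick ratio check on the Passmark figures quoted above
# (snapshot values only; they change as more samples come in).
single_thread = {"i9-13900K": 4667, "Ryzen 7 7800X3D": 3767}
fp_gops = {"i9-13900K": 154, "Ryzen 7 7800X3D": 62}

st_ratio = single_thread["i9-13900K"] / single_thread["Ryzen 7 7800X3D"]
fp_ratio = fp_gops["i9-13900K"] / fp_gops["Ryzen 7 7800X3D"]

print(f"Single-thread rating: {st_ratio:.2f}x ({(st_ratio - 1) * 100:.0f}% higher)")
print(f"Floating point throughput: {fp_ratio:.2f}x")
# Roughly 1.24x (~24% higher) single-thread and ~2.48x floating point.
```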
Another thing to remember is that GPU performance does not scale linearly, and monitor resolution impacts the hierarchy. This means that 1080p CPU benchmarks using 4090s can be slightly misleading if you aren't planning to use a 4090 at 1080p. It's a very minor point, but when looking at GPUs, always evaluate at the resolution you intend to play at; it can be important for mid-range GPUs. And look at ancillary features like DLSS, FSR etc. So, going back to where I started, there is more to it than just fps.
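To make the resolution point concrete, here is a toy sketch of the usual bottleneck reasoning (entirely made-up numbers, purely my own illustration): the frame rate you actually see is roughly capped by whichever of the CPU or the GPU is slower at your settings.

```python
# Toy bottleneck model: displayed fps is roughly the lower of the
# CPU-bound cap and the GPU-bound cap at a given resolution.
# All numbers are invented purely for illustration.
cpu_fps_cap = {"CPU A": 250, "CPU B": 200}
gpu_fps_cap = {"RTX 4090 @ 1080p": 400, "mid-range GPU @ 1440p": 140}

for cpu, cpu_cap in cpu_fps_cap.items():
    for gpu, gpu_cap in gpu_fps_cap.items():
        effective = min(cpu_cap, gpu_cap)  # crude, ignores frame-time variance
        print(f"{cpu} + {gpu}: ~{effective} fps")
# With the 4090 at 1080p the two CPUs separate (250 vs 200 fps); with the
# mid-range GPU at 1440p both land at ~140 fps, so the gap from the
# 1080p/4090 benchmark chart mostly disappears in practice.
```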
Whatever Passmark is measuring doesn't gain anything on the X3D CPUs in that particular measurement, whereas the X3D CPUs demonstrably perform better than their non-X3D counterparts in real-world game results.
Go look up the 5800X versus the 5800X3D, or the 7700X versus the 7800X3D.
https://www.cpubenchmark.net/compare/5299vs5036/AMD-Ryzen-7-7800X3D-vs-AMD-Ryzen-7-7700X
https://www.cpubenchmark.net/compare/4823vs3869/AMD-Ryzen-7-5800X3D-vs-AMD-Ryzen-7-5800X
Why are the X3D versions slightly worse? Because that single measurement doesn't gain anything from cache, and the X3D versions have a slightly lower clock speed than their non-X3D counterparts.
Yet in games, this is demonstrably not how things stack up. At all. The 5800X3D will actually have 1% lows as high as the average of the 5800X (an extreme case, but it shows how far apart they can be). The X3Ds have a wide range of performance, and that's what makes these "single core" comparisons really muddy when they're involved. Something doesn't benefit from the cache? The X3D will perform just below the baseline of the generation it is in. Something does benefit from the cache? The X3D might perform a full generation or even two higher. This is why the 5800X3D is actually placed above the Ryzen 7000/Intel 12th generation in games on average by outlets that run them through game test suites and average the results.
In other words, what you're looking at with that particular measurement is literally a worst-case scenario for the X3D CPUs. Likewise, Starfield seemingly doesn't benefit from them either, hence they're not matching Intel in that particular game. Unless you have multiple things to test and average out a placement, you won't get an accurate reflection of where the X3Ds stand. Any single benchmark, game result or hypothetical spec is going to be either a worst-case or a best-case scenario for them, and thus dishonest. Problem is, few outlets have constructed large enough suites to do this (Tom's Hardware [www.tomshardware.com] is one that has).
That's why any single spec in a vacuum can't be taken as a measurement. Usually I'd agree with you on a single-core measurement being an indicator (but only an indicator at best), but that goes out the window with the X3D CPUs in particular, because most things that measure them either benefit from the cache or they don't, so you're only ever going to get a worst-case or a best-case scenario. Passmark's single-threaded rating doesn't gain from the cache, so you're seeing a worst-case scenario for it. And Passmark knows all this themselves; it's why they now also have a gaming hierarchy that is separate from their "single core" ordering.
https://www.cpubenchmark.net/top-gaming-cpus.html
That being said, the X3Ds rank highly there because the results are averaged out. They can have much higher highs, but also lower lows. So if you want something more consistent, yes, the 13th generation would be "above" the 7000 series X3Ds; but on average, that ordering typically doesn't hold.
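As a rough illustration of why averaging over a whole suite matters so much for the X3D chips, here is a small sketch with entirely hypothetical per-game numbers of my own (not from any outlet), comparing best case, worst case, and the suite average:

```python
from statistics import geometric_mean

# Hypothetical fps relative to a same-generation non-X3D chip (= 1.00).
# Cache-heavy titles reward the X3D; cache-light titles slightly punish it
# because of the lower clocks. Values are invented for illustration only.
x3d_relative_fps = {
    "cache-heavy sim": 1.35,
    "cache-heavy MMO": 1.25,
    "esports title": 1.10,
    "cache-light shooter": 0.95,
    "cache-light RPG": 0.96,
}

print(f"Best case : {max(x3d_relative_fps.values()):.2f}x")
print(f"Worst case: {min(x3d_relative_fps.values()):.2f}x")
print(f"Suite mean: {geometric_mean(x3d_relative_fps.values()):.2f}x")
# Any single game lands anywhere between ~0.95x and ~1.35x, while the
# averaged suite result (~1.11x here) is what a gaming hierarchy reflects.
```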
Because I'm a platform manager in a global telco/media/paytv/cloud company, rated as one of the top 10 IT practitioners worldwide. Cpubenchmark's data is consistent with benchmarking data provided by probably the best global IT consultancy, which costs a bomb to get access to.
..... and you're not?
So please post here whatever global telco/media/paytv/cloud company BS results you've collected.
I also work at NASA, and I found that cpubenchmark results are not consistent with the benchmarking data provided by the top NASA aerospace engineer, which costs a bomb to get access to.
13700K on the Intel side, 7800X3D on the AMD side. For CS2, Intel seems to win; maybe it will change after the beta? I wonder why they don't use the cache more. Personally, I went for the 7800X3D.
That's going to be my rig, plus a 500 Hz monitor.