Put a solid OC on it and they can compare to 4th gen i7s, or even some of the lesser TDP-limited 7th gen units.
They are slow, but definitely still usable workhorses for 60fps gaming, as long as you are not wanting high refresh, have a good OC on the chip, and have it backed by at least 16GB of fast, tight-timing DDR3.
If running stock with some crap-timing 1333 RAM, then it's going to feel like a decade-old chip for sure.
I just passed my 2700K @ 4.6GHz on to my dad, paired with 16GB of 1600/CL9 and a 6500 XT, as a nice basic gaming rig for him to play RDR2 and such on since he moved out of state. He had a blast on it on his last visit. Still good chips for the right uses, and a huge step up from the older MacBook he was running, haha.
Maybe better put as an example: I have an HTPC with a Core 2 Quad Q9550. I had a 2500K in my primary PC until two years ago, and I have a 3700X in it now. By age, the former two shouldn't feel too different, and the newer one should be where the greatest difference is felt. Yet the 2500K felt much, much, much closer to the 3700X than it did to the Q9550. The difference between the Q9550 and 2500K was surprising, and the Q9550 wasn't exactly a slow CPU at the time either. Of course, some of that could be a retroactive effect from the aging of the oldest one. When I went from my E8600 to my 2500K, I didn't feel like they were "that different" at first either. Once things NEED more than the 2500K, it will feel not so different from the Q9550, I imagine. But it goes to show the 2500K was far from insufficient for me up until I retired it (and if you're wondering why I did, it was a combination of reasons, but mostly I needed more RAM).
I mean, the IPC gains Intel had going from 1st Gen Core (Nehalem and Westmere) to Sandy Bridge were enormous, and compared to the Core 2 Quad line, even higher. AMD's Phenom II, and later Bulldozer/Piledriver, were probably in between the Core 2 Quads and 1st Gen Core. They didn't have any meaningful gains until Ryzen, which was still behind Intel until Zen 2/Zen 3.
Sure, AMD had 6-core procs in the Phenom line, but Intel also had 6-core processors back then. Even AMD's 8-core processors in the FX line were not true 8 cores. I think what I am trying to get at is that people often have this notion that Intel didn't improve anything or have any meaningful gains in IPC or advances in architecture for the longest time, but I think Sandy Bridge was so good that even having just little increases in IPC each gen was good enough. And they didn't just have quad cores: if you wanted more cores with their architecture, you could get their X-series procs. Mad expensive, though.
Again, I think I am just trying to point out that people are always criticizing Intel and thinking they stayed stagnant for so long with no improvements, but really it was AMD that went stagnant and didn't have any improvements for nearly a decade. I mean, when Ryzen came out, they were just catching up to Intel. Sure, the IPC improvements from each generation of Ryzen have been substantial, but it still has only been a few generations. At least they finally lit a fire under Intel's ass and got them to start thinking and innovating again.
But I still think that Intel has done a good job from gen to gen with their Core processors. They might not have had incredible gains from gen to gen, but they have had a new generation of processors every year, sometimes twice in one year, and each new generation has higher IPC than the last.
Even now people criticize 11th Gen, and even 10th Gen, Intel processors and always act like AMD had this huge performance lead with Zen 3. But when I look at benchmarks, especially in gaming, 11th Gen and even 10th Gen are right there, neck and neck with Zen 3. Power consumption is much higher, but the performance is there. And then there is Alder Lake.
Anyway, I don't even remember the point I was trying to make, lol, but now I think my point is that we are finally set up for an epic showdown in the mainstream CPU space come this fall. AMD is releasing Zen 4 and Intel Raptor Lake, both vying for the performance crown and your hard-earned dollars. Should be quite a showdown, and something we definitely weren't seeing a decade ago. This is how it should be. Really, you can't go wrong whether you pick AMD or Intel.
I've seen vids of a GTX 1080 + 2600K running CBP 2077 at 'Low' through 'Ultra', with the latter setting showing a GPU bottleneck. At 'Low', the framerate looks to be around 60fps to over 70fps. With my i7 3960X + Vega 64 rig, CBP 2077 is quite playable, and on my main rig (R9 3900X, 32GB RAM, RX 6900 XT) I can run the game at full bore without RT at 3840x1080.
Fine for games around 10 years old, but for newer games that need faster cores it will struggle.
I'm well aware of AMD's disappointments in the late 2000s and early to mid 2010s. I was singing them myself, so to speak (no, really, I almost considered AMD not an option around those time periods, and even the Phenoms were rather disappointing to me).
This entirely depends.
It only becomes coping if you acknowledge something isn't good enough but need to turn around and say it's good enough to reduce the feeling that it's not good enough.
But if you find something is good enough, then it's not coping.
I know. And that is why I said that I wasn't saying that you were criticizing or bashing Intel, but your post just made me think of how I have heard some people criticize Intel for not having bigger IPC or architectural gains during that time, or for not really increasing core counts on their mainstream processors. Quite frankly, they didn't need to. However, while writing my post I kind of lost where I was going or what point I was trying to make.
And frankly, it was irrelevant, especially in regards to the OP, so I shouldn't even have written it. But to bring it back into context, sort of, lol: in 2011 I actually passed up getting an i7 2600K and bought an AMD Phenom II 1100T instead. It just came down to price. I already had a motherboard that supported an 1100T, and I was able to get one for $200. I was looking to build a PC around a GTX 570 with a CPU powerful enough not to bottleneck it. I really wanted a Sandy Bridge PC because it was all the rage at the time, but it would have cost me around $350 for the CPU and probably a good $200 or more for a motherboard, so almost $600, while I was able to get an 1100T for $200. So I just did that. Sure, I wouldn't really compare a 2600K and an 1100T, but the Phenom II 1100T served me well and didn't bottleneck the GTX 570.
In 2013, however, I upgraded to a GTX 780 and this time went with Intel, because AMD was not really a viable option at that time. I did have an AM3+ motherboard that could have housed an FX 8150, or perhaps an 8350, but I really didn't consider those much of an upgrade over an 1100T. So I went with an i7 4770K and an Asus Z87 Maximus Hero motherboard.
That processor served me well for 8 years, and I let my father use my Phenom 1100T PC. In fact, that CPU served him well for all that time as well, and now he is actually using the 4770K since I built a new PC last year, lol.
Anyway, lol, I don't really know how relevant any of this is, but it does go to show that older CPUs still have their place and can be good enough for the needs of specific people. My father thinks the 4770K paired with a 1070 Ti is blazing fast and couldn't care less that the CPU is nearly a decade old. In fact, he thought the 6-core 1100T I paired with my old GTX 780 was adequate; he had no need to upgrade before I let him use my old PC. The 1100T is still in use right now as a backup and business PC.
Back to what I said in my last post, though: I am definitely excited to see the epic battle that AMD and Intel will be in this fall. However, I am probably going to wait for 14th Gen Meteor Lake before I upgrade platforms again.
CPUs have gotten a bit more exciting again lately, I agree, but I still wonder about the longer-term trends here. I feel like GPUs are slowing down, hitting much the same limits CPUs did, and becoming less exciting, mostly because while they are still getting faster, the actual growth at given price points (even accounting for inflation) has slowed, and power draw, heat, noise, and size are going up.
GPUs at least have the advantage of adding more "cores" and seeing a default gain, unlike CPUs, which basically have to increase architecture or clock speed to get that.