"It should've stayed that way"?
Out of curiosity, how hard is it for you to deal with reality? Because this sounds like the sort of mindset that would explain Userbenchmark.
Companies are not obligated to be what you want them to be.
You're also either very young or very new to tech (or just have a failing memory). The relationship you refer to between Intel and AMD in the early 2010s was not how it always was. AMD was outperforming Intel at times in the 1990s and 2000s as well.
Pentium 4 2.2GHz <<< Athlon XP 1800+
Against a 14900K, anything above like a 12700K at a higher res is pretty much a draw. Come back to reality.
Until GPUs catch up, all upper-end CPUs are a draw.
And keep your eye on the 285K; it may surprise you.
That's what AMD fanboys liked to think at the time, and it was only because AMD liked to release their new products JUST before Intel (with both effectively hitting the shelves the same month), so they could use that 3-day release gap to test against Intel's old products, plus they cherry-picked the games to bend the truth too.
No, ever since the first AMD chips (as I recall, AMD was in a way split off from Intel, and the first AMD and Intel CPUs shared the same socket), from those first AMD CPUs all the way up to the last Pentium 4s, Intel was the performance and stability king.
Granted, Intel did stick a bit too long with single-core CPUs in the 00s, and with those Athlon 64 CPUs AMD did get a lead due to Intel's stubbornness about going multicore. But that lead ended as soon as Intel released the Core 2 Duo...
And then Intel remained in the lead until the launch of the Ryzens,
which ended a decades-long good thing.
People were saying the same thing about Bulldozer against Sandy Bridge too. "It doesn't matter because nobody plays at 720p with a GTX 580," they said.
Yeah, that's how you measure the difference between CPUs. You don't measure them in cases where the GPU is the bottleneck. That doesn't give you the REAL difference.
Now if you don't think the fastest CPU is needed right now because it'll be "wasted" since the GPU is usually the bottleneck and will mask that difference, then sure, that's a reasonable opinion. Settling for a slower CPU is fine. But telling yourself it's not slower to cope just because your GPU is masking the difference doesn't change the fact that it is, in fact, that much slower.
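To make the "masking" point concrete, here's a toy model (purely illustrative, every number made up): treat each frame as taking roughly the longer of the CPU's and the GPU's per-frame work, so a heavy GPU cost hides the CPU gap and a light one exposes it.

```python
# Toy bottleneck model: a frame can't complete faster than its slowest
# stage, so frame_time ~ max(cpu_ms, gpu_ms). Numbers are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective FPS when the CPU and GPU stages overlap in a pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_ms, slow_cpu_ms = 4.0, 6.0    # a "50% faster" CPU vs a slower one

gpu_4k_ms = 12.0                       # heavy GPU load (e.g. 4K, maxed out)
print(fps(fast_cpu_ms, gpu_4k_ms))     # ~83.3 FPS
print(fps(slow_cpu_ms, gpu_4k_ms))     # ~83.3 FPS -> the CPUs look identical

gpu_720p_ms = 2.0                      # light GPU load (e.g. 720p testing)
print(fps(fast_cpu_ms, gpu_720p_ms))   # 250.0 FPS
print(fps(slow_cpu_ms, gpu_720p_ms))   # ~166.7 FPS -> the real gap appears
```

Under this model, low-resolution testing is the only way to see the number that will matter once a faster GPU arrives.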
Also, CPU limited games exist.
Also, the added CPU performance that is masked today will show up tomorrow, because your CPU lasts longer. Most people actually don't replace the platform/CPU often, so having a stronger one that lasts through two or three GPUs is nice. If you're buying more frequently, then yeah, settling for the value brand (Intel) is just fine.
Also, GRLDS ("Gradual Raptor Lake Death Syndrome") continues to exist. People in Silent Hill 2 and other Unreal Engine games complain about crashing and, surprise surprise, it's almost always a Raptor Lake CPU. We're going to be seeing this for YEARS as people have those CPUs failing and blame the games. I love the people who told themselves this was "just a small issue". Your own fellow gamers are out there having a poor experience without knowing that their CPU is causing their issues (this should have been a recall), and instead of admitting it's the problem that it is, you'd rather defend a billion-dollar corporation to preserve your feelings?
Oh, it surprised me all right.
Well, to be fair, if the GPU is the bottleneck, then the CPU does not matter.
But even without it being the bottleneck: if one CPU gives better 1% lows and a better 50th percentile, I call that the better CPU for gaming, even if it maybe has 1 or 2 FPS less on average (see the sketch below for how those figures come out of a frame-time log).
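For what it's worth, here is a minimal sketch of how average FPS and "1% lows" are usually computed from a per-frame time log. The log layout and all numbers are assumed for illustration, not tied to any specific capture tool, and tools differ slightly in how they define the 1% low.

```python
# Compute average FPS and "1% low" FPS from per-frame times in ms.
# "1% low" here = average FPS over the slowest 1% of frames, one common
# definition (capture tools vary in the exact formula they use).

def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # slowest 1% of frames
    one_pct_low_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, one_pct_low_fps

# Hypothetical log: smooth 10 ms frames with occasional 40 ms stutters.
log = [10.0] * 990 + [40.0] * 10
avg, low = fps_stats(log)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")  # avg ~97.1, 1% low 25.0
```

Two CPUs can tie on the average while one has far better 1% lows, and the lows are the stutter you actually feel.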
I mean, even when you pair a very old GPU with a new CPU, where it already has a CPU 10x more powerful than it needs, you sometimes see a 1 or 2 FPS increase in games. Not because it was bottlenecked, but because there is likely SOME part of that game still aided by a more modern CPU.
For both AMD and Intel, future-proofing is essentially dead anyway.
I mean, in the past, if you bought the best CPU, say a 4790K, and paired it with the best GPU, say a 980 Ti, then it could later handle a 1080 Ti, a 2080 Ti, a Titan RTX, and perhaps even a 3090 Ti with no issues
(though the latter would be a little bottlenecked).
Point made: you could do an upgrade by just swapping the GPU.
But now the best CPU (7800X3D) and the best GPU (4090) are basically both maxed out, meaning a 7800X3D won't give you that headroom:
7800X3D + 4090
7800X3D + 5090
and
9800X3D + 4090
all run basically equal.
Only if you upgrade BOTH to the latest model do you see actually significant gains.
=====================
Of course, not everybody buys a PC that high-end. But even at the lower end, would you pair the most expensive CPU with a low-end or mid-range GPU? That makes no sense budget-allocation-wise. You're more likely to buy something like a 7600X instead, or even a 5600X, so you have more budget for your GPU and better bang for buck today, at the expense of not being able to swap only your GPU and having to swap out both in the future.
In such a market, for those who buy not the highest-end 4090 or 5090 but a more mid-range model, Intel CPUs can work just fine to support that GPU, and might at times even be the cheaper option (though of course AMD still supports sockets longer, so upgradability will be cheaper on AMD).
Yes, Intel IS behind, which is why it makes no sense to buy Intel today. But it is not as far behind as FX once was behind Intel.
Anybody buying an FX in 2016 was basically insane; even Intel Celerons outclassed it, and even the cheapest i3 outclassed the best FX.
But while Intel is behind, uses more power, etc., it is not THAT far behind. It's more the Radeon of the CPU market now. If you really, really want to, it can still do its job in the mid-range and low end; only you pay with a much higher electricity bill too, which is why that doesn't work in nations where power draw is a factor.
Of course, the recent durability issues... well, those have not helped.
It is like people buying Apple products.
I call them insane: you overspend many times over for what can be had so much cheaper and better.
But they are not inherently bad products, just bad for the price point you get them at.
And sure, Intel always had bad bang-for-buck pricing, but that was defensible when they were also better... which they no longer are.
But...
"CPU A and CPU B deliver similar enough performance most of the time since GPU X slows both of them down"
...Does not equal...
"CPU A isn't slower than CPU B"
They are different things, and yet certain people are telling themselves those are the same statement to cope, probably because they are too insecure to admit they are settling for a slower CPU because it doesn't matter much.
(This, again, ignores that CPU-limited games exist, and that a faster CPU will last longer before becoming the bottleneck and needing replacement, so it actually does matter, even if it's not a massive frame-rate spread today.)
But this all ignores something important. Anyone who's spent the past several years buying Core i9s on repeat is already demonstrating that CPU performance does matter to them. Because if it didn't, they would either be buying a cheaper CPU or upgrading the higher-end one less often. Their actions don't match their words.
It sounds a bit similar to the rants about how the RTX 4090 has a 450 W TDP compared to the GTX 1080 Ti's 250 W, and how expensive the cards and the electricity bills have become, while the guy buys the video card anyway because he has the money and is too lazy to downclock and/or undervolt, always finding an excuse in performance or something else.
Really... what does the consumer want, and why do they inflict this on themselves? I don't know and don't get it :D
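As a back-of-the-envelope check on the electricity-bill side of that rant, the sketch below prices out the 450 W vs 250 W TDP gap. The hours per day and the price per kWh are assumptions, and real draw sits below TDP most of the time, so treat it as an upper bound under those assumptions.

```python
# Upper-bound yearly cost of the 450 W vs 250 W TDP gap mentioned above.
# HOURS_PER_DAY and PRICE_PER_KWH are assumptions, not measured data.

TDP_DELTA_W = 450 - 250       # extra draw at full load, worst case
HOURS_PER_DAY = 3             # assumed daily time at full GPU load
PRICE_PER_KWH = 0.30          # assumed electricity rate; varies by country

extra_kwh_per_year = TDP_DELTA_W / 1000 * HOURS_PER_DAY * 365
extra_cost = extra_kwh_per_year * PRICE_PER_KWH
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, ~{extra_cost:.0f} "
      "currency units/year at the assumed rate")   # ~219 kWh, ~66/year
```

Whether that delta is trivial or painful depends entirely on local rates, which is the point about power-sensitive markets made earlier in the thread.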
If you have to lie and bend reality to claim nobody does thing "x" so that "y" doesn't matter, then you don't really have a point.
For one thing, people do buy high-end machines and play at 1080p because they prefer fluidity over resolution. There's also the issue of AAA games netting ♥♥♥♥ performance at launch, and having GPU headroom mitigates this in some situations.
Second, twirling around waving a wand and screaming "4K!" doesn't make Intel's poor CPU performance disappear.
Plenty of games can be, and in fact are, still CPU bound even at 4K. Tarkov is legendary for its dependence on CPU performance and CPU cache regardless of resolution.
Even at 4K, just about every esports or wannabe-esports game is likely to be CPU bound.
And as upscalers get better and games get more complex, you'll be seeing MORE CPU-bound scenarios pop up.
And your comment about the 285K is weird and cringe. Like, surprise us how? Stupid low scores from bugs and odd power behavior (likely due to Intel trying to obfuscate power draw) aside, it performs exactly how Intel introduced it to us, which is WORSE than the previous benchmarks for the older products (benchmarks which are now invalid anyway, as post-"fix" numbers are lower).
Not sure why so many awkwardly try to recreate reality and claim the company that is behind is somehow magically ahead.
Like, when you guys saw the 14900K get 3 more FPS at 1080p on a 600 FPS average, it was all "It's the best and I only buy the very best"; now it's somehow "the best doesn't really matter, so instead of buying a lower-end part from the better company I'll buy from the losing one... for more money...".