It was a bad choice back when you bought it in 2022. It's clear that you don't understand how hardware works, and must have gone "Oh look! A 10 core CPU for dirt cheap! That'll run anything for years to come!" without doing any research before buying it. This happens all the time; don't blame yourself. Now you understand that it wasn't a smart decision.
Cores don't help much in gaming, as others before me have said. That's why a 10th gen 4 core i3 outperforms your Xeon considerably while sipping power. It's all about fast single-threaded performance.
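The "cores don't help much" point is essentially Amdahl's law: if a large fraction of each frame's work has to run on one thread, adding cores gains very little, while a faster single thread speeds up everything. A minimal sketch; the 60% serial fraction is a made-up illustrative number, not a measurement of any real game:

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Upper bound on speedup with `cores` cores when `serial_fraction`
    of the work must run on a single thread (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical game loop where 60% of frame time is single-threaded:
serial = 0.6
for cores in (4, 10):
    print(f"{cores} cores -> at most {amdahl_speedup(serial, cores):.2f}x")
```

Under that assumption, going from 4 to 10 cores buys roughly 1.43x vs 1.56x at best, while a 40% faster single thread speeds up the serial part directly, which is why the high-clock i3 can win.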
Still, there is some limit to how old a CPU can be and still run it, hah. I guess we need at least 4 cores and at least a 2.8-3.0 GHz base clock to run it somewhat fine, no?
Yes, that's true. Fortunately, I don't work with officially secret information at home (or anywhere else).
Thanks. That makes sense to me.
Maybe that's foolish, but I feel uncomfortable seeing such high temps (above 75-79°C). Even if it won't damage my CPU to the point where it becomes unusable (and I know about throttling protection), still. Also, in summer it gets too hot in my room when I'm gaming or converting game videos, and I don't have an air conditioner in my apartment.
You are saying stupid things again. I bought that Xeon in January 2022, and it was definitely 4.5 years old when I decided which CPU I would buy (not the exact model, but that it would be a used Xeon), which was in September 2017. Well, even if I'm wrong and it was September 2018, that does not make a huge difference. As A&A has already written above, there were no great improvements between the 4th and 7th generations (only minor ones). And what is more important, the game in question that I was complaining about started being developed in 2015 and came out in summer or autumn of 2017. And it's a single-player game, so it never really got any updates (in case you wish to say that I did not play it when it was released but played it this year).
And no, I don't own an i5 11400F; I was renting it on a time basis from a company.
Didn't you say two paragraphs earlier that it is normal to have higher temps in modern CPUs under full load? If there is more heat, then, by the laws of physics, the consumption is also higher (because all the consumed electric power is eventually converted into heat).
If you take the server/workstation segment (don't ask me why, just for fun), we'll find that models with the same number of cores as mine, but a newer process and architecture (Xeon W-2235 or W-2245), cost as much as a new Core i7 14700, even though they were first released in 2017. Surprise!
Remember that I wanted a cheap solution? So here it is, my choice.
Another option is to get a Xeon E3-1680v6, but it has only 4 cores and 8 threads, and even though its cores have a higher Turbo Boost frequency (up to 4.2 GHz) and a slightly higher base frequency (3.6 GHz) than mine... That's still awful, in my opinion. You need at least 6 cores nowadays for gaming and work, and 8 or more is even better.
But somehow I understand now what you are trying to say, yes. That both things can be true.
Chances are that the person bought a bundle with a Chinese X79 motherboard, DDR3 RAM, and this old 10 core Xeon for a low price on something like AliExpress, and simply doesn't understand how CPU utilization actually works.
Happens all the time with those on a very tight budget who want something capable enough to handle games.
That's why I don't want to buy some really old Ryzen now: I'll exchange in-game stutters for in-game stutters.
And if we look at the 7700 and 7700K that were available back then: I know one streamer whom I love very much and watch often. He bought a 7700K or 8700K and was very unhappy with it (he could not achieve the desired speed/quality balance with encoding on streams, and the CPU ran very hot), so 6 months later he sold it and bought a Ryzen 5950X, and he is still using it after 6 years.
And that's why I'm not interested in 7700/8700 Core i7 CPUs either.
As for 10th generation and newer - I don't have anything against it.
But I can: I've been playing it for the last three days. And I really don't see annoying stutters or anything now (especially in PvE mode; in PvP it depends, you know, because of networking).
But yes, every game is unique: the game in question has a first-person view and its graphics are not cartoon-like. That's why it is heavier (trees, leaves, grass, realistic urban environment).
Q3 2013. Not hard to find this information. It's 11 years old, stop trolling.
Yep, you are totally right here, that's the case. I just multiplied core frequency by core count and found that I would get something like a 2.5x increase in performance. In fact, things turned out to be not so lucky (because of the lower core frequency, especially in all-core operation), but in tasks like video conversion I actually got even more than a 2x increase (or slightly lower, like 1.8x-1.9x, I don't remember).
The problem is that my old CPU was overclocked to 4.1 GHz and dropped only to 4.0 GHz in all-core mode, while my Xeon 2690v2 runs at 3.4 GHz with FSB overclock and drops to 3.0 GHz under load when more than one core is active. So that is the real pain and the main disadvantage.
And if you say "never compare CPU performance like that, you fool", I will say that it's fine *here*, in the case of a similar process node and architecture from the same vendor. Because I compared a Sandy Bridge i5 with an Ivy Bridge Xeon (2012 and 2013 releases), that method of calculation is justified in this situation.
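The estimate described above (all-core clock times core count, defensible only for closely related architectures from the same vendor, as noted) can be written out as a quick back-of-the-envelope check, using the all-core clocks quoted in the post:

```python
def naive_throughput(cores: int, all_core_ghz: float) -> float:
    """Crude multi-threaded throughput proxy: cores x sustained clock.
    Only meaningful when comparing CPUs with near-identical IPC
    (here: a Sandy Bridge i5 vs an Ivy Bridge Xeon)."""
    return cores * all_core_ghz

old_i5 = naive_throughput(4, 4.0)   # overclocked i5, ~4.0 GHz all-core
xeon   = naive_throughput(10, 3.0)  # Xeon 2690v2, ~3.0 GHz all-core
print(f"expected gain: {xeon / old_i5:.2f}x")
```

Interestingly, these particular all-core numbers give about 1.9x, which matches the 1.8x-1.9x observed in video conversion better than the 2.5x first estimated (that figure presumably came from using higher, lightly loaded clocks).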
I bring this back because, according to you, the CPU is new. But even if we accept that unsupported claim, it still does not perform like a new CPU.
But what about 8th gen to 14th gen Intel Core CPUs, do they have improvements? Yes.
1. Very expensive even when used, sometimes more than desktop i5 and i7 CPUs
2. Too many cores for my tasks and too low base frequencies (like 1.6-2.6 GHz)
3. In fact, of the first three Xeon generations, only the second one had models that suited me in terms of base and Turbo Boost frequency, which is why I chose a CPU from it
So, you see the point: I had the idea of buying a Xeon and building a new platform around it, and even though the actual purchase was postponed by several years, nothing changed in terms of affordable server CPU models (except that 1st and 2nd generation Xeons became 3 times cheaper, which was even better for me).
That said, you’re the one saying stupid things by suggesting that everyone else here is wrong about your CPU not being 4 years old. Your Xeon released in 2013. I don’t give a crap how old you think it is: it’s upwards of 11 years old and uses an 11 year old architecture that lacks important instruction sets, has low single-core IPC, and doesn’t clock that high due to its core count for the time period. That’s why gaming performance isn’t all there.
So, like I said, if you have the 11400F, just use that as a gaming machine and use the Xeon for whatever it supposedly does faster, and stop wasting our time with your trolling antics. I literally linked Intel ARK earlier and you ignored it; it shows the launch date for that CPU. To Silicon Valley, it’s an 11 year old chip even if your unit was manufactured later than the first batch. That’s how it has always worked, because it doesn’t matter when it was made: it’s still an 11 year old design and architecture with the same specifications, regardless of whether it was made in 2013 or 2023.