Some games are seriously lagging on Xeon E5-2690v2
It seems to happen during map-segment preloading. Can it be because of my cheap SATA SSD, PCI-E 3.0, or because AMD GPUs (mine is an RX 6600 XT) are screwed up for such games? Should I change my platform if CS2, which I play most of the time, runs fine and my PC is more than OK for work tasks?
Showing 76-90 of 265 comments
David is Back Dec 22, 2023 @ 5:51am 
Originally posted by Illusion of Progress:
Yes, they might be decent for those tasks, but your thread was opened asking about gaming performance, so that's why people answered with the CPU being slow as their response.
Makes sense. Though I don't get the meaning of the "might have needed its own new motherboard" part. This socket is discontinued, so there are no new motherboards for it, if I'm not mistaken. And my motherboard is at least kind of "high end" for that time. Or did you mean the server motherboard?
Originally posted by David Is Back:
But still, I ran CPU Queen on a Core i5-12400F (it's only 1-2 years old), and it scored about 36000-38000 points, while my Xeon gets around 98000-100000 points. And it's still bad despite this difference?
"Bad" is relative here.

All a multi-core synthetic test does is run a particular piece of code and return a number that increases with how many times it completes in a given time span. That's it. More cores will boost this score. Faster cores will boost this score.

That's only reflective of tasks (and not even all of them) that parallelize "infinitely".

Not all software works on that basis. Real-time software, like games, doesn't.

This was answered by myself and a few others. Games care about per core performance more than core count.

So "bad" is relative, but yes, that score difference means nothing here. The single core score would be a bit more reflective (but even that can't be taken in a vacuum because core count does matter if it's too low).
Last edited by Illusion of Progress; Dec 22, 2023 @ 5:55am
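The argument above can be sketched with a toy Amdahl's-law model (all numbers are made up for illustration, not measurements): two CPUs with similar multi-core scores can have very different frame times if one has faster individual cores.

```python
def synthetic_score(cores: int, per_core_speed: float) -> float:
    """Embarrassingly parallel benchmark: score simply scales with core count."""
    return cores * per_core_speed

def frame_time(per_core_speed: float, parallel_fraction: float = 0.3,
               worker_cores: int = 8) -> float:
    """Amdahl-style model of one game frame: most of the work is serial,
    stuck on the main thread, so per-core speed dominates."""
    serial = (1 - parallel_fraction) / per_core_speed
    parallel = parallel_fraction / (per_core_speed * worker_cores)
    return serial + parallel

# Hypothetical 10-core chip with slow cores vs. 6-core chip with fast cores:
many_slow = synthetic_score(10, 100)    # 1000 "points"
few_fast = synthetic_score(6, 170)      # 1020 "points" -- about the same score
print(many_slow, few_fast)
print(frame_time(100), frame_time(170)) # the faster cores finish frames sooner
```

Under this (deliberately simplified) model, the chip with fewer but faster cores posts nearly the same multi-core score yet finishes each frame sooner, which is the whole point about per-core performance.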
plat Dec 22, 2023 @ 6:33am 
But a synthetic benchmark should correlate with real-world observations, at least to some extent. Otherwise, the watchdogs out there would deem them useless instead of releasing new versions to keep up with the changing times.

You can run Geekbench too if you want. If it's in line with the Cinebench findings, all the better. I guess the point is to not take anything in a vacuum but to correlate it with other findings and observations.
r.linder Dec 22, 2023 @ 8:08am 
Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
newer Ryzens absolutely demolish those chips while using as little as 50W.
Lol, 50 W, really? You must be kidding or misunderstanding something. The Ryzen 5950X consumes at least twice as much power, for example.

Also, Intel managed to take first place again with their 10th-generation Core i3: it was much cheaper and a little faster than Ryzen.
7800X3D rarely exceeds 80W and averages 50W while gaming but beats the 14700K. Even the 7950X3D only averages 50~70W in games depending on whether the CPU is set to focus on cache or frequency respectively, and that's a 16-core 32-thread chip.

The i3-10100 wasn't that fast; AMD already had faster chips with more cores, so it was only relevant if you were fine with buying a quad core. Most people weren't, and AMD's offerings weren't that expensive.


Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
https://www.cpubenchmark.net/compare/4811vs2057/AMD-Ryzen-5-5600-vs-Intel-Xeon-E5-2690-v2
https://www.cpubenchmark.net/compare/2057vs4814/Intel-Xeon-E5-2690-v2-vs-AMD-Ryzen-7-5700X
Seems reasonable. But at least my overclocked Xeon outperforms a Ryzen 7 2700X with HT (dual-channel DDR4-2933, 16-20-21-49 CR1) in the CPU Queen test. That sounds cool, because that's one of the second-generation Ryzens, though not a high-end CPU.

And the Ryzen 5600 only launched on 4/4/2022, so why are we even discussing it?
Because people are still buying it for its price-to-performance value; it's actually worth it, unlike these old Xeons, as made evident by the fact that you're here because of performance issues with a 10-year-old Xeon that wasn't even intended for gaming to begin with. When the 5600 released is irrelevant.
Last edited by r.linder; Dec 22, 2023 @ 8:23am
Originally posted by plat:
But a synthetic benchmark test should correlate with real world observations, at least to some extent.
Yes, with "some extent" being the very important operative term here.

A multi-core score isn't going to reflect relative performance except in software that loads the CPU the same exact way the synthetic did.

Even "single core" scores fall under this, and it's become a bigger inaccuracy in the last few years due to V-Cache, yet there are still a lot of people who want to think certain scores are an infallible measure of all types of single-core performance. For example, some of them might hammer the FPU or the integer units hard, but the whole workload fits in a tiny cache. Great, so what happens when something is cache (memory) bound? That score just lost its meaning as a reflective measure of performance for those types of scenarios.

It doesn't mean synthetics are useless. Very far from it. It just means you should take a hefty amount of salt with anything that tries to boil performance down to a score or two. You trade off accuracy the more you boil it down.
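The cache-bound point can be illustrated with a toy experiment (a sketch, not a benchmark; exact timings depend on the machine, and Python's interpreter overhead damps the effect): the same arithmetic typically gets slower when the access pattern defeats the cache and prefetcher, which is exactly what a compute-heavy synthetic score never measures.

```python
import random
import time

def churn(data, indices):
    """Sum the same elements; only the visit order differs."""
    total = 0
    for i in indices:
        total += data[i]
    return total

n = 1_000_000
data = list(range(n))
sequential = list(range(n))   # cache/prefetch-friendly order
shuffled = sequential[:]
random.shuffle(shuffled)      # cache-hostile order, identical work

t0 = time.perf_counter()
s1 = churn(data, sequential)
t1 = time.perf_counter()
s2 = churn(data, shuffled)
t2 = time.perf_counter()

# Same result, usually different runtime: a score taken on the friendly
# pattern says little about the memory-bound case.
print(f"sequential: {t1 - t0:.3f}s  shuffled: {t2 - t1:.3f}s  equal: {s1 == s2}")
```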
David is Back Dec 22, 2023 @ 10:22am 
Originally posted by 尺.し工几ᗪヨ尺:
7800X3D rarely exceeds 80W and averages 50W while gaming but beats the 14700K. Even the 7950X3D only averages 50~70W in games depending on whether the CPU is set to focus on cache or frequency respectively, and that's a 16-core 32-thread chip.
But they run really hot (all Ryzens, not only this model) and need serious undervolting to run under air cooling, according to customer reviews on our local DNS store (there are many such reviews, really; I looked through these models today).

Originally posted by 尺.し工几ᗪヨ尺:
Because people are still buying it for its price-to-performance value; it's actually worth it, unlike these old Xeons, as made evident by the fact that you're here because of performance issues with a 10-year-old Xeon that wasn't even intended for gaming to begin with. When the 5600 released is irrelevant.
No, it is relevant. This model just didn't exist in autumn 2018, when I first had the idea to try out a used CPU like a Xeon or a Threadripper from one of those benchmarks in social network groups. So let's take the CPUs that were on the market in those years :)
Ralf Dec 22, 2023 @ 10:53am 
Originally posted by _I_:
DDR3-3200 CL13 is faster than any current DDR5
https://pcpartpicker.com/products/memory/#ff=ddr3&sort=-fwl&b=ddr3,ddr4,ddr5&F=6000000,8000000
I'm not seeing any DDR4 or DDR5 under 8 ns
Even my 15-year-old DDR2-1366 CL5 at 7 ns is faster than that fancy DDR5 stuff
r.linder Dec 22, 2023 @ 10:56am 
Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
7800X3D rarely exceeds 80W and averages 50W while gaming but beats the 14700K. Even the 7950X3D only averages 50~70W in games depending on whether the CPU is set to focus on cache or frequency respectively, and that's a 16-core 32-thread chip.
But they run really hot (all Ryzens, not only this model) and need serious undervolting to run under air cooling, according to customer reviews on our local DNS store (there are many such reviews, really; I looked through these models today).
It's not that extreme; most dual-tower air coolers and decent AIOs handle them just fine, and the lower-end chips run fine with their stock coolers.

Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
Because people are still buying it for its price-to-performance value; it's actually worth it, unlike these old Xeons, as made evident by the fact that you're here because of performance issues with a 10-year-old Xeon that wasn't even intended for gaming to begin with. When the 5600 released is irrelevant.
No, it is relevant. This model just didn't exist in autumn 2018, when I first had the idea to try out a used CPU like a Xeon or a Threadripper from one of those benchmarks in social network groups. So let's take the CPUs that were on the market in those years :)
So five years ago, which is pretty much the point where many users start considering a hardware change. You should too, and not just to a V3 or V4 Xeon, because that's a waste of time and money. Whatever performance you can get out of those, you can get considerably more performance with considerably less power from a Ryzen 9, and still noticeably less power (up to 100 W or more at max load) from a current i9 while getting 24 cores.

These Xeons may be cheap to get, but they're not cheap to run: your E5-2690 v2 uses up to around 420 watts at max load and around 80 watts at IDLE. Some V3s and V4s can use over 530 watts at max load and over 100 watts at idle; they're using as much power as an RTX 4090, for crying out loud. There were already better options back in 2018, like the 2700X, which performs better in games and comes pretty close in multi-core (not slower enough that anyone would care, considering the power consumption difference), while also averaging around 50~60 W in gaming and using less than a third of the power your Xeon uses at max load.

https://www.cpubenchmark.net/compare/2780vs3238/Intel-Xeon-E5-2690-v4-vs-AMD-Ryzen-7-2700X

https://www.cpubenchmark.net/compare/2780vs3824/Intel-Xeon-E5-2690-v4-vs-Intel-i9-10850K
Even a used 10th-gen i9 would be a sizeable improvement without reducing core count, with a massive reduction in overall power consumption; even at full load it only uses about half as much energy as the Xeon. There's more to cost than just the price tag: you're getting less-than-ideal performance while paying up the ass in running costs, and if you're not paying dollars for that, someone else is.
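The running-cost point can be put into numbers with a quick sketch (the electricity price, daily hours, and the "about half the draw" figure for the i9 are all assumptions for illustration, not measurements):

```python
def yearly_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Electricity cost per year for a given average power draw."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

# Hypothetical usage: 4 h/day at full load, $0.15/kWh
xeon_system = yearly_cost(420, 4, 0.15)   # load draw cited from the review above
i9_system = yearly_cost(210, 4, 0.15)     # assumed ~half, per the claim above
print(f"Xeon: ${xeon_system:.2f}/yr  i9: ${i9_system:.2f}/yr  "
      f"saved: ${xeon_system - i9_system:.2f}/yr")
```

Even under these modest assumptions the draw difference compounds into a real yearly sum, which is the "more to cost than the price tag" argument.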
David is Back Dec 22, 2023 @ 11:32am 
Originally posted by 尺.し工几ᗪヨ尺:
your E5-2690 v2 uses up to around 420 watts at max load
Any proof of this? I see a TDP of 130 W on the official page, and I guess that's the maximum under load. Otherwise I would have problems, because my power supply delivers just 750 W :)

Originally posted by 尺.し工几ᗪヨ尺:
Even a used 10th gen i9 would be a sizeable improvement
Maybe, but they are more expensive too.
David is Back Dec 22, 2023 @ 11:38am 
Also, I have seen a test of an overclocked Xeon 1680 v2 (it's Sandy Bridge-EP, one generation back). It was overclocked from the stock 3.0 GHz to 4.3 GHz (though under multi-core load it dropped to 3.7 GHz, its official turbo boost frequency), and power consumption in games was just 130-135 W. That's an 8-core, 16-thread CPU. And you're telling me that my 2690 v2 at 3.6 GHz can reach 420 watts? :)
r.linder Dec 22, 2023 @ 12:06pm 
Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
your E5-2690 v2 uses up to around 420 watts at max load
Any proof of this? I see a TDP of 130 W on the official page, and I guess that's the maximum under load. Otherwise I would have problems, because my power supply delivers just 750 W :)

Originally posted by 尺.し工几ᗪヨ尺:
Even a used 10th gen i9 would be a sizeable improvement
Maybe, but they are more expensive too.
TDP for Intel is not maximum power usage; that's not how it works. It's basically the power usage when all cores are running at the base clock. Look up the power consumption for your E5-2690 v2.

https://www.tomshardware.com/reviews/intel-xeon-e5-2600-v4-broadwell-ep,4514-8.html

Compared to the newer Broadwell-EP Xeons it was more efficient, but it's nowhere near as efficient as modern deca-core CPUs.
Last edited by r.linder; Dec 22, 2023 @ 12:09pm
r.linder Dec 22, 2023 @ 12:07pm 
Originally posted by David Is Back:
Also, I have seen a test of an overclocked Xeon 1680 v2 (it's Sandy Bridge-EP, one generation back). It was overclocked from the stock 3.0 GHz to 4.3 GHz (though under multi-core load it dropped to 3.7 GHz, its official turbo boost frequency), and power consumption in games was just 130-135 W. That's an 8-core, 16-thread CPU. And you're telling me that my 2690 v2 at 3.6 GHz can reach 420 watts? :)
Completely different SKUs, and cores don't scale the same in power consumption. It's not 1+1 mathematics.
Last edited by r.linder; Dec 22, 2023 @ 12:10pm
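The "it's not 1+1 mathematics" point comes from how CPU power scales: dynamic power goes roughly as C·V²·f, and the voltage needed rises with frequency, so power grows much faster than the clock. A rough sketch (all voltages and wattages here are hypothetical, chosen only to show the shape of the curve):

```python
def dynamic_power(p_base: float, f_base: float, v_base: float,
                  f: float, v: float) -> float:
    """Rough CMOS dynamic-power model: P scales linearly with frequency
    and with the square of voltage."""
    return p_base * (f / f_base) * (v / v_base) ** 2

stock = dynamic_power(130, 3.0, 1.00, 3.0, 1.00)  # 130 W at stock clocks
oc = dynamic_power(130, 3.0, 1.00, 3.6, 1.25)     # +20% clock, +25% voltage
print(f"stock: {stock:.0f} W  overclocked: {oc:.0f} W")
```

A 20% clock bump with the extra voltage it demands nearly doubles the power in this model, which is why one chip's overclock results say little about another SKU's draw.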
David is Back Dec 22, 2023 @ 1:29pm 
Originally posted by 尺.し工几ᗪヨ尺:
Completely different SKUs, and cores don't scale the same in power consumption. It's not 1+1 mathematics.
You still don't get the point. Each new generation runs cooler than the previous one because of the improved process node, so Ivy Bridge-EP can't run hotter than Sandy Bridge-EP, which showed 130 W when overclocked.
xSOSxHawkens Dec 22, 2023 @ 1:36pm 
Originally posted by David Is Back:
Originally posted by 尺.し工几ᗪヨ尺:
Completely different SKUs, and cores don't scale the same in power consumption. It's not 1+1 mathematics.
You still don't get the point. Each new generation runs cooler than the previous one because of the improved process node, so Ivy Bridge-EP can't run hotter than Sandy Bridge-EP, which showed 130 W when overclocked.
Why are you choosing intentional ignorance and then fighting over it?

He literally linked you to professional data on it, here I will link it a second time:

https://www.tomshardware.com/reviews/intel-xeon-e5-2600-v4-broadwell-ep,4514-8.html

Scroll a third of the way down, check the bar graph for your clearly listed E5-2690 v2, and witness for yourself the professionally validated power draw of the chip under actual load and at idle, which for the record is:

83 W idle and 426 W load.

Please quit wasting people's time here arguing that your chip is the best while blatantly ignoring publicly validated information across multiple posts.
David is Back Dec 22, 2023 @ 2:24pm 
Originally posted by xSOSxHawkens:
83 W idle and 426 W load.

Please quit wasting people's time here arguing that your chip is the best while blatantly ignoring publicly validated information across multiple posts.
It can't be 426 W under load. What software was used for the measurement? Are you sure it's correct? Again, an overclocked E5-1680 can't consume four times less power.
Last edited by David is Back; Dec 22, 2023 @ 4:04pm

Date Posted: Dec 19, 2023 @ 7:22am
Posts: 265