Counter-Strike 2

CS2 is *still* broken on 12th and 13th Gen CPUs
TL;DR: If you have noticeable micro-stuttering on a 12th or 13th Gen Intel CPU, go into your BIOS and disable the E-cores. Maybe disable Hyper-Threading too, but that wasn't conclusive.


This post isn't formal or refined by any means, but I think there is a useful discussion to be had here. I've been scratching my head at this strange issue for as long as the game's been out. I've come to the conclusion that CS2 (or maybe rather its specific build of Source 2) has issues handling 12th and 13th Gen CPUs on certain machines.

I've tested this on two separate builds, one running a Gigabyte RTX 4080 and a 12900K, and a second running an RTX 4090 FE and a 13900KF. In both cases, average FPS was well over 400 in all game modes while uncapped at 2160p. The first machine has been active for about a year, while the second is a fresh Windows 11 install.

So, here is the problem:
Present on both machines was an intermittent "microstutter" during online gameplay. At first, I thought it was related to networking because it would be most apparent while killing or dying, but looking back at my footage that seems to be just an observational error. It was definitely more noticeable during online play, but local/workshop maps were not the overwhelming focus of my testing. Concentrating on frametimes, there were seemingly random spikes of 15-20 ms, sometimes resulting in serious visual skips, like enemies 'teleporting' around corners. As you can imagine, this is really annoying, especially on great hardware.

Luckily, I got a clue about what might be wrong from a tech forum and had the opportunity to test it on my machines. Supposedly, CS2 can't distinguish between Intel's P-cores and E-cores, so it may shift processing load randomly between them. And as fate would have it, by disabling the E-cores entirely on both rigs, frametimes evened out to a virtually rock-solid 2.9 ms over entire competitive matches. I also disabled Hyper-Threading, but that was toggled on and off while trying other titles like Warzone. All I know for certain is that HT off + E-cores off completely resolved this issue for me on the 12th and 13th Gen CPUs.

While this seems like a victory, it's really not. Having to disable part of your machine's functionality for a game that, if we're being honest with ourselves, is essentially still just a tech demo is not a very encouraging sign for players. I don't want to be too harsh here, since this could arguably be blamed on Intel, but there really needs to be a deeper dive into this performance oddity.
Showing 1-15 of 42 comments
Milos.exe 'OwO' Mar 8, 2024 @ 10:54pm 
Yep, I see the same problem. My PC constantly has this stuttering; it happens mostly at random, even when the FPS is above 140, and it's an AMD CPU, not Intel, so I think the problem is in CS2 itself. During most of my playtime, FPS will of course drop when looking at a smoke or molly, but nowhere near unbearably; when peeking and firing your weapon, though, it feels rubbish, over-compensated. Every time it happens, the console prints a line that says something like "clamped frame time exceeded 100 milliseconds...", but most of the time my frametime is about 10-30 ms, nowhere near that high. How the fck would Valve expect CS2 to be a competitive game when the engine is this unstable?

The game is broken as hell. Player models should be pre-rendered at all times, but they aren't until the last second before engaging, while most of the useless features in the map, or far away, are rendered all the time. For example, I even get server slow-frametime warnings while quietly holding B site when a flash or smoke pops in A. Another example: Inferno has a lot of out-of-bounds area that is rendered in full detail, too many places you can see through, even into the deep void below the map that shouldn't be rendered in the first place. It looks good, sounds good, but doesn't work.
"what you see is what you missed" :steammocking:
Mango Mar 8, 2024 @ 11:10pm 
my GeForce 5500 FX is still pulling strong.
Milos.exe 'OwO' Mar 8, 2024 @ 11:13pm 
Originally posted by Mango:
my GeForce 5500 FX is still pulling strong.
then I suspect your CPU should be jacked af, it goes to gym all the time, yes?
xD
𝓐мė𝗇𝗈 Mar 8, 2024 @ 11:40pm 
Originally posted by Professor Toothy:
TL;DR: If you have noticeable micro-stuttering on a 12th or 13th Gen Intel CPU, go into your BIOS and disable the E-cores. Maybe disable Hyper-Threading too, but that wasn't conclusive.


This post isn't formal or refined by any means, but I think there is a useful discussion to be had here. I've been scratching my head at this strange issue for as long as the game's been out. I've come to the conclusion that CS2 (or maybe rather its specific build of Source 2) has issues handling 12th and 13th Gen CPUs on certain machines.

I've tested this on two separate builds, one running a Gigabyte RTX 4080 and a 12900K, and a second running an RTX 4090 FE and a 13900KF. In both cases, average FPS was well over 400 in all game modes while uncapped at 2160p. The first machine has been active for about a year, while the second is a fresh Windows 11 install.

So, here is the problem:
Present on both machines was an intermittent "microstutter" during online gameplay. At first, I thought it was related to networking because it would be most apparent while killing or dying, but looking back at my footage that seems to be just an observational error. It was definitely more noticeable during online play, but local/workshop maps were not the overwhelming focus of my testing. Concentrating on frametimes, there were seemingly random spikes of 15-20 ms, sometimes resulting in serious visual skips, like enemies 'teleporting' around corners. As you can imagine, this is really annoying, especially on great hardware.

Luckily, I got a clue about what might be wrong from a tech forum and had the opportunity to test it on my machines. Supposedly, CS2 can't distinguish between Intel's P-cores and E-cores, so it may shift processing load randomly between them. And as fate would have it, by disabling the E-cores entirely on both rigs, frametimes evened out to a virtually rock-solid 2.9 ms over entire competitive matches. I also disabled Hyper-Threading, but that was toggled on and off while trying other titles like Warzone. All I know for certain is that HT off + E-cores off completely resolved this issue for me on the 12th and 13th Gen CPUs.

While this seems like a victory, it's really not. Having to disable part of your machine's functionality for a game that, if we're being honest with ourselves, is essentially still just a tech demo is not a very encouraging sign for players. I don't want to be too harsh here, since this could arguably be blamed on Intel, but there really needs to be a deeper dive into this performance oddity.
I've got 14th Gen and it's working perfectly fine; it's on your end.
G Mar 9, 2024 @ 12:30am 
LMS Mar 9, 2024 @ 9:23am 
Let me share what happens on my i9-13900K (keep in mind: E-cores OFF, HT OFF; the GPU is an RTX 4080). Using CapFrameX, I ran a 2-minute benchmark in Deathmatch on Dust 2 (full server). The average FPS is 502, so it looks very good at first glance. HOWEVER, looking at the 0.1% lows, my average FPS is just 120. This results in terrible inconsistencies in the frametimes (spikes at the millisecond level). It means the game looks like it is running perfectly fine but feels off somehow, though not enough to be 100% certain it's a serious issue.

Now, my friend runs a Ryzen 7 5800X3D (with an RTX 4080), and while his Dust 2 DM average FPS is "just" 405, his 0.1% lows average 207 FPS. That means his in-game experience and frametimes are MUCH more consistent (fewer spikes), and it is easier for him to play competitively.

What I am saying is that average FPS is about the worst metric you could possibly use for CS2; it says literally NOTHING about your real gaming experience, which depends primarily on the 0.1% lows, not global averages. So yeah, 13th Gen can definitely be considered highly problematic, considering FPS regularly drops from 500 to 120 without even appearing in the telemetry graph (because it happens in millisecond intervals).
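As an aside, here is a minimal sketch of how a "0.1% low" figure like this is derived from raw per-frame data. The frametime list is made up for illustration (real tools such as CapFrameX log actual per-frame times in milliseconds):

```python
# Sketch: "0.1% low" = average FPS over the slowest 0.1% of frames.
def low_fps(frametimes_ms, percent=0.1):
    """Average FPS over the worst `percent`% of frames."""
    worst = sorted(frametimes_ms, reverse=True)   # largest frametimes first
    n = max(1, int(len(worst) * percent / 100))   # how many frames that is
    avg_ms = sum(worst[:n]) / n                   # mean of the worst frames
    return 1000.0 / avg_ms                        # convert back to FPS

# 999 smooth 2 ms frames plus a single 8.33 ms spike (hypothetical data):
frames = [2.0] * 999 + [8.33]
print(round(low_fps(frames), 1))   # -> 120.0: the one spike sets the 0.1% low
```

This is how a ~500 FPS average can coexist with a 0.1% low near 120: a tiny handful of slow frames dominates the low-percentile metric while barely moving the average.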

But how big is the gap in the 0.1% low FPS between an i9-13900K and a Ryzen 7 5800X3D? Look at the following frametime calculations.

i9-13900K - average frametimes
- avg FPS: 1000/502 = 1.99 ms
- 0.1% low avg: 1000/120 = 8.33 ms

5800X3D - average frametimes
- avg FPS: 1000/405 = 2.47 ms
- 0.1% low avg: 1000/207 = 4.83 ms

Conclusion:
On the i9-13900K we get frametime spikes from 2 ms up to 8.3 ms, while on the 5800X3D it only spikes between 2.4 ms and 4.8 ms. Clearly, Intel 13th Gen has a problem.
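The arithmetic above is just the FPS-to-frametime conversion, frametime_ms = 1000 / fps. As a quick check in Python:

```python
# Convert an FPS figure to the corresponding frametime in milliseconds.
def frametime_ms(fps):
    return 1000.0 / fps

# The figures quoted above (average FPS, 0.1% low FPS):
for name, avg, low in [("i9-13900K", 502, 120), ("5800X3D", 405, 207)]:
    print(f"{name}: avg {frametime_ms(avg):.2f} ms, "
          f"0.1% low {frametime_ms(low):.2f} ms")
```

Rounded to two decimals, this reproduces the figures in the post (1000/405 comes out to 2.47 ms).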
Last edited by LMS; Mar 9, 2024 @ 9:24am
Professor Toothy Mar 9, 2024 @ 12:39pm 
Originally posted by 🅻🅼🅷🆂:
Let me share what happens on my i9-13900K (keep in mind: E-cores OFF, HT OFF; the GPU is an RTX 4080). Using CapFrameX, I ran a 2-minute benchmark in Deathmatch on Dust 2 (full server). The average FPS is 502, so it looks very good at first glance. HOWEVER, looking at the 0.1% lows, my average FPS is just 120. This results in terrible inconsistencies in the frametimes (spikes at the millisecond level). It means the game looks like it is running perfectly fine but feels off somehow, though not enough to be 100% certain it's a serious issue.

Now, my friend runs a Ryzen 7 5800X3D (with an RTX 4080), and while his Dust 2 DM average FPS is "just" 405, his 0.1% lows average 207 FPS. That means his in-game experience and frametimes are MUCH more consistent (fewer spikes), and it is easier for him to play competitively.

What I am saying is that average FPS is about the worst metric you could possibly use for CS2; it says literally NOTHING about your real gaming experience, which depends primarily on the 0.1% lows, not global averages. So yeah, 13th Gen can definitely be considered highly problematic, considering FPS regularly drops from 500 to 120 without even appearing in the telemetry graph (because it happens in millisecond intervals).

But how big is the gap in the 0.1% low FPS between an i9-13900K and a Ryzen 7 5800X3D? Look at the following frametime calculations.

i9-13900K - average frametimes
- avg FPS: 1000/502 = 1.99 ms
- 0.1% low avg: 1000/120 = 8.33 ms

5800X3D - average frametimes
- avg FPS: 1000/405 = 2.47 ms
- 0.1% low avg: 1000/207 = 4.83 ms

Conclusion:
On the i9-13900K we get frametime spikes from 2 ms up to 8.3 ms, while on the 5800X3D it only spikes between 2.4 ms and 4.8 ms. Clearly, Intel 13th Gen has a problem.

That's a good post, but I noticed the same on a 12900K. Supposedly 14th Gen doesn't have this issue.
g0x79 Mar 9, 2024 @ 12:48pm 
Found somewhere on the internet that the 4090 is not recommended to be paired with any Intel CPU (even the newest gen). I didn't know that.
They suggest last-gen AMD CPUs instead.
1000%
I have an i7 13700KF + RTX 4080

For me, disabling E-Cores makes the game worse.
Also, if I'm not mistaken, you can turn off the E-cores for the game using the -threads launch option, setting it to the number of P-cores only.

But for me this caused a lot of crashes. In the end, what I think worked out better was:

Leave all cores turned ON in the BIOS, disable C-states, and enable everything for best performance: memory XMP, HT, Resizable BAR, etc.

The thing I noticed improved micro-stuttering and raised the 1% low FPS the most was activating the Resizable BAR feature for cs2.exe via Nvidia Inspector.

After that, the game became much smoother, completely different.
Last edited by =PALMITu ☯∞☯; Mar 9, 2024 @ 1:12pm
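For anyone who wants to try the P-core-only idea without touching the BIOS, one software-side alternative is launching the game with a CPU affinity mask. This is a sketch under assumptions, not a tested fix: the core indices below are assumed (on a 13700K/13900K with HT on, logical CPUs 0-15 are typically the P-core threads), so verify your own topology in Task Manager first.

```python
# Build the hex affinity mask that Windows' `start /affinity` expects:
# one bit per logical CPU index the process is allowed to run on.
def affinity_mask(cpus):
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu    # set the bit for this logical CPU
    return mask

p_core_cpus = range(16)     # ASSUMED P-core thread indices; adjust to your CPU
print(f"{affinity_mask(p_core_cpus):X}")   # -> FFFF
# Then, from a Windows command prompt in the game's install folder:
#   start /affinity FFFF cs2.exe
```

Note that Steam titles started this way may relaunch themselves through Steam and lose the mask, so setting affinity after launch (Task Manager > Details > Set affinity) can be the more reliable route.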
Haze Mar 9, 2024 @ 1:09pm 
Originally posted by g0x79:
Found somewhere on the internet that the 4090 is not recommended to be paired with any Intel CPU (even the newest gen). I didn't know that.
They suggest last-gen AMD CPUs instead.
7800X3D & 4090 user here; the current CS2 build is indeed very, very nice. No more FPS decrease with smokes nowadays (and mostly none with water).
LMS Mar 9, 2024 @ 1:50pm 
Originally posted by g0x79:
Found somewhere on the internet that the 4090 is not recommended to be paired with any Intel CPU (even the newest gen). I didn't know that.
They suggest last-gen AMD CPUs instead.

This is not correct in general. The 13900K/14900K are perfectly capable of being paired with an RTX 4090; that is not the problem at all. The problem is the Intel Thread Director communicating poorly with Windows 11: it completely fails to distribute the load optimally across the P- and E-cores (an optimization issue). Which is sad, because in theory this could really be a good performance booster (if optimized properly). However, in its current state, when the CPU is under load (and CS2 is CPU-intensive), it fails to spread the load ideally across the available cores.

Also, keep in mind that this poor performance with the i9s does NOT happen in all games. Unfortunately, for now, CS2 in particular suffers from very poor 0.1% lows. In most other games the i9s perform really well, in fact on par with or better than the 7800X3D. So the engine design of Source 2 seems to be additionally bad for the ideal core/thread loading of the i9s (12th/13th Gen at least).

Last edited by LMS; Mar 9, 2024 @ 2:03pm
LMS Mar 9, 2024 @ 2:13pm 
Originally posted by ☯ =PALMITu ☯:
1000%
I have an i7 13700KF + RTX 4080

For me, disabling E-Cores makes the game worse.
Also, if I'm not mistaken, you can turn off the E-cores for the game using the -threads launch option, setting it to the number of P-cores only.

But for me this caused a lot of crashes. In the end, what I think worked out better was:

Leave all cores turned ON in the BIOS, disable C-states, and enable everything for best performance: memory XMP, HT, Resizable BAR, etc.

The thing I noticed improved micro-stuttering and raised the 1% low FPS the most was activating the Resizable BAR feature for cs2.exe via Nvidia Inspector.

After that, the game became much smoother, completely different.

I have my system with the i9-13900K on the following settings:
- XMP Enabled
- C-States Disabled
- Resizable Bar Enabled (BIOS + Profile Inspector)
- GPU in MSI mode
- Shader cache cleared (DirectX)
- NVCP 3D settings on Max and High Performance, AA and MSAA all turned off
- Windows Power Plan on Bitsum Highest Performance (Process Lasso)
- Background services reduced to only the necessary ones
- Windows 11 debloated
- Process Lasso to optimize gaming performance and prevent background tasks from running

And yet the 0.1% low average does not go above 125 FPS in Dust 2 DM. There is not much more I can do here. I also tested Hyper-Threading ON vs OFF, and tried running on ALL E-cores with -threads 9. Then I tried Process Lasso, assigning the E-cores to background tasks but disabling them for CS2. Nothing improved the result meaningfully.

The only thing that helped the 0.1% lows a bit was in fact a frame-rate cap at fps_max 400. This improves the lows slightly, because the FPS has less margin to drop and frametimes are a bit more stable, but even then it is a marginal gain, nothing meaningful.
Last edited by LMS; Mar 9, 2024 @ 2:17pm
LMS Mar 9, 2024 @ 2:31pm 
Originally posted by Professor Toothy:
TL;DR: If you have noticeable micro-stuttering on a 12th or 13th Gen Intel CPU, go into your BIOS and disable the E-cores. Maybe disable Hyper-Threading too, but that wasn't conclusive.


This post isn't formal or refined by any means, but I think there is a useful discussion to be had here. I've been scratching my head at this strange issue for as long as the game's been out. I've come to the conclusion that CS2 (or maybe rather its specific build of Source 2) has issues handling 12th and 13th Gen CPUs on certain machines.

I've tested this on two separate builds, one running a Gigabyte RTX 4080 and a 12900K, and a second running an RTX 4090 FE and a 13900KF. In both cases, average FPS was well over 400 in all game modes while uncapped at 2160p. The first machine has been active for about a year, while the second is a fresh Windows 11 install.

So, here is the problem:
Present on both machines was an intermittent "microstutter" during online gameplay. At first, I thought it was related to networking because it would be most apparent while killing or dying, but looking back at my footage that seems to be just an observational error. It was definitely more noticeable during online play, but local/workshop maps were not the overwhelming focus of my testing. Concentrating on frametimes, there were seemingly random spikes of 15-20 ms, sometimes resulting in serious visual skips, like enemies 'teleporting' around corners. As you can imagine, this is really annoying, especially on great hardware.

Luckily, I got a clue about what might be wrong from a tech forum and had the opportunity to test it on my machines. Supposedly, CS2 can't distinguish between Intel's P-cores and E-cores, so it may shift processing load randomly between them. And as fate would have it, by disabling the E-cores entirely on both rigs, frametimes evened out to a virtually rock-solid 2.9 ms over entire competitive matches. I also disabled Hyper-Threading, but that was toggled on and off while trying other titles like Warzone. All I know for certain is that HT off + E-cores off completely resolved this issue for me on the 12th and 13th Gen CPUs.

While this seems like a victory, it's really not. Having to disable part of your machine's functionality for a game that, if we're being honest with ourselves, is essentially still just a tech demo is not a very encouraging sign for players. I don't want to be too harsh here, since this could arguably be blamed on Intel, but there really needs to be a deeper dive into this performance oddity.

To be honest, I would say leaving HT ON in the BIOS actually improves the FPS lows a bit (E-cores OFF). I just re-tested it in Dust 2 DM, 2 minutes per configuration using CapFrameX, and repeated each run twice. Results (shader cache cleared before each run):

HT ON & E-Cores OFF
- 0.1% FPS avg: 122
- 0.2% FPS avg: 154
- 1% FPS avg: 210
- avg FPS: 498

HT OFF & E-Cores OFF
- 0.1% FPS avg: 112
- 0.2% FPS avg: 144
- 1% FPS avg: 198
- avg FPS: 476

So, yeah, leave HT ON, or test which gives better results on your system. In any case, you are right that 12th/13th Gen is still broken for CS2.
JudasPhysicist Mar 9, 2024 @ 2:35pm 
Originally posted by Mango:
my GeForce 5500 FX is still pulling strong.
Mmmm, support for Table Fog. That bad boy can play Thief 1 & 2 alright.
M ツ Jun 3, 2024 @ 9:23pm 
WILL THEY EVER FIX INTEL CPUS ON CS2 MY GOODNESS 1% LOWS SUCK

Date Posted: Mar 8, 2024 @ 10:13pm
Posts: 42