7 Days to Die

thecrocokyle 10 DEC 2022 at 20:31
Intel 12th/13th Gen CPU Issue?
I'm in the midst of an upgrade from an i5-12600K/6900 XT to an i7-13700K/7900 XTX, and I mainly play 7 Days. But first, some relevant background info.

Even with a decent PC at 1440p I was dropping to 20fps at times during what seem to be normal, uneventful parts of the game (not particularly graphically intense situations). I'd also randomly skip a small chunk of frames or otherwise "lag" off of a roof or through the floor, etc., even though I'm hosting a 2-player server on my own machine for my girlfriend and me to play together on PCs in the same room. I know 7 Days is poorly optimized, but these stutters and terrible frame rates on a PC of this caliber just don't make sense. Until (maybe) now...

Long story short, I have been using NZXT CAM and AMD's own Afterburner software to track my hardware in game. The CPU and GPU are generally around 40-60%, but the GPU will occasionally spike to 99%. This made sense assuming the P-cores of the CPU are at 100% most of the time and the E-cores are idling. Today, on my single-player world running locally on my machine (offline, to take internet speed out of the equation and get a more accurate test), I used HWMonitor to log a horde night so I could compare the data before and after the aforementioned hardware upgrade, but when I checked the log, I was surprised at what I found. For those who aren't familiar with the software, HWMonitor doesn't just show aggregate CPU utilization; it breaks down utilization per individual core. And what it showed is that none of the P-cores ever exceeded 60% or so, yet all four E-cores topped out at 100%. Is it possible that the Unity engine is leveraging the wrong cores for the game? If so, is there a way to fix this and force the game to use the P-cores? Gaming on efficiency cores could explain a lot of odd behavior and suboptimal performance on these Intel CPUs.
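For anyone who wants to reproduce that kind of per-core logging without HWMonitor, here is a minimal sketch, assuming Python with the third-party psutil package; the CSV filename, the sample count, and the idea that Windows enumerates P-cores before E-cores are assumptions to verify on your own machine, not something the game or HWMonitor guarantees.

import time
import psutil  # third-party: pip install psutil

# Sample per-core CPU utilization once per second and write it to a CSV,
# roughly what HWMonitor's per-core log shows. On hybrid Intel parts Windows
# usually enumerates the P-cores (and their hyperthread siblings) before the
# E-cores, but check the ordering on your own system.
core_count = psutil.cpu_count(logical=True)
with open("core_usage.csv", "w") as log:
    log.write("timestamp," + ",".join(f"core{i}" for i in range(core_count)) + "\n")
    for _ in range(600):  # about 10 minutes of samples
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        log.write(f"{time.time():.0f}," + ",".join(f"{p:.0f}" for p in per_core) + "\n")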

Additional info:
RAM is 32GB 5600MHz DDR5 CL40
Nothing is thermal throttling that I can see - temps never exceed 70°C
The only mod I'm running is Undead Legacy, and we have gigabit internet and are wired, so I don't think the connection explains the stuttering/lagging through things either
Last edited by thecrocokyle; 10 DEC 2022 at 20:35
Showing 1-6 of 6 comments
Mardoin69 11 DEC 2022 at 1:59
Actually, almost all games cannot properly use the 'logical' cores - the ones you've labeled 'E' cores. They only work well with the actual 'physical' cores. And even then, most games will not properly utilize multiple cores - definitely not more than 4 of them. Most older games can only make proper use of 1 physical core. In fact, it is the operating system that is spreading the load across the other cores; the game engines don't do that. Maybe some newer game engines do now, but not in 7d2d's case. And using more cores brings diminishing returns for performance - it can even start hurting performance. Keep in mind: the more cores being used, the lower the clock speed for the CPU. Clock speed is downgraded for each additional core being used.

This is why we've been recommending people set affinity to the first 4 physical cores using Task Manager > Details after launching 7d2d - which is cores 0, 2, 4, 6 (on CPUs with both physical and logical cores). I believe someone found that using the last four cores on certain AMD CPUs actually seemed to work better, but I don't remember which CPUs those were. Unfortunately, setting that affinity is something you have to do each time you launch 7d2d, which is why I never buy i7s or CPUs using logical cores / hyperthreading.
Last edited by Mardoin69; 11 DEC 2022 at 4:54
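Since redoing that Task Manager step every launch is tedious, here is a minimal sketch of automating it, assuming Python with the third-party psutil package; the process name "7DaysToDie.exe" and the core list 0, 2, 4, 6 are assumptions that depend on your install and CPU layout.

import psutil  # third-party: pip install psutil

# Find the running game process and pin it to the first four physical cores
# (0, 2, 4, 6 on CPUs where every physical core exposes a hyperthread sibling).
# Both the executable name and the core list are assumptions - adjust them for
# your own setup; this mirrors Task Manager > Details > Set affinity.
TARGET_NAME = "7DaysToDie.exe"
PREFERRED_CORES = [0, 2, 4, 6]

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET_NAME.lower():
        proc.cpu_affinity(PREFERRED_CORES)
        print(f"Pinned PID {proc.pid} to cores {PREFERRED_CORES}")

Run it after the game is already up; it has the same effect as setting affinity by hand and, like the manual step, has to be repeated each session.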
Dim 11 DEC 2022 at 9:06
Originally posted by Mardoin69:
Actually, almost all games cannot properly use the 'logical' cores - the ones you've labeled 'E' cores. They only work well with the actual 'physical' cores. And even then, most games will not properly utilize multiple cores - definitely not more than 4 of them. Most older games can only make proper use of 1 physical core. In fact, it is the operating system that is spreading the load across the other cores; the game engines don't do that. Maybe some newer game engines do now, but not in 7d2d's case. And using more cores brings diminishing returns for performance - it can even start hurting performance. Keep in mind: the more cores being used, the lower the clock speed for the CPU. Clock speed is downgraded for each additional core being used.

This is why we've been recommending people set affinity to the first 4 physical cores using Task Manager > Details after launching 7d2d - which is cores 0, 2, 4, 6 (on CPUs with both physical and logical cores). I believe someone found that using the last four cores on certain AMD CPUs actually seemed to work better, but I don't remember which CPUs those were. Unfortunately, setting that affinity is something you have to do each time you launch 7d2d, which is why I never buy i7s or CPUs using logical cores / hyperthreading.

The question is: why do devs keep creating games this way? Why aren't they updating to adapt to the newest CPUs? Did they expect that everyone would keep playing on a Core 2 Duo or FX 4300? I'm on a 12900K and have the same problems as the OP. Terrible performance. I'm on my old trusty 2080 Ti and I can play even RDR2 (2K res) with maxed-out settings, CP2077, etc., but 7DTD performance brings me to tears. I have massive frame drops, even down to 30. 7DTD has been 10 years in development, with thousands of posts about performance and save/world corruption issues, and still nothing. The devs won't care to fix it at all, except to advertise it. No wonder the latest updates are focusing on Twitch again.
Mardoin69 13 DEC 2022 at 12:21
So, game developers create their games using game engines made by other companies, such as Unreal Engine or Unity. There are actually a ton of them. Different ones have differing capabilities, so game devs try to pick the one that's going to work best for what they want in the game. But the game engine designers have to have a pretty decent working knowledge of the various operating systems in order to make their engine work well with Windows, for example. And the operating system is what's in charge of the hardware, so the game engine has to work through it.

Making a game engine take direct control of CPU scheduling is probably very high on the difficulty scale. Most engines were originally designed back when you still needed to keep single-core CPUs in mind, because that's what a lot of PCs had. Nowadays that's almost unheard of, but it wasn't that long ago that a Core 2 Duo was fancy new tech. Long story short, the game devs are at the mercy of the game engine they choose. Even if a new game engine comes out that can do everything they want AND properly utilize today's many-core CPUs, the devs are not going to want to start all over from scratch - which they'd have to do if switching engines.
Last edited by Mardoin69; 13 DEC 2022 at 12:22
FT 13 DEC 2022 at 15:33
Does it run any better if you turn off the "Efficiency Cores" and maybe also disable Hyper-Threading?

I don't think there's anything the E-cores help with unless you're using the CPU for really specific thread-heavy rendering/encoding often enough for them to matter. The E-cores DO, however, slow down games that accidentally try to use them for anything, because of their higher latency and general slowness.

I think the CPU scheduler is supposed to use the E-cores only for specific things (at least that's how Intel originally made it sound), so game designers wouldn't need to worry about dancing in circles around something that's only existed in x86 for a couple of years, from one company, in a few of their models... BUT that's not what happened; instead the scheduler is dumb as dirt and seems to be about as unprepared for the E-cores as the OS and games are.
ChewyChops 12 APR 2023 at 17:04
I have turned off my E-cores and get better performance. My motherboard calls it "gaming mode" to use just the P-cores (i5-13500, 3060 Ti GPU).
My issue is the stuttering from events such as a few zombies smashing the walls. My old 5820K actually performed better than my new 13th-gen 13500, smh.
Last edited by ChewyChops; 12 APR 2023 at 17:04
SiEgE 13 APR 2023 at 5:22
Originally posted by Jeem:
The question is: why do devs keep creating games this way? Why aren't they updating to adapt to the newest CPUs?
Programmer answering:
Because it is not that easy. In some cases it can be close to impossible, due to the weight of things already built, or things no one else knows how to edit without breaking everything. Dabbling with the code or "blind diving" into changes can cause Cyberpunk 2077-style random issues no one would know how to fix, except by rolling everything back to the way it was before.
Many companies don't care about "small things" like those when they can just skip them and save the development money.
Game development is a race against your dying popularity, so you have no time to polish old things. The best you can hope for is that the studio's newest games adopt the new technology, provided a fair share of their customers have that hardware. Otherwise they won't care.

The "E" core tech is yet rare, and only presents on CPUs unavailable to mid-to-low cost market. Mid-to-low cost market is usually the 70% of the target audience.
Last edited by SiEgE; 13 APR 2023 at 5:27

Posted: 10 DEC 2022 at 20:31
Posts: 6