Edit: And if you are currently playing single player, launching your map as a dedicated server and then connecting to that dedicated server, instead of launching the map from within the game, will increase performance substantially. It effectively lets the game use an extra 4 threads.
https://www.nexusmods.com/7daystodie/mods/3107?tab=posts
Not sure what BIOS options you have on your rig, but on some motherboards you can change a setting in the BIOS to disable 'logical' cores / Hyper-Threading. Otherwise, you can change the CPU behavior for 7D2D by setting its affinity to 4 'physical' cores, which are 0, 2, 4, and 6 on an i7 or any CPU exposing both physical and logical cores. Unfortunately, you'd have to do this every time you start up 7D2D. Just start Task Manager, then load up 7D2D. Switch back to Task Manager, go to Details, right-click the 7D2D process, and use 'Set affinity' to pick those cores. Then you can exit Task Manager and switch back to the game. The system will now be using 4 cores while playing.
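If you'd rather not click through Task Manager every session, a small script can do the same thing. This is just a minimal sketch: it assumes the game process is named 7DaysToDie.exe (check the name in Task Manager on your machine) and that the even-numbered logical CPUs are the physical cores; it uses the third-party psutil package, which has to be installed separately.

# pin_7d2d.py - sketch: pin 7 Days to Die to 4 physical cores.
# Assumes the process is named "7DaysToDie.exe" and that logical CPUs
# 0, 2, 4, 6 map to physical cores (typical on an i7 with Hyper-Threading).
# Requires: pip install psutil
import psutil

TARGET = "7DaysToDie.exe"      # assumed process name - verify in Task Manager
PHYSICAL_CORES = [0, 2, 4, 6]  # every other logical CPU = one physical core

def pin_game(name, cores):
    """Find the game process by name and restrict it to the given cores."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            proc.cpu_affinity(cores)  # same effect as Task Manager's "Set affinity"
            print(f"Pinned PID {proc.pid} to cores {cores}")
            return True
    print(f"{name} is not running")
    return False

if __name__ == "__main__":
    pin_game(TARGET, PHYSICAL_CORES)

Run it after the game is loaded; it only needs to be re-run each time the game is restarted, same as the manual Task Manager method.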
Game engines work poorly when the CPU is using 'logical' cores, which are really just a second thread on a physical core... more of a 'virtual' core. Game engines also only address the primary core (the first physical core), and the operating system redirects the workload (splits it up) across the other cores. What you get is diminishing returns: the more the workload is split and spread across cores, the bigger the performance hit the process takes. Through testing, we find the trade-off point is at 4 cores. Actually 3, but that leaves one extra for the operating system to do its own background crap. So 4 cores while gaming is best.
Also, be sure the Windows power profile is NOT lowering CPU power. Its minimum processor state should be 100% under 'advanced settings' for the power profile. Or just use the 'High Performance' power profile, which by default already sets the minimum processor state to 100%. The same goes for the network adapter if you're playing MP. You don't want Windows lowering the NIC's power state, or MP games will suffer lag and dropped connections.
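For reference, the power-profile change can also be scripted instead of clicked through. A rough sketch, assuming Windows with the standard powercfg aliases available (SCHEME_MIN for High Performance, SUB_PROCESSOR / PROCTHROTTLEMIN for the minimum processor state); run it from an elevated prompt and verify the aliases with "powercfg /aliases" first.

# set_power.py - sketch: switch to the High Performance plan and force the
# minimum processor state to 100%. Assumes Windows and the standard powercfg
# aliases; run elevated.
import subprocess

def run(cmd):
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Switch to the built-in High Performance plan (alias SCHEME_MIN).
run(["powercfg", "/setactive", "SCHEME_MIN"])

# Belt and suspenders: set the minimum processor state to 100% on AC power
# for the now-active scheme, then re-apply the scheme.
run(["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
     "SUB_PROCESSOR", "PROCTHROTTLEMIN", "100"])
run(["powercfg", "/setactive", "SCHEME_CURRENT"])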
Any modern OS worth anything can schedule noisy processes intelligently even with SMT (Hyper-Threading) enabled, without jumping through hoops. The game engine doesn't request a specific core (by default); the OS schedules it wherever there are underutilized cores. Some CPUs have a handful of cores that can boost to a higher clock rate when running single or very lightly threaded applications, and the OS will schedule busier single-threaded applications onto those cores first, but they only stay boosted if the other cores aren't being utilized. That's unfortunately not likely to help with this game, because the cutoff is usually 2 cores.
There are no diminishing returns at 4 cores. It's simply harder to program for more cores, and the 4-core paradigm was set for video games at a time when most people had 2-4 cores. Other engines, applications, etc. are programmed to utilize additional cores and pull off far more complex tasks than this game does. There are third-party map generators (Teragon) that will use as many cores as you can throw at them for some tasks, and they make better maps in less time than the generator built into the game. Unity has decided on a type of game it wants to cater to, first and foremost, and it's not 7 Days to Die. Devs prefer Unity because it's easier to churn out code, but the trade-off is that performance is crap. Switching to an engine that holds their hand less but offers better threading, better AI pathing, better graphics, and better lighting (UE5) would be the intelligent thing to do, but it's never ever going to happen.
The power profile would have to be in maximum power-saving mode to impact game performance, and he'd have seen a higher CPU utilization rate. He can leave the power settings on Balanced; it's not going to net him any additional performance. The only time this suggestion would help is if his utilization were fluctuating between 100% and less than 100% frequently.
Pretty much your options are: if you are hosting (single player, or hosting a multiplayer game through the game client when playing with friends), you have to switch to a dedicated server to relieve pressure on the 4 cores the game gets to use. If you aren't hosting and you're still getting poor performance, you're kind of SOL (though I've heard that Win11 makes all games run a bit like garbage; not sure if it's still true, but there was a sizable performance hit when running games on Win11 compared to Win7/10).
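To make the dedicated-server route a one-click affair, you can wrap the launch in a small script. A rough sketch only: the install path, executable name, and flags below are assumptions modeled on the startdedicated.bat and serverconfig.xml that ship with the Steam "7 Days to Die Dedicated Server" tool, so check them against your own copy before relying on it.

# start_dedicated.py - sketch: launch a local 7 Days to Die dedicated server,
# then join it from the game client via "Connect to Server".
# SERVER_DIR is an assumption - point it at wherever Steam installed the
# dedicated server tool on your machine.
import subprocess
from pathlib import Path

SERVER_DIR = Path(r"C:\SteamLibrary\steamapps\common\7 Days to Die Dedicated Server")

def start_server():
    """Start the server headless using the stock config file."""
    exe = SERVER_DIR / "7DaysToDieServer.exe"
    args = [
        str(exe),
        "-logfile", str(SERVER_DIR / "server_log.txt"),
        "-quit", "-batchmode", "-nographics",
        "-configfile=serverconfig.xml",  # world/save settings live in this file
        "-dedicated",
    ]
    # cwd matters: the config file path is relative to the server folder.
    return subprocess.Popen(args, cwd=SERVER_DIR)

if __name__ == "__main__":
    proc = start_server()
    print(f"Server started (PID {proc.pid}); connect to it from the game client.")
    proc.wait()

The point of splitting things this way is exactly what was said above: the server simulation runs in its own process with its own threads, so the client's 4 usable cores aren't doing double duty.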
My setup and what I get are as follows:
CPU: i9-7960X 16-core 2.8GHz (liquid cooled, custom loop)
RAM: 32GB (4x8GB) DDR4-4000
SSD: Samsung 980 Pro 2TB
GPU: Nvidia RTX 4090 (liquid cooled, custom loop)
Performance:
Hosting in game client:
Running around in the woods with 1 friend playing: 30FPS (max)
Running around in the city with 1 friend playing: 8-17FPS
Dedicated server:
Running around in the woods with 2 friends playing: 60FPS (max my monitor supports)
Running around in the city with 2 friends playing: 45-60FPS
For what it's worth, other games made in Unity have the same problem. I can't play Cities: Skylines with the music on because the FPS dips and stretches the audio out (who decided to tie music playback to FPS?). Unity is just a garbage engine that's meant for light projects and getting your feet wet, but the devs never refactored their code over to a different engine (most devs won't; it's a ton of work).
I didn't just pull that info off some stupid Google search. I've actually tested these things with 7D2D. And NO, you don't need maxed-out power saving to replicate the issues between server and client when it comes to the NIC or GPU performance. The simple routine of Windows trying to power down the hardware AT ALL causes a performance hit and/or lag between server and client. Everything I said is true, and you may be misinterpreting what I was describing, or reading more into what I said than you need to. And YES, with actual testing in 7D2D, the optimal number of cores for best performance is 3, then leaving an additional one in reserve (like I mentioned) for a total of 4 cores FOR BEST PERFORMANCE. Cuz there ARE diminishing returns on performance with any more cores enabled.
Edit: @Flint... sorry this thread seems to be getting derailed. As to your issue...
When nothing seems to be helping, I usually chalk it up to a faulty install of either the game and/or the GPU driver. Since it's not really that difficult or time-consuming, I usually just wipe and do a clean install to test further and see if the problem is resolved. You'd be surprised how often that does the trick.
For the game, just copy your save's Saves and Generated Worlds map folders from C:\users\your user name\Appdata\Roaming\7d2d\... to a temporary location. Then uninstall, and go through the PC deleting any left-behind folders in Steam's install location and the place I just mentioned, along with the ...\Appdata\LocalLow\The Fun Pimps\ folder. Restart, load Steam, install the game, verify files, start the launcher, switch to the 'Tools' tab, select 'Clean...', check all the boxes, and clean out the old data. Switch back to the main tab, disable EAC + Analytics, and select 'Run and Save...'. Then go through the options in 7D2D and optimize your settings. Close it out, copy your save and map files back, launch 7D2D, and test.
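If you want to script that backup step instead of copying folders by hand, something like the sketch below works. The folder names are assumptions based on the locations mentioned above (save data under %APPDATA%); check the actual names on your machine before running it.

# backup_saves.py - sketch: copy the Saves and GeneratedWorlds folders to a
# temporary location before uninstalling. Paths are assumptions - verify the
# game-data folder name under %APPDATA% on your machine first.
import os
import shutil
from pathlib import Path

APPDATA = Path(os.environ["APPDATA"])   # ...\AppData\Roaming
GAME_DATA = APPDATA / "7DaysToDie"      # assumed folder name; the post above calls it "7d2d"
BACKUP_DIR = Path(r"C:\Temp\7d2d_backup")

def backup(folders=("Saves", "GeneratedWorlds")):
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    for name in folders:
        src = GAME_DATA / name
        if src.is_dir():
            shutil.copytree(src, BACKUP_DIR / name, dirs_exist_ok=True)
            print(f"Copied {src} -> {BACKUP_DIR / name}")
        else:
            print(f"Skipped {src} (not found)")

if __name__ == "__main__":
    backup()

After the clean reinstall, copy the backed-up folders to the same spot under %APPDATA% and the saves should show up in-game again.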
For the GPU, download the latest driver, then uninstall the current driver. Now run the installer for the new driver, then do a system restart. You don't want to just 'update' the current driver, cuz that can lead to driver corruption, which will cause odd performance issues in more demanding games. You'll need to go into your GPU's control panel and optimize its settings afterward. Test the game again.
Look kid, I've been in the minutiae of computer and network performance for decades. It's what I do for a job, every day. I stand by what I said above. Yes, the processor putting cores into a deeper C-state can cause a performance hit when it has to bring a core back out of that C-state to execute work. No, you aren't going to see a performance hit from that if you are sitting at 30% processor utilization (maybe you can measure a 0 to 1% difference, but you aren't going to FEEL a sub-1% performance difference). The performance hit is more pronounced at higher utilization rates because you have work queued up, so if you are bursting from 50% to 100% and back, over and over again, you will feel a bigger difference than if you are constantly sitting at 30%.
The optimal number of cores for 7 Days to Die is 4 because of a limitation in the game engine. Having more cores will only hurt performance insofar as higher core-count CPUs usually have a lower per-core clock rate. All things being equal, the cores in your system beyond 4 will not have a deleterious impact. The game COULD run with 16 cores, or 32... or 64... or scale with the hardware... and you'd absolutely, 100,000,000% get a better experience. The point I was trying to make above wasn't that having more cores will help this guy; it was that the devs made a design choice that guarantees this guy will never see the performance he's targeting with the hardware he has. In fact, the same design choice that is hamstringing his performance is the reason the devs cut the max render distance to about a third of what it was in A20 and changed the zombie spawn mechanic so zombies flit into existence when you enter a room. They aren't going to fix the issue, it's only going to get worse, but that's where the problem lies. The best thing OP can do is whatever he can to relieve pressure on the 4 cores that are processing the game. That means firing up a dedicated server if he's not already playing MP on someone else's server (and if they are hosting, they should be running a dedicated server for the same reason).
The thread's not getting derailed. I've given him all the advice there is to help his issue (I have similar hardware, I know where the pain points are and how to mitigate them as much as he's going to be able to). You just give bad advice.
Actually, you're not saying anything different from what I've said... except arguing semantics and telling me I'm wrong about the performance impact of power-saving modes... which I'm not wrong about, cuz I've tested it and it DOES make a 'noticeable' difference across various hardware configurations (various, not ALL)... mainly when playing MP. But that's what I'm referring to as the thread getting derailed... you arguing semantics... KID!
I've been building PCs for 'DECADES', as you put it, and doing benchmark testing with various builds... (I also have a degree in Computer Administration)... and I may not use the correct lingo all the time, but potato, potahto. So before you go calling someone 'KID', grandpa, maybe you should quit assuming everyone on the internet is some punk kid in mom's basement who only knows what they read off a Google search, while you think you know everything! I'm done debating with you on a stupid support topic where my points were to help them achieve better performance, NOT mess with the likes of Mr. Know-it-all.