Dyson Sphere Program

chen.zephyr Feb 27, 2023 at 19:15
Which CPU would be better: the 13900KS or the 7950X3D?
I am working on a challenge of 125K universe matrix (target for a 1-hour burst) in a 10% resource world.
The build is about 80% finished; the main remaining work is continuing to expand mining.
My platform is an AMD 3950X + RTX 3080 Ti + 128GB DDR4 RAM.
As with any very late game in DSP, the FPS is terrible....
Right now my FPS is only around 6-8 (CPU usage is around 60%).
So far, most of my mining machines, interstellar stations, and factories are off the grid and powered down, in order to save my precious fuel rods.
That means the FPS will get even worse once I start the 1-hour burst challenge.
So, if I want to upgrade my computer, which CPU is more suitable for a demanding late game in DSP?

==Update==
I have already upgraded my system from the AMD 3950X to the Intel 13900KS.

My current computer setup:
CPU: 13900KS
Memory: DDR4-3600 128GB
GPU: MSI RTX 3080 Ti 12GB
SSD: Samsung SSD 990 PRO 2TB

In mid-game, the FPS improved from 3x to 5x
But in the late game, at 200K universe matrix per minute, the FPS dropped to 5.

I uploaded my save file:
https://drive.google.com/file/d/1FD5VZkFiPSr3P-G64Snxz7OJIc-1MWdJ/view?usp=sharing
Please share your FPS~~ and your computer hardware (LoL)
Last edited by chen.zephyr; Aug 5, 2023 at 4:08
I have 64GB of DDR6 RAM myself. I've seen some high RAM usage in games a few times, but yeah, I don't think it goes over 32GB yet.
Wow, 10 FPS on my system as well. All settings set to default.
Computer Information:
Manufacturer: Gigabyte Technology Co., Ltd.
Model: Z790 UD AC
Form Factor: Desktop
No Touch Input Detected
Processor Information:
CPU Vendor: GenuineIntel
CPU Brand: 13th Gen Intel(R) Core(TM) i9-13900KS
CPU Family: 0x6
CPU Model: 0xb7
CPU Stepping: 0x1
CPU Type: 0x0
Speed: 3187 MHz
32 logical processors
24 physical processors
Hyper-threading: Supported
FCMOV: Supported
SSE2: Supported
SSE3: Supported
SSSE3: Supported
SSE4a: Unsupported
SSE41: Supported
SSE42: Supported
AES: Supported
AVX: Supported
AVX2: Supported
AVX512F: Unsupported
AVX512PF: Unsupported
AVX512ER: Unsupported
AVX512CD: Unsupported
AVX512VNNI: Unsupported
SHA: Supported
CMPXCHG16B: Supported
LAHF/SAHF: Supported
PrefetchW: Unsupported
Operating System Version:
Windows 11 (64 bit)
NTFS: Supported
Crypto Provider Codes: Supported 311 0x0 0x0 0x0
Video Card:
Driver: Intel(R) UHD Graphics 770
DirectX Driver Name: nvldumd.dll
Driver Version: 31.0.101.4502
DirectX Driver Version: 31.0.15.3667
Driver Date: 6 15 2023
OpenGL Version: 4.6
Desktop Color Depth: 32 bits per pixel
Monitor Refresh Rate: 60 Hz
DirectX Card: NVIDIA GeForce RTX 4080
VendorID: 0x10de
DeviceID: 0x2704
Revision: 0xa1
Number of Monitors: 1
Number of Logical Video Cards: 1
No SLI or Crossfire Detected
Primary Display Resolution: 3840 x 2160
Desktop Resolution: 3840 x 2160
Primary Display Size: 35.04" x 21.26" (40.98" diag), 89.0cm x 54.0cm (104.1cm diag)
Primary Bus: PCI Express 16x
Primary VRAM: 1024 MB
Supported MSAA Modes: 2x 4x 8x
Sound card:
Audio device: Speakers (Realtek(R) Audio)
Memory:
RAM: 65305 Mb
VR Hardware:
VR Headset: None detected
Miscellaneous:
UI Language: English
Media Type: Undetermined
Total Hard Disk Space Available: 5722441 MB
Largest Free Hard Disk Block: 3797197 MB
OS Install Date: Jul 11 2023
Game Controller: None detected
MAC Address hash: 11e7d138e33db1406cb6548a0eaee7619780494e
Storage:
Disk serial number hash: 601690e1
Number of SSDs: 0
Number of HDDs: 0
Number of removable drives: 0
Wow, my system lags with your save... once the combat update is added it would grind to an instant halt.
starfish Aug 5, 2023 at 17:36
There was a comparison test video for DSP. It seems the 13900KS is better on a late-game save. (戴森球计划 13900KS VS 7950X3D 深度测试)
https://www.bilibili.com/video/BV1iV411N7WW?t=1752.8

For your save, here is my spec:
CPU: i7-7700
GPU: GTX 1660S
RAM: 32GB, DDR4-3200

Results:
8 UPS, 4 FPS with no mods [i.imgur.com]
28 UPS, 14 FPS with DSPOptimizations, SampleAndHoldSim [i.imgur.com]
47 UPS, 23 FPS with above mods and entities display disabled [i.imgur.com]
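
For a rough sense of scale, the gains from those runs work out like this (a minimal back-of-the-envelope sketch in Python, using only the UPS numbers above):

# Back-of-the-envelope speedups from the three runs above (UPS only).
baseline_ups = 8        # no mods
modded_ups = 28         # DSPOptimizations + SampleAndHoldSim
no_entities_ups = 47    # same mods, entity display disabled

print(f"mods alone: {modded_ups / baseline_ups:.1f}x")                      # ~3.5x
print(f"mods + entity display off: {no_entities_ups / baseline_ups:.1f}x")  # ~5.9x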
Rekal Aug 5, 2023 at 18:17
I tried it out too. Frame rate fluctuated between 5 and 7 depending on what I was doing.

CPU: i7-7700K 4.5ghz ~60% utilization
GPU: GTX 1080 ~5% utilization
RAM: 13 of 16GB in use with game running

Here's the realtime CPU stats screenshot: https://steamcommunity.com/sharedfiles/filedetails/?id=3015626898
The screenshot shows 12 UPS, or Updates per Second; I'm assuming that is two updates per frame? Judging from the ring visualization, the outer ring is the whole CPU usage, 98% of which is 'Main Logic'. The next ring breaks down the Main Logic, and 93% of it is the 'Whole Factory'. The innermost ring breaks down the 'Whole Factory', which is split into five main contributors: Belts, Sorters, Power Systems, Facility, and Storage.

I think you may have shot yourself in the foot by storing up your science. Looking at the values, Storage is the second-highest draw under the 'Whole Factory' subheading at nearly 22%. I'm not sure if that includes storage in logistics stations. Did you create a huge storage-box location for all your science to hide in? If you unleash your science and tear down the emptied boxes, you might reclaim some performance.
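
To put a rough number on that (a minimal sketch; the percentages are just the ones read off the screenshot, and it assumes each ring is a fraction of the ring outside it):

# Approximate share of measured time spent on Storage,
# assuming the nested ring percentages simply multiply through.
main_logic = 0.98      # outer ring: 'Main Logic' share of CPU usage
whole_factory = 0.93   # next ring: 'Whole Factory' share of Main Logic
storage = 0.22         # innermost ring: 'Storage' share of Whole Factory

share = main_logic * whole_factory * storage
print(f"Storage is roughly {share:.0%} of the measured time")  # about 20%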

Power System is also pretty high ... do you have a bunch of solar panels somewhere? You might be able to claim some more FPS by scrapping those and switching everything to artificial stars. The panels individually produce so little power in the grand scheme of things but each counts as a power facility with all the animations and grid tracking that entails.

Edit: Also, turning off the real-time buttons on that screen showed improvements too.
Last edited by Rekal; Aug 5, 2023 at 18:20
Nekogod Aug 5, 2023 at 18:50
Originally posted by Rekal:
The screenshot shows 12 UPS, or Updates per Second; I'm assuming that is two updates per frame?
UPS is capped to 2x frame rate. There are mods that uncap it I believe.

Originally posted by Rekal:
Power System is also pretty high ... do you have a bunch of solar panels somewhere? You might be able to claim some more FPS by scrapping those and switching everything to artificial stars. The panels individually produce so little power in the grand scheme of things but each counts as a power facility with all the animations and grid tracking that entails.

Edit: Also, turning off the real-time buttons on that screen showed improvements too.

They have a planet entirely covered in ray receivers; there might be some performance savings there, since they're trying to draw over 900 GW and the sphere they're pulling from can only provide 1xx GW, so about 80% of the ray receivers can go. Although the OP may have had a mod that beefed up their sphere, as the save says the Dyson sphere has a capacity of 8xx GW, but when I resave it that drops to 187 GW.

And yeah, having real-time tracking on is its own overhead; you can gain a little by making sure that is turned off.

OP, what mod did you use to speed up the game? You've got 1800 hours of ticks, but the gap between the save's created date and its last-saved date is only 1600 hours, and the game is currently ticking at about 1 game second per 5 real seconds.
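
Spelling out the arithmetic behind that suspicion (a minimal sketch; the 1800/1600 hours and the roughly 1-in-5 tick rate are just the figures above):

# If 1800 hours of game ticks fit into a 1600-hour wall-clock gap,
# the save has averaged faster than real time overall.
game_hours = 1800      # simulated time recorded in the save
real_hours = 1600      # gap between created and last-saved dates
current_rate = 1 / 5   # game seconds per real second right now

print(f"average rate: {game_hours / real_hours:.2f}x real time")  # ~1.1x
print(f"current rate: {current_rate:.2f}x real time")             # 0.20x
# So at some point the game must have been ticking well above 1x,
# which is why a speed-up mod seems likely.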

My specs

CPU: Ryzen 5900X @ 4.5 GHz, 30% utilisation with logical cores set to 12, 50% when set to 24
GPU: RTX 3090 ~30% utilization (4K)
RAM: 9.7GB of 32GB in use by the game

I'm getting 7 FPS and 14 UPS with logical cores set to 12, and 7 FPS and 15 UPS with logical cores set to 24. So 20% more utilisation results in only 1 more UPS, which is wild.
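
Putting rough numbers on that (a sketch that assumes those utilisation percentages are measured across all 24 hardware threads of the 5900X):

# Rough CPU cost per update at the two settings.
threads = 24  # hardware threads on a Ryzen 5900X

for label, util, ups in [("12 logical cores", 0.30, 14),
                         ("24 logical cores", 0.50, 15)]:
    busy = util * threads
    print(f"{label}: ~{busy:.1f} busy threads / {ups} UPS "
          f"= {busy / ups:.2f} threads per update")
# ~0.51 vs ~0.80 threads per update: the extra cores mostly add
# overhead rather than extra simulation throughput.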

By using SampleAndHoldSim and setting it to update once every 60 ticks, I was able to get to 33 FPS and 60 UPS, which is very playable.
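
For anyone curious what the sample-and-hold trick means conceptually: the heavy factory update runs only every Nth tick, and the last result is reused in between. This is only an illustrative sketch of that pattern, not the mod's actual code:

# Illustrative "sample and hold" loop: the expensive simulation runs
# only every HOLD_TICKS ticks; other ticks reuse the cached result.
HOLD_TICKS = 60
cached_state = None

def expensive_factory_update(tick):
    # stand-in for the real per-tick factory simulation
    return {"tick_sampled": tick, "items_moved": 1234}

for tick in range(600):
    if tick % HOLD_TICKS == 0:
        cached_state = expensive_factory_update(tick)  # sample
    state = cached_state                               # hold in between
    # rendering, UI, and counters can still use 'state' every tick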
Last edited by Nekogod; Aug 5, 2023 at 18:57
I expected I would do better than I did. A bit saddened by it.
Rekal Aug 5, 2023 at 21:36
Originally posted by dragonsphotoworks:
I expected I would do better than I did. A bit saddened by it.
DSP is just poorly optimized right now. That's one of the big focuses the devs have for the next update, because they watched their own frame rate tank when they started messing with the Dark Fog too! They should be able to pull off some really good software tricks to get the frames back up. I mean, the mod makers are already doing it without really being able to dig under the hood, so to speak.

TL;DR: Processor development has stagnated as it has hit physical limitations. Wait for the devs to optimize.

Now, if you're saddened because you thought your hardware would perform better compared to a 6-year-old chip (like my i7-7700K), there's some physics involved. A good deal of the speed gains over the last 40 years of processor advancement have come simply from shrinking the transistors, so that the electricity has less physical distance to travel and the transistor gates take less energy to open or close, and thus switch faster.

The i7-7700K was made on a 14-nanometer process, meaning the smallest critical dimension on the chip was 14 nanometers. The i9-13900, I believe, is made on a 10-nanometer-class process. A definite advancement, but not exactly a game changer.

They've really slowed down on shrinking dimensions because they're running into actual physical problems at sizes that small. Like, how do you deliver power to 24 different i9 cores on a chip that small without melting the whole thing? You make only 8 of the cores capable of full speed and make the other 16 low-power (read: crippled) 'efficiency cores'. It's the same concept if you're familiar with Intel's 'Turbo Boost' technology: it basically shuts off the cores adjacent to the core it wants to boost and shunts the power and heat budget there to run a single core at higher power.

So most of the processor gains in the last 10 years or so have come from innovative transistor and architecture design changes (like all the various caches they've added), which is why progress has slowed down considerably. When it comes to just straight-up muscling through poorly optimized code, not much progress has been made. My i7 runs at 4.5 GHz, your i9 performance cores pull 5.4 GHz (sustained), and I think the PS3's Cell processor ran at 3.2 GHz back in 2006. There are just physical power-in and heat-out limitations that consumer-grade processors have to deal with. If you really want to see screaming-fast processors, look up some of the liquid-nitrogen-cooled builds.

For now we'll just have to wait and see how much more the devs can squeeze out with software tricks.
Hmm, sounds interesting, but I don't see any prebuilt setups I can buy. Or my Google skills failed me, lol.
JokeryEU Aug 6, 2023 at 3:00
Originally posted by dragonsphotoworks:
Wow, my system lags with your save... once the combat update is added it would grind to an instant halt.
Stop thinking the combat update will make any difference to your FPS, since old saves won't have combat included. Most likely you won't be able to play old saves anymore.

Originally posted by dragonsphotoworks:
Hmm, sounds interesting, but I don't see any prebuilt setups I can buy. Or my Google skills failed me, lol.
Never buy prebuilt; those setups are cheap for a reason, and that's because they use the lowest-quality components with a few mid-to-high-end parts. You don't need to be an expert to assemble a PC, since everything comes with installation guides; start from the motherboard manual and follow that book.
Last edited by JokeryEU; Aug 6, 2023 at 3:04
I meant that once I get to that level of built structures and have combat on, I would expect more drain on CPU/GPU resources, producing lower FPS.
EDIT - Also, that is his save, not mine. I restart new games all the time and expect to restart once combat is released. And many more times after that, as they patch it, lol.

And by prebuilt I meant the radiator/cooler setup, not the whole PC. I do custom setups; I don't buy preset Dell PCs, for example. And I am currently building my own PC. From what I've read, the system seems to need to be jury-rigged into your PC. The guide I was reading had more warnings than directions, lol.
Last edited by dragonsphotoworks; Aug 6, 2023 at 4:07
Techsuport Aug 6, 2023 at 4:18
It's easy to put a PC together in the Google/YouTube age. There are part-picker guides so you can check whether the parts you pick are compatible, and you will get a lower cost for better performance.

There are going to be more performance passes coming with the combat update for sure. But since you can basically build until you hit a hardware bottleneck, the performance gains from software magic come down to handling memory efficiently between RAM, CPU, and GPU. So don't be stingy with what motherboard you pick, and throw in a terabyte of storage on an M.2 solid-state drive that slots into the motherboard. It will feel good playing anything if it can handle late-game DSP.
Rekal Aug 6, 2023 at 6:15
Originally posted by dragonsphotoworks:
I meant that once I get to that level of built structures and have combat on, I would expect more drain on CPU/GPU resources, producing lower FPS.
The Devs are aware and working on it!
Originally posted by Dev Notes, March 17, 2023:
We know that if the Dark Fog takes up the player’s development space (and CPU computing resources), the player will want to destroy it. So the activity of the Dark Fog and the player’s factories are roughly inversely proportional, and we used it to set the following performance optimization goals:

If players choose to leave the Dark Fog alone in the latter stage of the game, then the remaining massive Dark Fog hive will be a huge burden. And at this point, the Dark Fog’s entire production system (space nests, planet bases) is insensitive to players, so it doesn’t need to update as frequently as player factories.
chen.zephyr Aug 6, 2023 at 6:30
Originally posted by Rekal:
I tried it out too. Frame rate fluctuated between 5 and 7 depending on what I was doing.


Power System is also pretty high ... do you have a bunch of solar panels somewhere? You might be able to claim some more FPS by scrapping those and switching everything to artificial stars. The panels individually produce so little power in the grand scheme of things but each counts as a power facility with all the animations and grid tracking that entails.

Edit: Also, turning off the real-time buttons on that screen showed improvements too.

Yes, on some planets I used solar panels as the power supply in the early game, in order to save resources (this is a 1/10 resource seed, and I planned for 200K universe matrix per minute). But in the late game, I think the solar panels only occupy a 'minor' amount of CPU resources.
Originally posted by Rekal:
Originally posted by dragonsphotoworks:
I meant that once I get to that level of built structures and have combat on, I would expect more drain on CPU/GPU resources, producing lower FPS.
The Devs are aware and working on it!
Originally posted by Dev Notes, March 17, 2023:
We know that if the Dark Fog takes up the player’s development space (and CPU computing resources), the player will want to destroy it. So the activity of the Dark Fog and the player’s factories are roughly inversely proportional, and we used it to set the following performance optimization goals:

If players choose to leave the Dark Fog alone in the latter stage of the game, then the remaining massive Dark Fog hive will be a huge burden. And at this point, the Dark Fog’s entire production system (space nests, planet bases) is insensitive to players, so it doesn’t need to update as frequently as player factories.
I know, I saw the post as well :-) I read their posts all the time.
