Can confirm. It made a difference after I upgraded my 8 GB toaster to 12 GB.
Though I guess the lag when you're facing a massive horde doesn't get helped by RAM.
I wonder if CPU speed helps, since there can be a lot to calculate as loot and zombies accumulate.
I use the 'Lossless Scaling' program (bought on Steam for £5), cap FPS to 60 and run the game with frame generation. It then runs pretty smoothly at 120 FPS (though it still dips sometimes), and from time to time there are visible artifacts.
- holding more data, to reduce the cases where it has to be pulled from disk again
- holding obsolete files in case we need them again, for the same reason (rough timing sketch of the effect below)
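If you want to see that effect for yourself, here's a minimal sketch: read the same file twice and time both passes; the second read usually comes from the OS file cache in RAM instead of the disk. The path is made up, point it at any big file, and the 1 MB chunk size is arbitrary.

```cpp
// Rough sketch of the RAM-as-file-cache effect described above.
#include <chrono>
#include <cstdio>
#include <fstream>
#include <vector>

static double time_read(const char* path) {
    auto t0 = std::chrono::steady_clock::now();
    std::ifstream f(path, std::ios::binary);
    std::vector<char> buf(1 << 20);                        // read in 1 MB chunks
    while (f.read(buf.data(), buf.size()) || f.gcount() > 0) {}
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
}

int main(int argc, char** argv) {
    // "saves/map_1234.bin" is a made-up example path; pass any large file as an argument.
    const char* path = argc > 1 ? argv[1] : "saves/map_1234.bin";
    printf("cold read: %.3f s\n", time_read(path));        // likely hits the disk
    printf("warm read: %.3f s\n", time_read(path));        // likely served from the RAM cache
    return 0;
}
```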
Also I would recommend a compressor like
https://github.com/IridiumIO/CompactGUI
for any game, and Zomboid in particular, because on a modern (2010+) CPU it's faster to unpack data than to transfer the whole file. (LZ4 decompression is something like 5 GB/s on a 9600, PER CORE: https://github.com/lz4/lz4)
On that note, Zomboid compresses to about half its size (-47% on b42 and -64% on b41, mostly map chunks), and there was never any performance strain in any game, even on an ancient mobile 3210M, because the data is loaded only once and it can fill 16 GB of RAM in about 4 s (2 GB/s x 2 cores).
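If anyone wants to check the decompression numbers on their own CPU, here's a rough single-core throughput sketch using the lz4 library from the repo above (link against liblz4; the 256 MB buffer, the repeat count and the all-'z' filler are arbitrary choices, and trivially compressible filler makes this a best case, so real map chunks will come out slower):

```cpp
// Rough single-core LZ4 decompression throughput test.
#include <lz4.h>
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int kSize = 256 * 1024 * 1024;                   // 256 MB of sample data
    std::vector<char> src(kSize, 'z');                     // trivially compressible filler (best case)
    std::vector<char> dst(LZ4_compressBound(kSize));
    const int csize = LZ4_compress_default(src.data(), dst.data(), kSize, (int)dst.size());

    std::vector<char> out(kSize);
    const int kRepeats = 10;
    const auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < kRepeats; ++i)
        LZ4_decompress_safe(dst.data(), out.data(), csize, kSize);
    const auto t1 = std::chrono::steady_clock::now();

    const double secs = std::chrono::duration<double>(t1 - t0).count();
    printf("256 MB -> %d bytes compressed, decompression ~%.1f GB/s on one core\n",
           csize, kRepeats * (double)kSize / secs / 1e9);
    return 0;
}
```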
Won't argue with that... I actually saw feedback on a 3 GB indie game (Unity engine) like "why can my PC run Stalker 2 at 60 fps, but this one at 15 fps????".
This guy gets it. My computer is almost twice as powerful (two 4090s, 7950X3D, 128 GB RAM, full specs here: https://pcpartpicker.com/user/Sunderbraze/saved/#view=fCfyFT ) but I'm getting the exact same average fps and drops he's describing when playing 4K @ 144 Hz. There is definitely a point at which hardware doesn't even matter anymore and the game is just getting slowed down by bottlenecking.
I'm just curious, why? Are you playing 4 games at once?
You see, SLI has many technical caveats, the main one being that it is very driver-dependent and requires a game to be built with optimization for it in mind. And even in supported games, there was load imbalance and there were rendering artifacts. Crossfire? Same balls, side view.
Game developers with mild masochistic tendencies can use DX12 to enjoy the GPU load juggling... I mean, implement multi-GPU without a connector, but I don't think any of them did. As a reminder, it's 10 years old today, from well before the already-dead connectors were dropped a few years ago.
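For anyone curious what "explicit multi-GPU" actually asks of developers, here's a minimal Windows-only sketch of step one: the engine has to enumerate the adapters through DXGI itself and then split all the rendering work between them on its own (error handling trimmed; builds with MSVC and the Windows SDK):

```cpp
// Enumerate GPUs via DXGI -- the starting point for DX12 explicit multi-adapter.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapterByGpuPreference(i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                                             IID_PPV_ARGS(&adapter)) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"GPU %u: %s, %.1f GB VRAM\n", i, desc.Description,
                desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0));
        // From here an explicit multi-adapter engine would create one ID3D12Device
        // per adapter and distribute work between them itself; unlike SLI, the
        // driver does none of that juggling for you.
    }
    return 0;
}
```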
The 4090 does not have SLI and cannot be used as multi-GPU in games and benchmarks. It does have NVLink pathways on the PCB though, so with some soldering it could maybe be used in some professional software. It also has slower cross-GPU exchange over PCIe, but again, for games and benchmarks that is useless since there is no driver and engine support.
Harsh, dude.