I was struggling to get even 20 fps at 1080p on Epic or even High, even though my hardware slightly exceeds those specs. Ironically, the only setting that makes the game remotely playable is AMD FidelityFX Super Resolution set to at least Balanced. That way it doesn't look like I'm playing a game from 2003.
I slightly OC'd everything, even my damn RAM, I dropped in the -USEALLAVAILABLECORES launch option, and I'm willing to run the game on whatever it considers "auto" settings — and I still can't quite hit 1080p/60.
Hopefully they patch this soon; these issues feel like they could be fixed with optimization. I'm not going to give up on the game or drop a hasty bad review (AHEM!), but I am disappointed...
Please, whoever works on this game, iron out the framerate issues and you have a gem here.
I think that means there's hope this can be patched, as both games are from 20th Studios. Any cvar or launch-option changes in the meantime would be appreciated.
I had to raise the GPU core clock 100 MHz and the VRAM 300 MHz in GPU Tweak; it stayed stable and got me a good 20 extra fps, which I thought was crazy. I also got a huge performance gain in Everspace 2, which now runs at a locked 1080p/60 on Epic when it used to constantly dip into the 40s.
Just so this is also preserved somewhere: don't listen to the edgelords you find in tech-support forums through Google results. A bunch of clowns on Tom's Hardware and elsewhere are saying overclocking a 3060 is "pointless," will only yield 1–2 extra frames, and is "only for tinkerers." That's BS. I got 20 extra frames easily in my most demanding games, and temps hover around 73°C while the danger zone starts around 90°C.
Give OC'ing your GPU a try if you're savvy and able; it worked for me. I know this isn't really a "fix," but the devs dropped a huge patch today, so we'll see what happens.
It was a hotfix with 3 changes bro lol
Also, the 2070 has a VRAM advantage over the 3060: a wider memory bus of 256 bits versus the 3060's 192. The game seems to be VRAM-heavy for some reason (probably because the huge, handcrafted maps are loaded in their entirety the whole time), and that's probably also why you see such an improvement from your VRAM OC.
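For context, the bus-width difference translates directly into peak memory bandwidth (bus width in bytes × per-pin data rate). A quick back-of-envelope sketch, assuming the cards' stock GDDR6 data rates (14 Gbps on the 2070, 15 Gbps on the 3060):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 2070: 256-bit bus at 14 Gbps GDDR6
print(bandwidth_gb_s(256, 14.0))  # 448.0 GB/s
# RTX 3060: 192-bit bus at 15 Gbps GDDR6
print(bandwidth_gb_s(192, 15.0))  # 360.0 GB/s
```

So the 2070 ends up with roughly 25% more peak bandwidth despite its slightly slower memory, and a VRAM OC on the 3060 closes part of that gap.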
My conclusion is that the devs likely skipped the whole optimization pass because the game was "good enough" out of the box for their 60 fps target, since the game's assets aren't really demanding to run even without optimization.
And it's basically the same problem with nearly every game coming out nowadays; devs are like, "Need graphics optimization? Turn on DLSS/FSR2 to make your game look like ♥♥♥♥ but run fine."
And it's not even true. A 4090 can't push the game at 144 fps on a 4K monitor, even with FSR activated. I'm sometimes below 100 fps if FSR is off. I know that's not too bad, but it's a 4090, and what I'm seeing on screen doesn't justify that performance. It seems like the game needs optimization.
No it doesn't. It drops from 120 fps (capped) to 80 fps on many occasions in the first mission.
I have a 4090 as well; the FPS drop is real.
The first mission doesn't have a lot going on in terms of AI, and I have a 5800X3D, which should be enough.
I wouldn't be surprised if many of the perf problems come from maintaining high numbers of mines and motion trackers.