I just wrote a guide too, actually. Here's my own approach:
http://steamcommunity.com/sharedfiles/filedetails/?id=778281147
That means your GPU is the bottleneck if you get the same FPS on both Low and Ultra! I used to have a 270X and had the same problem, since 2 GB of VRAM isn't enough to play on Ultra in the first place. Since I upgraded to a GPU with more than the required 4 GB of VRAM, the settings make a big difference from Low to Ultra.
Nope, all that parameter stuff is just garbage; it's leftover nonsense from the Arma 1 days. Feel free to use it, it won't break your game, but it's all a waste of time.
In the Arma config I have found that it detects my CPU properly, but I still choose to set the CPU count and extra threads. I assume -enableHT helps on Intel CPUs that have hyper-threading. The memory limit is useful if you don't have much RAM and don't want Windows bugging you that it's almost out of memory.
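For anyone who wants to try them anyway, these go in Steam's Launch Options (right-click Arma 3 > Properties > Set Launch Options). The values below are just an illustrative sketch sized for a quad-core CPU with 8 GB of RAM, not recommended settings; see the BI wiki linked further down for what each one does:

    -cpuCount=4 -exThreads=7 -maxMem=2047 -noSplash

-noSplash just skips the intro splash screens; the other three are the core count, extra-threads, and memory-limit parameters being discussed here.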
As for -enableHT and all that, it has been documented time and time again, even by Bohemia themselves, that it does nothing. It is leftover code from Arma 2 and the alpha version of Arma 3. As I said earlier, it won't decrease performance or hurt your game, but it won't do anything else either.
It tells the engine to utilize hyper-threaded (non-native) cores with micro-jobs.
You can't use it with -cpuCount=, as that negates it...
https://community.bistudio.com/wiki/Arma_3_Startup_Parameters#Performance
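In other words, going by that wiki page, the two flags don't stack. Roughly (illustrative only, not an official ruling):

    -enableHT                the engine may schedule micro-jobs on the logical (hyper-threaded) cores
    -cpuCount=4 -enableHT    the explicit core count takes over, and -enableHT is effectively ignored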
Well, Dwarden, most players I know and I have never seen any kind of tangible or significant improvement from running those parameters. I quite literally got zero benefit from them.
But you're a dev, so I'll take your word for it.
My question is: why doesn't the game auto-detect these settings? Why are users in 2016 required to write complex, abstract launch commands to start the game, like it's some kind of MS-DOS program from 1992? I mean, come on.
Anyway, like I said, I've seen far bigger performance improvements from running the game on an SSD, balancing the quality and post-processing settings, and adjusting my draw distance.
To the ones saying something about the auto-detect: I think that only covers the basic in-game settings. I don't think it has anything to do with making sure Arma uses, say, all 8 cores, every single thread of the CPU, or all of the GPU's VRAM. You can even go into the config file and change some things.
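For example, here are a couple of lines people commonly tweak in Arma3.cfg (in Documents\Arma 3). I'm quoting these from memory, so treat them as an illustration and check your own file; setting both to 1 is a popular input-latency tweak:

    GPU_MaxFramesAhead=1000;
    GPU_DetectedFramesAhead=1;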
I'm not running on a crappy toaster with Win95; I have a nice rig. But I have played with different settings, and I HAVE noticed big differences being on an SSD compared to an HDD. I HAVE also noticed a difference using the parameters and the 'custom' config.
I agree 100 percent with everything in here, but just from MY experience, the parameters seem to do something. Maybe if you have a higher-end rig than normal, you won't see changes from the parameter settings; that could be why so many say they do nothing. I'm not sure. Just my thoughts.