1. Play less intensive game modes with smaller numbers of AI/players. (This can give you a huge FPS increase, because the game doesn't have to calculate as much.)
2. Play on a better server. This might grant you a slight FPS increase.
3. Use a processor and RAM with the highest clock speed you can afford. (At least 4.0 GHz for the CPU and at least 3000 MHz for the RAM is ideal.)
4. If you have an Nvidia graphics card, you can force Arma to run physics calculations on it rather than your CPU to free up resources. Google it. (5-10 FPS increase)
Edit: After some research I found that method 4 doesn't actually work, because Arma does not support PhysX. My mistake.
5. Reduce your draw distance. Enemies can and will shoot you from beyond your draw distance if it's too low, however, and your teammates may call out enemies beyond what you can see. I generally keep mine around 1500. (5-10 FPS increase; see the sketch after this list.)
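If you'd rather set this outside the game, the value lives as a plain-text key in your .Arma3Profile file. Below is a minimal Python sketch of editing it; the file path and the viewDistance key name match my install, so treat both as assumptions and keep the backup it writes. (In-game, the Video settings slider does the same job.)

```python
import re
from pathlib import Path

# Assumed location and profile name; check your own install first.
profile = Path.home() / "Documents" / "Arma 3" / "YourName.Arma3Profile"

text = profile.read_text()
profile.with_name(profile.name + ".bak").write_text(text)  # keep a backup

# viewDistance is stored in metres, e.g. "viewDistance=1500;"
profile.write_text(re.sub(r"viewDistance=\d+;", "viewDistance=1500;", text))
```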
Good luck, and here's hoping that Arma 4, whenever it gets made, runs on an optimized engine. If you stick with Arma 3, 25 FPS will become surprisingly playable, albeit unacceptable.
Exile normally runs very well; with 80+ players you still get 60 FPS most of the time.
What are your specs?
i5-6400
32 GB RAM
>i5-6400
Found the issue.
The base clock speed of 2.7 GHz, and even the max boost of 3.3 GHz, is extremely low for Arma. If you don't plan on upgrading your processor soon, then expect that sort of framerate for a while. The only way to get a significant FPS boost is to buy a better one, I'm sorry to say.
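For a rough sense of what clock speed alone buys, here's a back-of-the-envelope sketch. It assumes the game is bottlenecked on one thread so FPS scales roughly linearly with clock, and it ignores IPC differences between CPU generations, so the numbers are illustrative only:

```python
# Crude estimate: assume a single-thread bottleneck, so FPS scales
# ~linearly with clock speed. Ignores IPC gains between generations.
current_fps   = 15    # reported above
current_clock = 3.3   # i5-6400 max boost, GHz
target_clock  = 4.4   # hypothetical faster chip, GHz

estimated_fps = current_fps * target_clock / current_clock
print(f"~{estimated_fps:.0f} FPS")  # ~20 FPS: clock alone is no cure
```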
So when you got 15 FPS on Exile, did your CPU hit 100% usage?
Only in terms of playing Arma 3!
Dude, high CPU usage isn't a good thing. It can hard-lock your entire computer. It's normal for a high-performing program to use only 20-25%, because there has to be room for your computer to do other processes. I've done a lot of 3D rendering in the past, and even that, one of the most intensive workloads you can run on a computer, doesn't use 100% of your CPU.
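One thing worth checking on your own machine: an overall figure like 25% can hide one core pinned at 100% while the rest idle, which looks very different from an evenly spread 25%. A quick sketch using the third-party psutil package (pip install psutil); run it while Arma is up:

```python
import psutil  # third-party: pip install psutil

# Sample per-core load once a second while the game runs. One core
# near 100% with a modest overall average points to a single-thread
# bottleneck rather than comfortable headroom.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    total = sum(per_core) / len(per_core)
    print(f"total {total:5.1f}%  cores {per_core}")
```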
And the server itself does not impact client performance. A server that isn't powerful enough for a highly populated mission like Exile will result in low server FPS (under 5), and there will be a lot of desync or rollback, but no client performance drop. (For example, a mission with a lot of server-side errors will reduce the server FPS to 1, but the client can still get a solid 60 FPS; there will only be rollback and desync.)
Only a bad mission can drag down client performance, so chances are he's playing on a modified Exile, maybe with mods that aren't performance-friendly.
If your 4-core 3.2 GHz CPU sits at 60% while you're getting 15 FPS on a server, you can get a 4 GHz i7 and it will change nothing on that same server.
A powerful i7 is the best for Arma, but not because of a higher clock: compare it with a 4 GHz FX-8350 and the i7 will do better because it shares its memory cache between cores (Smart Cache), so it's more fluid.
But really, I've played this game on a Phenom, an FX, an i5, and an i7, and it changes nothing. I've tried different missions and different servers: if you get bad frames on a ♥♥♥♥ server or a ♥♥♥♥ mission, the CPU will change nothing. The only difference with Intel (i5 or i7) is less micro-stuttering than on AMD during huge firefights, because Intel shares the CPU cache and AMD doesn't.
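The cache effect itself is easy to demonstrate outside of Arma. Here's a small sketch (needs numpy) that times random reads over a working set that fits in cache versus one that spills to RAM; the per-read cost jumps once you leave the cache, regardless of clock speed. Exact numbers will vary by machine:

```python
import time
import numpy as np

def ns_per_read(n_bytes):
    """Average cost of a random 8-byte read over an n_bytes working set."""
    data = np.zeros(n_bytes // 8, dtype=np.int64)
    idx = np.random.permutation(data.size)  # random order defeats the prefetcher
    t0 = time.perf_counter()
    data[idx].sum()                         # gather forces one access per element
    return (time.perf_counter() - t0) / data.size * 1e9

print(f"fits in cache : {ns_per_read(256 * 1024):6.1f} ns/read")         # 256 KB
print(f"spills to RAM : {ns_per_read(256 * 1024 * 1024):6.1f} ns/read")  # 256 MB
```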
Not sure how much Arma you've played, but I disagree with most of this post.
1. A bad server can definitely slow client-side performance. Before my group got a privately owned server, we had to restart our monthly-funded server box two or three times per session because of FPS lag. The mission we play is KP Liberation, which is extremely stable. On this new server there is no FPS lag as a result of server frame drops. Desync and rollback are due to poor connection.
2. A higher clock speed will give you better performance, that is irrefutable. Like I said earlier, any program worth a damn will cap how much of the CPU it uses, because if it starts sucking up too many resources it can hard-lock the computer. Arma will always use the same percentage of your processor's resources; the faster the CPU can process that information, the better your game will run.
3. Stuttering during firefights is usually due to low VRAM, because the game is rendering particle effects in close detail. It's basically the only performance issue I can think of that is affected by the GPU. (If you want to check this yourself, see the snippet below.)
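For anyone wanting to test the VRAM theory on their own machine: on an NVIDIA card, the nvidia-smi tool reports memory use, and you can poll it during a firefight. A sketch, assuming nvidia-smi is on your PATH:

```python
import subprocess
import time

# Poll VRAM usage once a second during a firefight. If "used" climbs
# toward "total" right when the stutter hits, low VRAM is a suspect.
for _ in range(30):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())  # e.g. "3512 MiB, 8192 MiB"
    time.sleep(1)
```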
Most of the time it's because the server is overloaded; a connectivity issue more often results in a lost connection.
If a higher clock speed gives better performance in Arma 3, why does an old 3.2 GHz Phenom beat a 3.9 GHz A10? Because the Phenom has 6 MB of cache and the A10 only 2 or 4...
Why does a 3.3 GHz i5 with 6 MB of cache beat a 4 GHz FX-8370 with 8 MB? Because Intel shares the cache.
So in just two examples, the two higher-clocked CPUs are worse than the weaker ones, because the weaker ones have more cache or better cache management.
Arma has an engine limitation, and a higher clock speed will change nothing. I have run servers for years and set up different builds for this game (for myself and friends), for both server and client. The best results came from more cache, on AMD and Intel alike.
A computer slows down when CPU load goes above 80%. The average usage for a 4-core 3 GHz CPU in Arma 3 is around 60%, so every step gets executed, and a higher clock just has to work less to execute the same steps. When a mission's performance gets really bad (under 20 FPS), clock speed changes nothing.
The best example is to compare a 16-thread 4 GHz Ryzen with 16 MB of cache against a 3.3 GHz i5 with 6 MB.
The i5 works better than that Ryzen in Arma 3, and why? Same story: there is more cache per core on the i5 (because of Smart Cache).
Take two Ryzens, a 16-thread 4 GHz with 16 MB of cache and a 16-thread 3.7 GHz with 20 MB: the winner is the 3.7 GHz one...
I've given you four examples: in Arma, the better CPU cache management wins, not the frequency.
If I create a mission for a server with a large amount of AI (no custom scripts or mods), but tune it for an average client FPS of 40, so that it pushes the server while keeping good frames on the client, then if I use my server to its maximum everything will be fine... If I do the same thing on the same server but allow the server only one core, what happens?
The AI will stop responding, there will be a lot of rollback and desync, server CPU will hit 100%, server FPS will drop to 1, but client FPS will not change.
So you can say whatever you want: the mission impacts client FPS, not the server (AI near the player, custom scripts, custom mods)...
As for VRAM, you're wrong. I've tested different CPUs on the same graphics card, and it's always the same answer: no micro-stuttering with better cache management.
Every game is different: for some games a higher frequency is better, or more cores... or more CPU cache.