FXAA is essentially free, resource-wise, but provides crude AA at best and ruins the scene at worst. It's also post-processing based.
SMAA is a more accurate, better-quality, and computationally faster take on MLAA. Because it does far more analysis than a simple blur pass, it's more resource-heavy and 'slower' than FXAA, but still orders of magnitude lighter than MSAA and SSAA.
Unless you're really hurting for resources, SMAA should be the one and only AA you ever think of using for this game. If you're on a 700+ series Nvidia card, you can also enable MFAA through the control panel, which helps further enhance the scene.
Note: forgot I was on the GG forum. AA is short for anti-aliasing, and the ultimate goal is to reduce the 'jaggies' seen in video games. The best and most effective way is to supersample, but that's a brute-force method that slaughters performance. Thus MSAA, multi-sample AA, was invented: it gives almost identical results to real SSAA at a far smaller cost - which is still extremely steep, mind you. Try running whatever shooter you have installed in MSAA 8x mode and see for yourself.
MLAA and FXAA are 'light' AA techniques that use edge detection and blur filters to smooth jaggies after the triangles have already been drawn. They do their job, but create unreasonable blurring as a result, and FXAA in particular is infamous for how much it blurs scenes. SMAA is superior to both, but if you're on a bronze-age machine, your FPS might take a slight hit. Consoles, the Xbone in particular, are very fond of FXAA and MLAA, because they don't usually have the resources available for 'real' AA solutions.
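To make the supersampling point above concrete, here's a minimal Python sketch (purely illustrative, nothing from the game or any driver): render at twice the target resolution on each axis, then box-average every 2x2 block of samples down to one output pixel. Having to shade 4x the pixels is exactly why SSAA slaughters performance.

```python
import numpy as np

def ssaa_downscale(hi_res: np.ndarray, factor: int = 2) -> np.ndarray:
    """Box-filter a frame rendered at (factor*H, factor*W) down to (H, W).

    hi_res: float array of shape (factor*H, factor*W, 3) - the oversized render.
    Illustrative only; real SSAA/MSAA resolve steps run on the GPU, not in numpy.
    """
    h, w, c = hi_res.shape
    out_h, out_w = h // factor, w // factor
    # Group each factor-by-factor block of samples, then average the block.
    blocks = hi_res.reshape(out_h, factor, out_w, factor, c)
    return blocks.mean(axis=(1, 3))

# 4x SSAA for a 1080p target means rendering 3840x2160 and averaging it down:
oversized = np.random.rand(2160, 3840, 3)   # stand-in for the oversized render
frame = ssaa_downscale(oversized, factor=2)
print(frame.shape)                          # (1080, 1920, 3)
```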
Very detailed, thanks! I use SMAA.
Remember that with a 600-series Nvidia (or AMD equivalent), you always want texture quality forced to 'high.' Due to how it's constructed, GG is much more CPU-dependent than GPU-dependent, so as long as you have a 4000-series Intel (or AMD equivalent), you should see very little, if any, degradation in performance, no matter what settings you run - assuming you have the CPU to back it up.
Make sure max pre-rendered frames is set to 1. Having it higher is the primary reason people complain about perceived input delay, or pull the 'it's an online-only combo!' excuse. Gamma correction is also something that should be on, as it keeps the colors crisper, and GG is nothing but a colorful series.
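To put rough numbers on the pre-rendered frames setting: at a locked 60 FPS a frame takes about 16.7 ms, and each extra frame the driver is allowed to queue is roughly one more frame of delay between input and display. A back-of-the-envelope sketch (my own simplification, not anything measured in-game):

```python
# One frame at a locked 60 FPS is 1000 / 60 ~= 16.7 ms.
FRAME_MS = 1000 / 60

def render_queue_delay_ms(max_pre_rendered_frames: int) -> float:
    """Rough worst-case delay added by the driver's render-ahead queue alone."""
    return max_pre_rendered_frames * FRAME_MS

for setting in (1, 2, 3):
    print(f"max pre-rendered frames = {setting}: ~{render_queue_delay_ms(setting):.1f} ms queued")
# 1 -> ~16.7 ms, 3 -> ~50.0 ms: leaving it at 3 can cost about two extra frames of lag.
```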
For comparison purposes, the SEGA RingEdge 2, the arcade machine the game actually runs on - at 60 FPS, no less - has the following specs:
CPU: Intel Core i3-540, nominally clocked at 3.07 GHz.
GPU: Nvidia GeForce GT 545, the 1 GB VRAM ultra-budget version.
RAM: 4 GB DDR3.
Neither of the AA settings has any effect on input delay. See the big post above for what's most likely causing your perceived input delay problem. If input delay is a concern, you're better off investing in a badass monitor from the medical sector. I'm on an Eizo with a rated input delay of 0.01... but that monitor was over 900 bucks, and has all the features you'd expect of such a thing (flicker-free, 180-degree viewing angle, rated 15000:1 contrast, 120 Hz, VA panel...). Asus makes some very solid ones these days (they've been with EVO since the switch to LCDs for a reason), but they don't compare to what you can find outside the general consumer and gaming sections of online stores and the sponsored products on fightan game streams.
Also avoid vsync. Even with modern-day triple buffering and dedicated hardware on GPUs, it can still add a couple of frames of delay, and when 2 or 3 frames (33 to 50 ms at 60 FPS) can be the difference between a win and a loss, well, do the math on that.
I have an i5-4460, 8 GB of RAM, and an Nvidia GeForce GTX 970.
Thank you in advance, very interesting posts.
Possibly. There are a lot of little things that can impact input delay, but as long as you follow common-sense PC logic, you should be able to minimize it. You can also measure and track frametime, which is a bit like measuring the delay imposed by your configuration as a whole, rather than by just a faulty component, poor coding, or what have you. Since GG is a fighting game, it's always going to run at 60 FPS, which means a relatively high frametime, so any variable you introduce (unreliable USB cable, trash 80-dollar monitor, HDMI rather than DisplayPort, vsync...) has much more of an effect on the overall experience than in something like Counter-Strike or Quake, where more FPS = better and going over/under 60 doesn't actually cause mechanical problems. Going from a 22 ms frametime to 26 ms in GG is quite, quite impossible not to notice. I currently play at 16 ms, for those who want a measuring stick.
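If you want to track frametime yourself, the idea is just to log the time between consecutive frames. A minimal Python sketch of the principle (in practice you'd use an overlay such as RTSS/MSI Afterburner rather than code; render_frame here is a hypothetical stand-in for one frame of work):

```python
import time

def log_frametimes(render_frame, n_frames: int = 600) -> list[float]:
    """Record the time between consecutive frames, in milliseconds."""
    frametimes = []
    last = time.perf_counter()
    for _ in range(n_frames):
        render_frame()                       # hypothetical per-frame work
        now = time.perf_counter()
        frametimes.append((now - last) * 1000.0)
        last = now
    avg = sum(frametimes) / len(frametimes)
    print(f"avg {avg:.1f} ms (~{1000.0 / avg:.0f} FPS), worst spike {max(frametimes):.1f} ms")
    return frametimes

# At a locked 60 FPS the average should sit around 16.7 ms; a shift from
# ~22 ms to ~26 ms is the kind of change the post above says you'll feel.
```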
I recommend Nvidia Profile Inspector. It's functionally very similar to Nvidia's own control panel, but you get far more control over everything, with the caveat that you often have to identify games and processes yourself, as the program won't list all of them on its own. Most of what I listed is available in the basic control panel, or should be - I haven't actually used it in quite some time, embarrassingly enough.
However, if you go with Profile Inspector, you'll be locked out of DSR, which can only ever be toggled in Nvidia's own control panel, for reasons that elite hackers have yet to figure out.
You'll find things like gamma correction, pre-rendered frames, transparency multisampling, and the toggle that switches between 'multisample' and 'supersample x[number]' under 'Manage 3D settings.' If it's not there, you have to look up the game on the tab to the right, called something like 'Program settings,' or possibly go there and manually make a profile for it. The basic Nvidia control panel is quite a powerful tool if you know what the settings do.
OK, so I first have to set the game resolution to 3840x2160 in the REDSystemSettings.ini file, and then enable DSR 4.00x (is this the right number for a 1080p monitor?) in the Nvidia control panel?
What about using GeDoSaTo for the downsampling and the Nvidia inspector for the rest?
Pick one, see if it works without a CTD, then just roll with it. Note that DSR, while lighter than 'proper' MSAA or SSAA, is still a very demanding setting, but since GG is all models, no textures, you shouldn't have too much trouble - assuming the game actually supports it. There's a more complicated way we used before DSR, but that's like five Steam posts to explain.
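On the 'is 4.00x right for a 1080p monitor' question: DSR factors scale the total pixel count, so 4.00x on 1920x1080 comes out to exactly 3840x2160, which also shows why it's demanding. Quick sanity check (plain arithmetic, nothing game-specific):

```python
def dsr_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    """DSR factors multiply total pixels, so each axis scales by sqrt(factor)."""
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

w, h = dsr_resolution(1920, 1080, 4.00)
print(w, h)                                              # 3840 2160
print((w * h) / (1920 * 1080), "x the pixels to shade")  # 4.0 x the pixels to shade
```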
Remember also that GG has a thing where it only allows this one specific aspect ratio. It's a conscious decision by the devs, and no matter how much you try to hack the game, you're not gonna be able to bypass it.
Nvidia Inspector is the power user's way to overclock and manage fans and voltages. Nvidia Profile Inspector is the power-user version of the base Nvidia control panel, and far easier to use than Nvidia Inspector, which can actually break things. Profile Inspector can't, as it doesn't have that kind of control over your hardware.
Last week the game simply refused to stay at my monitor's native resolution, no matter what I tried. My only option was to enable virtual resolution in the Crimson driver; it's running fine at virtual 1080, except I noticed the HUD is a little blurred, since it doesn't scale as well as the 3D models. Since this game is very light on any reasonable GPU, there was no hit to performance.
Except... well, my LCD monitor only has a VGA connection, and I play through a VGA-to-DVI adapter, meaning I get a digital > analog and then analog > digital conversion... too bad none of my friends have a DisplayPort monitor for me to test input lag against.
That would be a sign that Xrd and Rev don't support DSR, yes. You could try forcing resolutions through the config/ini files, but those may or may not be locked out, depending on how Arc constructed the game and configured Unreal Engine in general.
Why are you using VGA in the year 2017, my man?