GUILTY GEAR Xrd REV 2

Which one? FXAA, MLAA, or SMAA?
Kicho Feb 8, 2017 @ 3:14am 
FXAA for less input delay.
Vassago Rain Feb 8, 2017 @ 3:32am 
MLAA is an Intel and AMD AA solution that gives similar results to MSAA - in theory - at a much reduced resource cost. Post-processing based. In general, MLAA is very similar to FXAA.
FXAA is essentially free, resource-wise, but provides crude AA at best and ruins the scene at worst. Also post-processing based.
SMAA is a more accurate, better-quality evolution of MLAA. It's also post-processing based; it costs a bit more than FXAA and MLAA, but it's still orders of magnitude lighter than MSAA and SSAA.

Unless you're really hurting for resources, SMAA should be the one and only AA you ever think of using for this game. If you're on a GTX 900-series or newer Nvidia card, you can also enable MFAA through the control panel, which can further improve the scene (note that MFAA only kicks in when MSAA is active).

Note: forgot I was on the GG forum. AA is short for anti-aliasing, and the ultimate goal is to reduce the 'jaggies' seen in videogames. The best, and absolutely most effective, way is to supersample, but this is a brute-force method that slaughters performance. Thus MSAA, multi-sample AA, was invented. It gives results very close to real SSAA on geometry edges, at a far smaller cost - which is still extremely steep, mind you. Try running whatever shooter you have installed in MSAA 8x mode and see for yourself.

MLAA and FXAA are 'light' AA techniques that detect edges in the finished frame and blend across them to smooth jaggies after the triangles have been drawn. They do their job, but create noticeable blurring as a result, and FXAA in particular is infamous for how much it blurs scenes. SMAA is superior to both, but if you're on a bronze age machine, your FPS might take a slight hit. Consoles, the xbone in particular, are very fond of FXAA and MLAA, because they don't usually have the resources available for 'real' AA solutions.
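For the numbers people, here's a rough back-of-the-envelope sketch of why the costs differ so much. The multipliers are illustrative assumptions about shading work only, not measurements of this game, and they ignore the extra bandwidth MSAA needs and the full-screen pass the post-process techniques add:
[code]
# Rough shading-work comparison at 1080p (illustrative assumptions only).
NATIVE_PIXELS = 1920 * 1080

def shaded_samples(technique: str) -> int:
    if technique == "SSAA 4x":
        return NATIVE_PIXELS * 4   # every pixel shaded four times
    if technique == "MSAA 4x":
        return NATIVE_PIXELS       # ~one shade per pixel, four coverage samples stored
    if technique in ("FXAA", "MLAA", "SMAA"):
        return NATIVE_PIXELS       # normal render, then one cheap pass over the image
    raise ValueError(technique)

for t in ("SSAA 4x", "MSAA 4x", "SMAA"):
    print(f"{t}: ~{shaded_samples(t) / NATIVE_PIXELS:.0f}x the native shading work")
[/code]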
Last edited by Vassago Rain; Feb 8, 2017 @ 4:00am
Alfred Y.Z. Feb 8, 2017 @ 4:24am 
Originally posted by Vassago Rain:
MLAA is an Intel and AMD AA solution that gives similar results to MSAA - in theory - at a much reduced resource cost. [...] Unless you're really hurting for resources, SMAA should be the one and only AA you ever think of using for this game.

Very detailed, thanks! I use SMAA.
Heavymetalmixer Feb 8, 2017 @ 7:24am 
FXAA is the best one IMO: not the best look, but the lowest input delay and resource needs.
Last edited by Heavymetalmixer; Feb 8, 2017 @ 10:38am
Vassago Rain Feb 8, 2017 @ 7:41am 
While I'm on the subject: for those who are interested in the best graphical experience and have at least above-average hardware, Xrd and Rev focus on big, detailed MODELS rather than textures layered upon textures upon textures (western AAA shooter standard fare), so it's not that demanding to force things like transparency multisampling, 8x supersampling (you'll need at least 4 gigs of VRAM for this), or to fall back on the good old DSR setting if you're on Nvidia. Remember that DSR resolutions don't just magically take care of themselves; they show up in the game's actual resolution list. Enable the 4K resolution there and DSR will downsample the 4K game output to whatever your native screen resolution is. If this is an option, you can essentially turn off all forms of AA, since rendering at 4K, like SSAA and MSAA, is a far superior way of getting rid of jaggies - but DSR isn't a realistic option for budget computers, or even weaker 'gaming' rigs.

Remember that with a 600-series Nvidia (or AMD equivalent), you always want texture quality forced to 'high.' Due to how it's constructed, GG is much more CPU-dependent than GPU-dependent, so as long as you have a 4000-series Intel (or AMD equivalent), you should see very little, if any, degradation in performance, no matter what settings you run - assuming you have the CPU to back it up.

Make sure max pre-rendered frames is set to 1. Having it higher is the primary reason people complain about perceived input delay, or pull the 'that combo only works online!' card. Gamma correction is also something that should be on, as it keeps the colors crisper, and GG is nothing if not a colorful series.
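Rough math on the queue, under the simple assumption that each queued frame can hold your input back by up to one full frame at a locked 60 FPS - a worst-case sketch, not a measurement:
[code]
# Worst-case extra buffering from the pre-rendered frame queue at 60 FPS.
# Assumption: each queued frame can delay your input by up to one frame.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame

for queued in (1, 2, 3):
    print(f"max pre-rendered frames = {queued}: "
          f"up to ~{queued * FRAME_MS:.1f} ms (~{queued}f) of added buffering")
[/code]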

For comparison purposes, the Sega RingEdge 2, the arcade board the game actually runs on - at 60 FPS, no less - has the following specs:

CPU: i3-540, nominally clocked at 3.07 GHz.
GPU: Nvidia GeForce GT 545, ultra-budget 1 GB VRAM version.
RAM: 4 GB DDR3.
Vassago Rain Feb 8, 2017 @ 7:51am 
Originally posted by heavymetalmixer:
FXAA is the best one IMO: not the best look, but the lowest input delay and resource needs.

None of the AA settings has any real effect on input delay. See the huge post above for what's most likely causing your perceived problem with input delay. If input delay is a concern, you're better off investing in a badass monitor from the medical sector. I'm on an Eizo with a rated input delay of 0.01... but that monitor was over 900 bucks, and has all the features you'd expect of such a thing (flicker-free, 180-degree viewing angle, rated 15000:1 contrast, 120 Hz, VA panel...). Asus makes some very solid ones these days (they've been with EVO since it switched to LCDs for a reason), but they don't compare to what you can find outside the general consumer and gaming sections of the online stores and the sponsored products on fightan game streams.

Also avoid vsync. Even with modern-day triple buffering and dedicated hardware on GPUs, when 2 or 3 frames can be the difference between a win and a loss... well, you do the math on that.
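Doing that math (assuming classic double/triple buffering holds one to three extra finished frames - an assumption, not a measurement of this game):
[code]
# What a few buffered frames mean in milliseconds at a locked 60 FPS.
FRAME_MS = 1000 / 60

for extra_frames in (1, 2, 3):
    print(f"{extra_frames} buffered frame(s) = ~{extra_frames * FRAME_MS:.1f} ms of added display delay")
[/code]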
Last edited by Vassago Rain; Feb 8, 2017 @ 7:54am
stronzolo Feb 8, 2017 @ 9:47am 
Originally posted by Vassago Rain:
While I'm on the subject: for those who are interested in the best graphical experience and have at least above-average hardware [...] it's not that demanding to force things like transparency multisampling, 8x supersampling, or to fall back on the good old DSR setting if you're on Nvidia. [...]
Ahem, would you mind explaining how to do that, pretty please? Do I need nv-inspector?
I have an i5-4460, 8 gigs of RAM and an Nvidia GeForce GTX 970.
Thank you in advance, very interesting posts.
Last edited by stronzolo; Feb 8, 2017 @ 9:48am
Originally posted by Vassago Rain:
Make sure max pre-rendered frames is set to 1. Having it higher is the primary reason people complain about perceived input delay [...]
Does this explain why it added 1f of input latency in the tests someone did on PC?
Vassago Rain Feb 8, 2017 @ 1:32pm 
Originally posted by cutie:
Does this explain why it added 1f of input latency in the tests someone did on PC?

Possibly. There are a lot of little things that can impact input delay, but as long as you follow common-sense PC logic, you should be able to minimize it. You can also measure and track frametime, which is a bit like measuring the delay imposed by your configuration as a whole, rather than by just a faulty component, poor coding, or what have you. Since GG is a fighting game, it's always going to run at 60 FPS, which means you'll have a relatively high frametime (about 16.7 ms per frame), so any variable you introduce (unreliable USB cable, trash 80-dollar monitor, HDMI cable rather than DisplayPort, vsync...) has much more of an effect on the overall experience than in something like Counter-Strike or Quake, where more FPS = better and going over/under 60 doesn't actually cause mechanical problems. Going from 22 ms to 26 ms in GG is quite, quite impossible not to notice. I currently play at 16 ms, for those who want a measuring stick.
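If you want to see what 'frametime' actually means, here's a toy sketch - not how the game or any real overlay measures it, just the idea of timing each frame against the ~16.7 ms budget:
[code]
import time

# Toy illustration of frametime: how long each "frame" takes versus the 60 FPS
# budget of ~16.7 ms. Real tools measure this at the driver/present level;
# this loop only fakes a frame's worth of work with a sleep.
TARGET_MS = 1000 / 60

prev = time.perf_counter()
for frame in range(5):
    time.sleep(TARGET_MS / 1000)          # stand-in for one frame of work
    now = time.perf_counter()
    print(f"frame {frame}: {(now - prev) * 1000:.1f} ms (budget {TARGET_MS:.1f} ms)")
    prev = now
[/code]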



Originally posted by stronzolo:
Ahem, would you mind explaining how to do that, pretty please? Do I need nv-inspector?
I have an i5-4460, 8 gigs of RAM and an Nvidia GeForce GTX 970.
Thank you in advance, very interesting posts.

I recommend nvidia profile inspector. It's very similar to Nvidia's own control panel, functionally at least, but it gives you far more control over everything, with the caveat that you often have to identify games and processes yourself, as the program won't list all of them on its own. Most of what I listed is available in the basic control panel, or should be - I haven't actually used it in quite some time, embarrassingly enough.

However, if you go with the profile inspector, you'll be locked out of DSR, which can only be toggled in Nvidia's own control panel, for reasons that elite hackers have yet to figure out.

You'll find things like gamma correction, pre-rendered frames, transparency multisampling, and the toggle that switches between 'multisample' and 'supersample x[number]' under 'Manage 3D Settings.' If a setting isn't there, look the game up under the tab to the right, called something like 'Program Settings,' or go there and manually make a profile for it. The basic Nvidia control panel is quite a powerful tool if you know what the settings do.
Last edited by Vassago Rain; Feb 8, 2017 @ 2:04pm
stronzolo Feb 8, 2017 @ 2:24pm 
Originally posted by Vassago Rain:
I recommend nvidia profile inspector. [...] You'll find things like gamma correction, pre-rendered frames, transparency multisampling, and the multisample/supersample toggle under 'Manage 3D Settings.' [...]

OK, so first I have to set the game resolution to 3840x2160 in the REDSystemSettings.ini file, and then enable DSR 4.00x (is this the right number for a 1080p monitor?) in the Nvidia control panel?
What about using GeDoSaTo for the downsampling and the Nvidia inspector for the rest?
Last edited by stronzolo; Feb 8, 2017 @ 2:39pm
Vassago Rain Feb 8, 2017 @ 2:48pm 
No, DSR is far simpler than that. Enable the DSR factors you want under 'DSR - Factors' in the 3D settings (all of them), set the smoothness slider to where you think it looks right, and then, if the game supports the feature (99.9% of DirectX programs do), go to the usual resolution tab under options within GG Rev. You'll see that you can now pick things like 2600xwhatever, 3200xwhoknows, and 4600xmchuge.
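To answer the 4.00x question: the DSR factor scales the total pixel count, so 4.00x on a 1080p monitor renders at 3840x2160. A quick sketch of the mapping (resolutions are approximate - Nvidia rounds them slightly):
[code]
# DSR factor -> approximate render resolution for a 1920x1080 display.
# The factor multiplies the pixel count, so the per-axis scale is its square root.
NATIVE_W, NATIVE_H = 1920, 1080

for factor in (1.50, 2.00, 2.25, 3.00, 4.00):
    scale = factor ** 0.5
    print(f"DSR {factor:.2f}x -> ~{round(NATIVE_W * scale)}x{round(NATIVE_H * scale)}")
[/code]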

Pick one, see if it works without crashing to desktop, then just roll with it. Note that DSR, while lighter than 'proper' MSAA or SSAA, is still a very demanding setting, but since GG is all models and light on textures, you shouldn't have too much trouble - assuming the game actually supports it. There's a more complicated way we used before DSR existed, but that's like five Steam posts to explain.

Remember also that GG only allows one specific aspect ratio. It's a conscious decision by the devs, and no matter how much you try to hack the game, you're not going to be able to bypass it.

Nvidia Inspector is the power user's way to overclock and manage fans and voltages. Nvidia Profile Inspector is the power user version of the base Nvidia control panel, and far easier to use than Nvidia Inspector, which can actually break things. Profile Inspector can't, as it doesn't have that kind of control over your hardware.
Last edited by Vassago Rain; Feb 8, 2017 @ 2:49pm
stronzolo Feb 8, 2017 @ 3:14pm 
Originally posted by Vassago Rain:
No, DSR is far simpler than that. Enable the DSR factors you want under 'DSR - Factors' in the 3D settings (all of them) [...] go to the usual resolution tab under options within GG Rev, and you'll see that you can now pick things like 2600xwhatever, 3200xwhoknows, and 4600xmchuge. [...]
I did as you said, but the in-game resolution option doesn't show anything besides 1920x1080 fullscreen and lower. Does that mean GG Rev doesn't support it?
strider_sp Feb 8, 2017 @ 4:11pm 
The game was running fine before, at my monitor's native resolution of 1600x900 with MSAA. Rev, like Sign before it, took me some time fiddling with config files to get it running at this unsupported resolution, and I succeeded - for a while.

Last week the game simply refused to keep my monitor's native resolution, no matter what I tried. My only choice was to enable virtual super resolution in the Crimson driver; it's running fine at virtual 1080p, except I've noticed the HUD is a little blurred, since it doesn't scale as well as the 3D models. Since this game is very light on any reasonable GPU, there was no performance hit.

Except... well, my LCD monitor only has a VGA connection, and I play through a VGA-to-DVI adapter, meaning I get a digital > analog and then analog > digital conversion... too bad none of my friends has a DisplayPort monitor for me to test input lag with.

Vassago Rain Feb 9, 2017 @ 2:32pm 
Originally posted by stronzolo:
I did as you said, but the in-game resolution option doesn't show anything besides 1920x1080 fullscreen and lower. Does that mean GG Rev doesn't support it?

That would be a sign that Xrd and Rev don't support DSR, yes. You could try forcing resolutions through the config/ini files, but those may or may not be locked down, depending on how ARC built the game and configured Unreal Engine in general.
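If you want to try the ini route, here's a hedged sketch of the idea. The [SystemSettings] section and the ResX/ResY keys are assumptions based on stock Unreal Engine 3 configs - check your own REDSystemSettings.ini for the real section and key names, back the file up first, and be aware the game may simply ignore or overwrite the change:
[code]
# Sketch only: bump the render resolution in a UE3-style ini.
# Assumptions: hypothetical relative path, stock UE3 [SystemSettings] ResX/ResY keys.
# Note that configparser will normalize the file's formatting when it writes.
from configparser import ConfigParser
from pathlib import Path

ini_path = Path(r"REDGame\Config\REDSystemSettings.ini")  # hypothetical path - adjust to your install

cfg = ConfigParser(strict=False, interpolation=None)
cfg.optionxform = str            # keep the original key casing
cfg.read(ini_path)

if not cfg.has_section("SystemSettings"):
    cfg.add_section("SystemSettings")
cfg.set("SystemSettings", "ResX", "3840")
cfg.set("SystemSettings", "ResY", "2160")

with ini_path.open("w", encoding="utf-8") as f:
    cfg.write(f)
[/code]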



Originally posted by strider_sp:
Except... well, my LCD monitor only has a VGA connection, and I play through a VGA-to-DVI adapter [...] too bad none of my friends has a DisplayPort monitor for me to test input lag with.

Why are you using VGA in the year 2017, my man?
Heavymetalmixer Feb 9, 2017 @ 8:05pm 
Originally posted by Vassago Rain:
Why are you using VGA in the year 2017, my man?
I do too.

Date Posted: Feb 8, 2017 @ 3:08am
Posts: 18