That's the best thing about using resolutions above 1080p: you don't need AA. From 1440p to 4K, no AA is needed. That's mostly the reason why I switched over to PC gaming - the consoles were using FXAA on every game that came out. Well, that, and getting to play at a nice, smooth 60 FPS.
However, this is how I feel about AA in the games that I play. I'm sure some people use AA even at 4k resolutions.
I just tried messing with the new ReShade, and it was too painful and tedious, so I decided against it. I'd always rather stick with vanilla settings if possible.
Since you are playing at 4K as well - what are your graphics settings at the moment (the other visual & advanced settings)? Are you using one of the presets, or custom?
Any special tweak to improve performance or visuals that I should know about?
EDIT: Oh and do you use Vsync and/or Triple Buffering in this game?
I disabled the auto/dynamic resolution setting (I don't remember exactly what it's called) in Advanced. I also turned on HBAO+, and use Vsync (60 FPS). Not sure about Triple Buffering.
Do you turn on both Vsync and Triple Buffering?
I also disabled Adaptive Resolution altogether, not sure why it's on by default.
Any ideas for good tweaks in case the game gets heavy (things to lower without losing much visual quality and detail)?
I still can't quite figure out the actual pros and cons of Triple Buffering.
What are the clear benefits vs. drawbacks?
I tried it on and off in D2 and saw no real difference - but I didn't test for long. I feel like I'm better off disabling it if it does nothing.
Another thing worth mentioning - I've rarely ever noticed the drop from 60 to 45 FPS that people say double-buffered Vsync (normal Vsync) causes. It happened to me in a few games in the distant past - but with modern, optimized games I can drop to any FPS (the 50s, the 40s, etc.) without skipping and jumping from 60 to 45 because of a few frames of loss. At least I don't think it happens automatically as soon as you go below 60, or it would happen a lot - because many games have stutters and such.
For instance - the tutorial in Dishonored 2 has a place you need to crawl under, with some rats. The FPS will drop from 60 to 57-60, even with Vsync ON and TB off.
So I am not sure why people say Vsync always drops from 60 to 45 the moment you lose a single frame of performance!
Can someone explain this?
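A rough mental model (this is the textbook double-buffered behavior, not anything specific to this game): with double buffering, a finished frame has to wait for the next vblank, so any frame that takes even slightly longer than ~16.7 ms ends up occupying two refresh intervals and displays at 30 FPS. The often-quoted "45" is just frames alternating between those two rates and the counter averaging them. Triple buffering adds a third buffer so the GPU never stalls on the swap. A toy Python simulation of that quantization:

```python
# Toy model of frame pacing under vsync on a 60 Hz display.
# Assumption: with double buffering the swap waits for the next vblank,
# so a frame's on-screen time is rounded UP to a whole number of
# refresh intervals. With triple buffering the GPU keeps rendering into
# a third buffer, so throughput is only capped, not quantized.
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between vblanks

def double_buffered_fps(render_ms):
    """Displayed fps when every swap must wait for the next vblank."""
    displayed_ms = math.ceil(render_ms / VBLANK_MS) * VBLANK_MS
    return 1000.0 / displayed_ms

def triple_buffered_fps(render_ms):
    """Displayed fps when rendering never stalls on the swap."""
    return min(1000.0 / render_ms, REFRESH_HZ)

for render_ms in (15.0, 17.0, 20.0, 34.0):
    print(f"{render_ms:5.1f} ms render -> "
          f"double: {double_buffered_fps(render_ms):4.1f} fps, "
          f"triple: {triple_buffered_fps(render_ms):4.1f} fps")
```

Note how a 17 ms frame (59 FPS worth of work) displays at 30 FPS under strict double buffering but ~59 under triple buffering. In practice, modern engines often use their own queueing/flip schemes, and FPS counters average over a second or so - which is probably why you see in-between values like 57 even with Vsync on.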
Also, I hear this game has a unique TB implementation, because original triple buffering is only supported in old OpenGL games (so it never comes up anymore).
And 99.99% of people, including myself, will never see any cons of running triple buffering, so if you don't either, just use it and be happy.
People said this when 1080p was the hot new thing too. "With 1080p, no AA is needed!" It wasn't true at all then if you had a decent-sized monitor. I hope you're right that it's actually true now with 4K. I still game at 1200p, so I'll see at some point.
Anyway, I use the TXAA with sharpness at max and it looks pretty good to me, just as Fallout 4 did. I think FXAA tends to look AWFUL and like a blurry mess, but TXAA to me looks a lot better. The softening is very mild and it really does kill most of the jaggies on real geometry. I only really notice them on stuff like wires and grass, and mostly in the distance. I don't think it has much of a performance impact either, so I say why not use it.
It's just that some people claim Vsync always drops your FPS from 60 to 30 the moment you go below 60 (I've read others saying 45) - but I don't think I've ever experienced it, at least not in the last few years.
With Vsync I can get variable FPS below 60 - like 58 or 55, for instance - when the FPS dips. It doesn't jump all the way down.
So what's normal here?
I should add I play this game on a TV.
I also have a 2560x1440 G-Sync 144 Hz monitor - but I play my games more often on the TV (60 Hz, in Game Mode) with a controller.
However, it does have a performance hit of about 4-7 FPS, at 4K at least.
I might just lower Shadows, Model, and Background Distance down a notch or two if I see stutters.
I think it really depends on the game and engine. Some games look more jaggy/smooth than others with AA either ON or OFF. So the AA implementation and the game's tendency toward jagged edges both factor in.
Likewise, games look more or less blurry with FXAA, for instance.
I've encountered games with really subtle, well-implemented FXAA that barely blurs the image but manages to remove all jaggies, especially those in motion. Dishonored 2's FXAA, however, doesn't handle jagginess well, and has flickering, distracting, noisy lines that seem to vibrate when the camera or environment moves slightly. So wires and grass will have that annoying, flickery aliasing.
As for Vsync methods, I'm totally unsure what's the way to go...
I don't like tearing, as it essentially ruins the image so badly that solid FPS means nothing.
And I'm not sure limiting your FPS is smart either - I'm getting even more mixed suggestions: some people say limit to 59 with no Vsync, and others say limit to 58 with Vsync.
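For what it's worth, the idea behind those cap suggestions is to keep the render queue from filling up, which is where Vsync's extra input lag comes from. The cap values (58, 59) are just the numbers people quote, not anything official. A minimal sketch of what a sleep-based frame limiter (the kind external tools apply) actually does:

```python
# Minimal sketch of a sleep-based frame-rate limiter. Capping slightly
# below the refresh rate (e.g. 58 on a 60 Hz display) is meant to stop
# the swap-chain queue from backing up under Vsync. The cap value here
# is illustrative, taken from the forum suggestions above.
import time

def run_capped(frames, cap_hz=58.0, render=lambda: None):
    """Run `frames` iterations, padding each to at least 1/cap_hz sec.

    Returns the achieved average fps.
    """
    target = 1.0 / cap_hz
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render()  # the game's simulation + draw work would go here
        elapsed = time.perf_counter() - frame_start
        if elapsed < target:
            time.sleep(target - elapsed)  # pad the frame out to the cap
    total = time.perf_counter() - start
    return frames / total

fps = run_capped(60)
print(f"averaged {fps:.1f} fps against a 58 fps cap")
```

Real limiters busy-wait for the last fraction of a millisecond instead of relying on `time.sleep`, which is why dedicated tools pace frames more evenly than this sketch would.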
Annoying and dumb. lol.
SMAA gets rid of a lot of the game's prominent jaggies while barely affecting FPS.
Another thing that helps with the "jaggie creep" the game suffers from is to turn off view bob in the options. It makes a definite difference in perceiving them while standing still.