A lot of games have limited AA options, which is why I want to increase it, assuming no graphical glitches.
No, antialiasing is there to remedy the staircasing of polygon edges, so they appear smooth :)
As this guy said.
There's really never a reason to max out AA, though. Turn on max AA, then go down to a mid-level AA setting in a game that has lots of AA options: maybe you can tell them apart, but I can't. There is a big difference going from 0x to 2x and up to 8x, but after that I have to focus really hard to see any difference. In my opinion it's not worth the performance hit. Even at 1080p with a 1070, some games will slow to a crawl because of high AA.
Such as making the AA look better in GTAV for example.
You can simply do that on a per-game basis right within NVIDIA Control Panel.
Via options such as:
- Anti-Aliasing Mode: Enhance
- Anti-Aliasing Transparency: Multi-Sample (or #x Super-Sample)
For 1080p, yes, it's enough.
DSR looks better, but it is very inefficient; it tanks the frame rate.
Thanks
I already know about those options. Nvidia Inspector has more options, some of which are not in the Control Panel.
No it doesn't; it's 1:1 performance.
If it tanks the FPS, then your GPU and/or system generally isn't good enough for that resolution. Obviously you can't always expect to have a game set up for ultra everything @ 1080p, then enable DSR, and not have to lower a few visual settings slightly to compensate. FYI, 1440p is DSR = 1.78x.
Honestly if you can't run a majority of games @ 1440p/144Hz on a single 1070, you are seriously doing something wrong.
Increasing AA and supersampling in most games would usually impact performance more than just upscaling to, say, 1440p via DSR.
Also, depending on the original clarity (pixel pitch) of your current screen, you will need to play around with and adjust "DSR Smoothness" to make any applied DSR resolution appear more accurate. The default smoothness of approximately 33% is way too high and will usually make DSR look blurry.
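For anyone wondering where the 1.78x figure comes from: the DSR factor is just the ratio of total rendered pixels to native pixels. A minimal sketch (the function name and resolutions are illustrative, not part of any NVIDIA API):

```python
def dsr_factor(native, target):
    """Ratio of total pixel counts between a DSR render resolution and the native one."""
    native_w, native_h = native
    target_w, target_h = target
    return (target_w * target_h) / (native_w * native_h)

# 1440p rendered on a 1080p display:
print(round(dsr_factor((1920, 1080), (2560, 1440)), 2))  # → 1.78

# 4K on a 1080p display is the classic 4.00x factor:
print(round(dsr_factor((1920, 1080), (3840, 2160)), 2))  # → 4.0
```

So DSR 1.78x renders roughly 78% more pixels per frame than native 1080p, which is why lowering a few settings to compensate is often necessary.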
That depends on the physical size of the screen. On a 60" screen you would be able to tell the difference between x8 and x16. But you're correct that on the average screen in the user's home, you won't be able to tell the difference.
Yes, what you are saying is true. This thread has now sparked my interest in screen size vs detectable difference.
I have two main gaming rigs: one on my 34" 3440x1440 ultrawide, and one in my dedicated home theater room with a DLP projector on a 135" 1.1-gain Elite screen (this is also my HTC Vive VR room). I will compare AA settings on the 135" screen to see what level I can detect. Obviously the detection level will differ depending on an individual's vision.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-5.html
It only manages to tie with the Fury X, which has only 4 GB but vastly superior memory speed.
For 1080p? It doesn't matter, as long as it has at least 200 GB/s.