You're talking about SSAA (https://en.wikipedia.org/wiki/Supersampling), but the game seems to use Unity's standard MSAA, which is why the performance hit is minimal and everything still looks jagged as f*ck no matter how high you turn it up. A way to really smooth it out is to create a higher "virtual resolution" in the graphics card's control panel and then tell the game to use that.
Also, I didn't know Unity's MSAA was bad. Is there any reason its implementation is worse than others'?
Aside from that, yeah, I agree SSAA / supersampling is the way to go if you can. My GPU's control panel just won't give me the option, though, probably because it's a mobile GPU...
I don't know if it's better or worse than others', but I've never had very good results with MSAA in several games. I did some sh*t in Unity a few months ago, and the default MSAA option (QualitySettings.antiAliasing) didn't do much even when set to the maximum value (8), so I think these devs probably just used that method.
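For anyone curious, that setting is just one line of engine API. This is a minimal sketch of how it's typically set from a script (assuming Unity's built-in render pipeline, where QualitySettings.antiAliasing controls MSAA; the script and class name here are made up for illustration):

```csharp
using UnityEngine;

public class AntiAliasingSetup : MonoBehaviour
{
    void Start()
    {
        // QualitySettings.antiAliasing is the MSAA sample count:
        // 0 disables MSAA; 2, 4, and 8 are the valid sample counts.
        // 8 is the maximum, which is what the post above refers to.
        QualitySettings.antiAliasing = 8;
    }
}
```

Note that this only affects MSAA, so it smooths geometry edges but not shader or texture aliasing, which may be why cranking it to 8 still looks jagged.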
About the "DSR" thing: my 1060 6GB can run this at 4K, 75 FPS with MSAA at 8x most of the time (not in the menus, for some reason), so the game itself can run quite fast even on modest cards. I guess this is what happens when you use a proper game engine like Unity instead of your typical obscure, "proprietary" Japanese one.