That's interesting, but then why does it look sharper at 1920*1080?
Anyway, a higher raw pixel count beats the image quality setting, so to answer your question: it looks sharper because there are more pixels at 1080 than at 768.
The image quality setting is really just another form of anti-aliasing.
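To put numbers on "more pixels" (a quick sketch, using the two resolutions from this thread):

```python
# Raw pixel counts for the two resolutions being compared.
for w, h in [(1360, 768), (1920, 1080)]:
    print(f"{w}x{h} = {w * h:,} pixels")

# 1360x768  -> 1,044,480 pixels
# 1920x1080 -> 2,073,600 pixels (roughly twice as many)
```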
1920x1080 is better because the image has a higher resolution.
Anyway, try to match the resolution of your monitor. The best quality is always attained by matching the monitor's resolution; then you can add a bit of anti-aliasing by increasing the quality %.
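Assuming the quality slider works like a typical render-scale slider (scaling each axis by the percentage; the game doesn't spell this out, so treat that as an assumption), the internal render resolution comes out like this:

```python
def internal_resolution(width: int, height: int, quality_pct: float) -> tuple[int, int]:
    """Internal render size, assuming the slider scales each axis linearly."""
    scale = quality_pct / 100
    return round(width * scale), round(height * scale)

# Native 1920x1080 with the slider at 150% renders internally at 2880x1620,
# then gets downscaled back to 1920x1080 -- which is why it acts like anti-aliasing.
print(internal_resolution(1920, 1080, 150))  # (2880, 1620)
```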
If you want the game to look its best, use these settings:
Rendering mode: normal
AA: TSAA
Resolution: whatever your native resolution is (e.g. 1920x1080, 2560x1440)
Mesh: max
Refresh rate: 144, 120, or 160 (whatever you want, as long as the monitor supports it)
Texture: high 4 GB (beyond that isn't noticeable unless you're really looking for it)
Image quality: 100%
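The same list as a plain dict, for reference (the key names are just illustrative labels, not the game's actual config keys):

```python
# Recommended settings from the list above.
recommended = {
    "rendering_mode": "normal",
    "anti_aliasing": "TSAA",
    "resolution": "native",        # e.g. 1920x1080 or 2560x1440
    "mesh": "max",
    "refresh_rate_hz": 144,        # 120/144/160 -- whatever the monitor supports
    "texture": "high (4 GB)",      # beyond that isn't noticeable
    "image_quality_pct": 100,
}
```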
Imo the game's visuals don't really shine at 1080p. You'd only really see how great it can look at 1440p or higher.
Hold up, we agree. So look, I have two options:
1) 1360*768 with slider up to 150%
2) 1920*1080 with slider up to 100%
(1) gives the more detailed world, (2) lacks detail but has higher resolution, right?
No, (1) is a bit sharper but less detailed; (2) is more detailed but with jagged lines.
The best thing you can do is take a screenshot of the same scene with both settings and compare the differences yourself.
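For concreteness, here's what the two options work out to, again assuming the slider scales each axis by the percentage (an assumption; check against the game's own readout if it has one):

```python
def rendered_vs_displayed(w: int, h: int, quality_pct: float) -> tuple[int, int]:
    # Pixels the GPU shades per frame vs. pixels that actually reach the screen.
    scale = quality_pct / 100
    rendered = round(w * scale) * round(h * scale)
    return rendered, w * h

print(rendered_vs_displayed(1360, 768, 150))   # (2350080, 1044480)
print(rendered_vs_displayed(1920, 1080, 100))  # (2073600, 2073600)
# Option (1) shades more pixels but can only display ~1.04M of them;
# option (2) displays every one of the ~2.07M pixels it shades.
```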
I'm still confused about how this works, mind you. For example, I remember tinkering with certain games in the past so that they'd run smoothly, and it was always a balance between resolution and texture quality/shadows/lighting quality etc.
For example, I could get Half-Life to run at 1024*768, but only with low texture quality. However, 800*600 would work at high texture quality. Have I misunderstood these parameters all along? It seemed to me that I could sacrifice resolution for texture quality and vice versa.
That's something I used to do on my GTX 1060. I've just accepted that my card is outdated and leave it at 1080p / 100% now.
Basically you are rendering the game at a higher resolution, then downscaling it to fit the selected resolution. The image gets interpolated and can look sharper, but you lose detail because the effective resolution is lower.
Playing at a lower resolution and raising the quality % doesn't make much sense. It's something added for people with powerful video cards that can render the game at a higher resolution than their monitor can display.
If you can play the game fine at 1920x1080, then play at that resolution.
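You can mimic the effect outside the game with a few lines of Pillow (a sketch: the filename is a placeholder, and bilinear is just a stand-in for whatever filter the game actually uses):

```python
from PIL import Image

# Simulate a 50% quality setting on a 2560x1440 screenshot:
# "render" at half resolution, then upscale back to the output size.
img = Image.open("screenshot_1440p.png")        # placeholder filename
low = img.resize((1280, 720), Image.BILINEAR)   # internal render at 50%
out = low.resize(img.size, Image.BILINEAR)      # stretched to fit the screen
out.save("simulated_50pct.png")                 # compare against the original
```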
https://imgur.com/a/PBzl6P1
It's (almost) the same image rendered at different quality percentages.
The first on top is 2560x1440 with 100% quality
The middle image is 2560x1440 with 50% quality
The third is 2560x1440 with 200% quality
As you can see, the second image (50%) is blurrier, because it is rendered at 1280x720 and then upscaled.
The third is rendered at 5120x2880 (200%) and it's a bit sharper, but nothing big, because it is downscaled back to 2560x1440 and loses most of the extra detail. And it has a major impact on my GPU, lowering my FPS to 20-30 instead of 60. Not worth it.
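That FPS hit lines up with the raw pixel math (same per-axis scaling assumption as above):

```python
# Pixels shaded per frame behind the three screenshots above.
full = 2560 * 1440    # 100%:  3,686,400 px
half = 1280 * 720     #  50%:    921,600 px (a quarter of the work)
dbl  = 5120 * 2880    # 200%: 14,745,600 px (four times the work)
print(dbl / full)     # 4.0 -- roughly why 60 FPS collapses toward 20
```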
In short, play at your monitor's maximum resolution, and if you can, raise the game's other graphical options. Raise the quality % only if you have GPU power to spare.