MSAA is usually the most common; TXAA is a blurry mess.
HDR mode makes the colors "more alive," but only if you have a good HDR screen or TV. If you don't, don't use it. I don't think there's a single monitor under 600 euros that can do good HDR.
By default, your GPU will render as many frames as it can. But if your GPU is about to render 112 frames in one second, and you only have a 60 Hz display, a problem arises: you can't fit 112 frames into 60 refreshes. So what happens is the display will refresh partway through a frame (scan-out tends to go left to right, and top to bottom) and you'll get the top portion of one frame and the bottom of another. This is known as screen tearing, and there's a perfect image representation here...
https://en.wikipedia.org/wiki/Screen_tearing
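The mismatch described above is just arithmetic. A minimal Python sketch, using the hypothetical 112 fps / 60 Hz numbers from the post:

```python
# Hypothetical numbers from the post above: a GPU finishing 112
# frames per second, feeding a 60 Hz display with no syncing.
fps = 112
refresh_hz = 60

# The display refreshes every 1/60 s; the GPU finishes a frame
# every 1/112 s. Frames completed during one refresh interval:
frames_per_refresh = fps / refresh_hz

# With ~1.87 frames arriving per refresh, nearly every scan-out
# starts on one frame and finishes on a newer one -- the visible
# seam between the two is the tear line.
print(f"{frames_per_refresh:.2f} frames arrive per refresh")
```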
If you enable some sort of syncing of this process, this won't occur. That's what v-sync does. The downsides are that your frame rate is limited to the refresh rate of your display, and it might add some input delay. IMO, either is often preferable to screen tearing, but it's personal opinion.
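As a rough sketch of what that cap does (a simulated pacing loop, not any real graphics API; `run_capped`, `refresh_hz`, and `render_frame` are made-up names):

```python
import time

def run_capped(render_frame, refresh_hz=60.0, frames=3):
    # Sketch of what a v-sync-style cap does: never present
    # more than one frame per display refresh interval.
    interval = 1.0 / refresh_hz
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()                        # draw the frame
        deadline += interval                  # next simulated refresh
        pause = deadline - time.perf_counter()
        if pause > 0:
            time.sleep(pause)                 # wait for the "refresh"

# Even with an instant render, 3 frames at a 100 Hz cap take ~30 ms:
start = time.perf_counter()
run_capped(lambda: None, refresh_hz=100.0, frames=3)
elapsed = time.perf_counter() - start
print(f"3 capped frames took {elapsed:.3f}s")
```

The GPU would happily finish the no-op "frames" instantly; the cap is what stretches them out to the display's pace.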
Additionally, find out whether your display has G-Sync or FreeSync. Those are implementations from NVIDIA and AMD that, simplified, let your display sync to a refresh rate that varies on the fly to match your frame rate. It basically lets you get rid of the input-delay downside of v-sync without the frame tearing.
Anti-aliasing is a method to combat aliasing, which is the "stair-step" effect on lines that aren't perfectly horizontal or vertical in raster images. The higher the number, the nicer the image typically looks, but the greater the performance cost. Different types of AA also exist, and not all work the same. MSAA is a "traditional" form that doesn't really work in many modern titles (and also needs transparency AA to supplement it for things with transparency, like fences, power lines, foliage, etc.).

TAA isn't "blurry," contrary to an earlier reply. It's actually typically sharp when not in motion, and in motion some of the anti-aliasing is lost (which is again the opposite of blurry, since anti-aliasing by itself results in a slightly softer or blurrier appearance), but it has a downside where slower-moving objects can produce "ghosting" or "afterimages." The method of AA that actually is blurry all around is FXAA, which is basically like applying a Vaseline filter to the whole scene.

There are other methods, like supersampling (which is just rendering to a higher frame-buffer resolution, which is super expensive performance-wise, and then downsampling it), and other proprietary methods (like NVIDIA's old CSAA, which was basically MSAA), but we'd be here all day. These days it's typically TAA or FXAA, and sometimes the old MSAA works.
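The supersampling idea mentioned above can be sketched in a few lines: render at 2x the target resolution, then average each 2x2 block down to one output pixel. A toy grayscale example in plain Python (`downsample_2x` is a made-up helper; real renderers do this on the GPU):

```python
def downsample_2x(pixels):
    # Average each 2x2 block of a grayscale image (list of rows)
    # into one output pixel -- the "downsampling" half of SSAA.
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[0]), 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white stair-step edge rendered at 2x resolution...
hi_res = [
    [0, 0,   255, 255],
    [0, 255, 255, 255],
    [0, 0,   255, 255],
    [0, 255, 255, 255],
]
# ...becomes intermediate grays at 1x, softening the jaggies.
print(downsample_2x(hi_res))  # → [[63.75, 255.0], [63.75, 255.0]]
```

The cost is obvious from the sketch: you paid for 16 pixels to show 4, which is why supersampling is so expensive.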
One could argue a GPU rendering frames above your refresh rate is pointless because you CAN'T get those extra frames displayed, so you're not even getting the extra performance (as frames) but your GPU is still wasting power and creating heat rendering them. Even if that doesn't bother you, frames aren't synced and the tearing can occur, and this can even occur if your frame rate is BELOW your refresh rate (it's just typically more noticeable the higher the frame rate). You may get less input delay and more "perceived smoothness" with v-sync off and a frame rate higher than your refresh rate, but not everyone wants that if it means tearing comes with it. Again, it's subjective. Saying "always" have it one way or the other is wrong.
This is also wrong. The fidelity of textures is completely irrelevant as far as aliasing goes. Aliasing is simply a by-product of how lines that aren't perfectly horizontal or vertical are rendered on raster displays with finite resolutions. Textures applied to geometry being of a higher fidelity may of course give a nicer image, but they don't do anything to reduce the aliasing of those lines.
You are correct that it reduces performance, but if you have the performance to spare, most people would probably say it's worth it as it almost always results in a better visual.
I play at these settings with an NVIDIA graphics board on a 60 Hz FHD monitor:
max frame rate: 60 fps
v-sync: enabled
anti-aliasing: 4x
HDR: enabled only if your monitor can handle it (see your manual); it makes graphics look brighter, as if shining out of the screen
Why do I play at these settings: it's a good balance between power consumption and how the gameplay feels (smooth enough). High-fps gaming is not really necessary, I think; it's slightly better... for esports freaks.
You should test some settings yourself. Capping the fps to your monitor's refresh rate works well; in my view that's the best option.
If you know what I mean?
;-D