AA (Anti-Aliasing) works by sampling each frame multiple times and blending the results to smooth out the edges. You'll notice there are already various different types of AA, and each has a number of samples or passes - when you go up to 8x, for example, each pixel is effectively sampled 8 times, which puts a correspondingly heavy load on the graphics card - roughly 8 times the shading work!
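The idea above can be sketched in a few lines of Python. This is a toy supersampling example, not any specific driver's AA implementation, and the `shade` function is a made-up scene (white above a diagonal, black below) just to produce a hard edge:

```python
def shade(x, y):
    # Hypothetical scene: white above the diagonal, black below,
    # which gives a hard "jaggy" edge when sampled once per pixel.
    return 1.0 if y > x else 0.0

def render(width, height, samples_per_axis=1):
    """Render the scene, taking n*n sub-samples per pixel and averaging."""
    n = samples_per_axis
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            # Sample n*n points inside the pixel and average them -
            # this is the "extra passes" cost: n*n times the shading work.
            for sy in range(n):
                for sx in range(n):
                    x = px + (sx + 0.5) / n
                    y = py + (sy + 0.5) / n
                    total += shade(x, y)
            row.append(total / (n * n))
        image.append(row)
    return image

aliased = render(4, 4, samples_per_axis=1)   # 1 sample: pixels are only 0.0 or 1.0
smoothed = render(4, 4, samples_per_axis=4)  # 16 samples: edge pixels blend
```

With one sample per pixel, every pixel is pure black or white, so the diagonal looks like a staircase; with 16 samples, edge pixels land somewhere in between, which is the smoothing you see on screen.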
DSR is only available on some Nvidia graphics cards. It sets the game resolution higher than what your monitor can natively handle - which causes the game to draw each individual frame once at a higher quality. The graphics card then downsamples those frames to your monitor's native resolution.
In other words: Game runs at 4K UltraHD resolution > Downsampled to 1440p or 1080p (whatever your monitor can handle).
The image edges already look sharper at the higher resolution, so the downsampling can average them and try to keep that sharpness in the final image.
Note: How well this works depends strongly on the game and your system. Some games mess up, because 3K or 4K UltraHD isn't really supported that well in the first place, and it puts a lot more load on the system. If you have any issues, just disable DSR by setting the in-game resolution back to default, or turn it off in GeForce Experience.
More details about DSR:
http://www.geforce.com/hardware/technology/dsr/technology
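The render-high-then-shrink step can be sketched like this. Note this is only an illustration of the principle: Nvidia describes DSR as using a 13-tap Gaussian filter, while this toy example just averages each 2x2 block of the high-res frame down to one native pixel:

```python
def downsample_2x(image):
    """Average each 2x2 block of a high-res frame into one native pixel.

    A plain box filter, used here only to illustrate the idea;
    DSR's real filter is more sophisticated.
    """
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard diagonal edge "rendered" at 2x resolution: every high-res
# pixel is pure 0.0 or 1.0, but after downsampling, the native pixels
# along the edge get in-between values - that's the smoothing.
hires = [[1.0 if yy > xx else 0.0 for xx in range(8)] for yy in range(8)]
native = downsample_2x(hires)  # 4x4 output, edge pixels are fractional
```

So the edge information captured at the higher resolution survives the shrink as softened, in-between pixel values, which is why it acts like anti-aliasing for free.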
I know what you're going to say: that it increases the resolution and then down-samples that... but my question is... AFTER it gets downsampled back to fit my monitor, WHAT EXACTLY is retained (besides the anti-aliasing)? Are the textures, colors, and overall quality increased?
at 1080p with max in-game AA = nice, but I can still see jaggies
at 4K resolution with no in-game AA = literally almost no jaggies
However, my question still remains... Does it improve textures? Colors? I think I noticed the textures and colors looking better, but I'm bad at remembering how something looked once I change resolutions, lol...
But based on the geforce link you showed above, the image does look more "complete"...
Anti-aliasing will be more "complete".
Colors will be more "complete" and richer...
Textures will be more "complete" and will pop, because everything is closer together??
I think I got that right...
For example, here's a screenshot of Dark Souls 2 running at 5K through DSR: http://i63.photobucket.com/albums/h152/iHate-You/other/DarkSoulsII_2015_04_23_06_19_46_097.jpg~original
You can see the textures are much more detailed and cleaner, and should look very sharp when scaled back to 1440p, but the HUD elements are so blurry they won't look any better.
What temps during SLI are OK, and what temps are not OK? Thanks.