"Native" means the actual resolution of your monitor.
The intention is to let you output at the monitor's actual resolution (so you don't get ugly scaling artifacts from stretching a low resolution up to a higher one), and then selectively scale down the resolution of the rendering pipeline instead.
This is a good idea in general, because a lot of these rendering passes are very expensive, and they frequently don't have the precision you would be able to see in the final render target anyway (imagine specifying the location of a smoke effect down to a specific pixel, and then using that information to place a vague smoke effect on that precise spot).
So reducing the graphics slider just prioritizes certain things in the rendering pipeline, while the ultra settings usually remove most of the internal scaling - which you might not actually see anyway, except when you have some fps dips.
Basically, you might not want "no scaling" unless you are taking screenshots and don't mind the fps going down a bit.
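To put rough numbers on what "turning the internal scaler down" buys you, here's a quick sketch of my own (not from any game - it assumes a 1440p monitor and a simple per-axis render scale; real games round these figures in their own ways):

```python
# Back-of-the-envelope: output stays at native resolution, only the internal
# render target that the expensive passes shade gets scaled down.

def internal_target(native_w, native_h, render_scale):
    """Internal render-target size for a per-axis scale factor (0.75 = 75%)."""
    return int(native_w * render_scale), int(native_h * render_scale)

def shading_load(native_w, native_h, render_scale):
    """Fraction of the native pixel count the heavy passes actually have to shade."""
    w, h = internal_target(native_w, native_h, render_scale)
    return (w * h) / (native_w * native_h)

if __name__ == "__main__":
    native = (2560, 1440)  # the monitor's native resolution, used for the final output
    for scale in (1.0, 0.85, 0.75, 0.5):
        w, h = internal_target(*native, scale)
        print(f"render scale {scale:.0%}: {w}x{h} internal, "
              f"{shading_load(*native, scale):.0%} of native shading work")
```

Even a modest 85% scale cuts the per-pixel work by roughly a quarter, while the UI and the final image still come out at the monitor's resolution.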
In the olden days, the typical way extreme l33t gam4rs would get the best performance was to turn the resolution down to half the screen resolution, and then pile on all the details and post-passes that would otherwise be too heavy.
But nowadays most of the modern GPU grunt is spent on post-filters anyway. So there's no problem increasing the resolution targets and just using fewer post-filters - and still having a ton of compute cores free, for basically the same performance.
A lot of consoles still ship with anti-aliasing filters turned to 99 by default, though, because they are still intended to be used with TVs, including lower-resolution "HD" ones. Having anti-aliasing filters on top evens out the pixelation a bit (never mind that it agrees a lot better with the super-sampling that TVs still rely on to make things look smooth, at the cost of input lag).
Basically: if the game is somewhat intelligently programmed (and most games use some internal render-target scaling, even if they don't let you control it through some form of FSR), choose the native resolution for your render target, and then turn the internal scaler slightly down rather than reduce the resolution of the final target. On a monitor it is also always worth turning anti-aliasing off (although l33t gam6aerrs will still force 88xMSAA in the Nvidia settings panel, of course) and rendering to a higher resolution in the first place, rather than half resolution, or lower detail, with anti-aliasing on top.
Opinions differ on that, though. A remarkable number of people actually prefer the "anti-aliasing" with upscale over native resolution, even at the same detail level - because apparently the game only runs well if their RTX 3090 is fuming at a mere 400W. Which nowadays is usually only possible by piling on every post-processing filter available, while choosing a render target low enough (often 1080p) that the endless post-passes can still complete in time.
It's the PC equivalent of drawing with crayons in your mouth, then running five hundred thousand smoothing passes on it in Photoshop, and claiming the result is actually better than if you drew with your hand.
thank you for the info man!
so i guess native is better than ultra quality, since native is above it in the list, and it actually makes me lose 15 fps ("still hangs above 70 at all times, all settings ultra")...
but i should turn off anti-aliasing? did i understand it right?
Native merely means whatever resolution the monitor is set up on.
In general, if you can get a reasonable fps target at the screen's actual (native) resolution, pick that before adjusting the detail level - unless there's a really cool effect you want that you would have to turn off at native or higher resolutions. In that case it might be worth doing half resolution with higher details (the graphics driver will then just double each pixel, which is mostly invisible, and avoids the irregular scaling you would obviously see otherwise).
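A quick illustration of why exactly half the resolution scales cleanly while an arbitrary lower resolution does not (a sketch assuming simple nearest-neighbour pixel duplication; the widths are just examples against a 2560-wide panel):

```python
def scaling_pattern(native_width, render_width):
    """Describe how a lower horizontal resolution maps onto the native one."""
    ratio = native_width / render_width
    if ratio.is_integer():
        # every rendered pixel becomes a clean NxN block on screen
        return f"{render_width} -> {native_width}: clean {int(ratio)}x pixel duplication"
    # non-integer ratio: some pixels end up stretched wider than their neighbours
    return f"{render_width} -> {native_width}: uneven {ratio:.2f}x stretch"

if __name__ == "__main__":
    for render_width in (1280, 1920, 2176):
        print(scaling_pattern(2560, render_width))
```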
It's just that that's not really something you need to do in games now. At worst you'll have the UI and so on rendered at the native resolution of the screen, and the "game context" rendered with a scaler of some sort underneath. Push that too far and it looks weird and skewed - but often it still looks better anyway, and it's also easier to combine with fewer AA passes.
In HD2 they have a type of FSR implemented, and because it works really well, you can usually pick a very high target resolution (native, to avoid the stretching) and then just use the quality slider to adjust the performance afterwards. That will typically look better than reducing the final target resolution of the graphics pipeline.
Depending on the hardware, you might want to turn some of the detail settings down (the volumetric and particle effects, for example) before pushing the "scaling slider" very far towards "performance". But yes, if you have 70 fps at ultra detail, just reduce the scaling slider's performance target if you want to (combined with an fps target - 60 and 70 are not really different, for example, and capping saves some boost performance for when it's needed). That will look infinitely better than reducing the resolution and increasing the detail level.
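For reference, these are the commonly published FSR preset ratios and what they work out to at a 1440p native target - an assumption on my part, since HD2's own slider may use different steps:

```python
# Commonly published per-axis scale factors for FSR-style quality presets.
# (Assumption: the in-game slider may not map to exactly these steps.)
FSR_PRESETS = {
    "Ultra Quality": 1 / 1.3,
    "Quality":       1 / 1.5,
    "Balanced":      1 / 1.7,
    "Performance":   1 / 2.0,
}

def preset_table(native_w, native_h):
    for name, scale in FSR_PRESETS.items():
        w, h = round(native_w * scale), round(native_h * scale)
        print(f"{name:>13}: renders {w}x{h}, upscaled to {native_w}x{native_h}")

if __name__ == "__main__":
    preset_table(2560, 1440)  # native 1440p output
```

The point stands either way: the final target stays at native, and the slider only moves the internal resolution.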
At the same time, you're probably not going to get above 90 fps unless you avoid 1440p and the anti-aliasing passes - things like that.
Sorry about the endless rants, btw.. XD It's just that a lot of this stuff is popularly presented really badly, and based on very strange and wrong assumptions.
oh thanks =)
no worries about the big text, the more to read the more info there is =)
so far i'm playing on max possible settings, with this ->
Motherboard - ASUS ROG Strix X670E-F Gaming WIFI
Processor - AMD Ryzen 7 7800X3D CPU 5.0GHz
GPU - ASUS GeForce RTX 3080 Ti 12GB STRIX GAMING OC
RAM - G.Skill 32GB (2x16GB) DDR5 6000MHz CL30 Trident Z5 Neo RGB Black, AMD EXPO
i get around 70-120 fps, never below.. both on ultra quality and native, and i can't really see a difference between the two tbh..