mikel3113 May 2, 2016 @ 12:59pm
DSR resolution performance vs true resolution?
Let's say I'm running GTA V @ 1440p using DSR and I get 50 FPS on average on my 1080p monitor. If I were to buy a 1440p monitor, would I run at that exact same 50 FPS, or is there some additional power needed to use DSR? FYI, I'm trying to decide whether to buy a 1080p or a 1440p monitor.

Also, how much better is this, if at all? What if I run GTA V at 45 FPS on a 144Hz G-Sync monitor, vs. a 144Hz non-G-Sync monitor, vs. the same 45 FPS on a 60Hz non-G-Sync monitor (in any VSync mode, i.e. off, on, adaptive)?

Bad 💀 Motha May 2, 2016 @ 6:44pm 
NVIDIA has stated clearly that use of DSR gives you 1:1 performance.
So what you see running 1440p using DSR is the performance you can expect, with the same specs, running on an actual 1440p screen.

Please list your full hardware specs & OS and I can tell you settings to use, and what to expect, if the system is set up well enough.

Overall though, G-Sync monitors are not cheap by any means. 1440p + 144Hz + G-Sync on a 27-inch display: http://www.amazon.com/dp/B0173PEX20

Or you can go with cheaper G-Sync at, say, 1080p on a 24-inch display.

All VSync settings, in-game and in the NVIDIA Control Panel, should be off when you run benchmarks, so you can better judge average FPS against your settings. Once you actually have a G-Sync display, the VSync settings offer a G-Sync option, and that is what you would want to use. It is similar to Adaptive VSync, but without an actual G-Sync display, Adaptive only works to a certain degree. With a G-Sync display plus a decent GTX GPU, the display sees what FPS the GPU can output and syncs its refresh to match, working hand in hand with the GPU's output.
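
To make the timing difference concrete, here is a toy simulation of the idea (my own Python sketch of the timing logic only, not any real driver API): with VSync on a fixed 60Hz panel, a 45 FPS frame stream has to wait for refresh ticks, so frame-to-frame gaps alternate and judder; with G-Sync-style variable refresh, every frame is shown the moment it's done.

import math

REFRESH = 1 / 60      # fixed 60Hz panel: one scanout tick every ~16.7ms
FRAME_TIME = 1 / 45   # GPU finishes a new frame every ~22.2ms (45 FPS)

def display_times_vsync(n):
    # VSync on a fixed 60Hz panel: each frame waits for the next tick,
    # and each tick can show at most one new frame.
    times, last = [], 0.0
    for i in range(1, n + 1):
        ready = i * FRAME_TIME
        tick = math.ceil(ready / REFRESH - 1e-9) * REFRESH
        tick = max(tick, last + REFRESH)
        times.append(tick)
        last = tick
    return times

def display_times_gsync(n):
    # G-Sync style: the panel refreshes the moment a frame is ready.
    return [i * FRAME_TIME for i in range(1, n + 1)]

for label, t in (("vsync", display_times_vsync(6)),
                 ("gsync", display_times_gsync(6))):
    gaps = [round((b - a) * 1000, 1) for a, b in zip(t, t[1:])]
    print(label, "frame-to-frame gaps (ms):", gaps)

# vsync prints gaps like [16.7, 16.7, 33.3, 16.7, 16.7] (judder),
# gsync prints a steady [22.2, 22.2, 22.2, 22.2, 22.2].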
Last edited by Bad 💀 Motha; May 2, 2016 @ 6:50pm
Azza ☠ May 2, 2016 @ 7:02pm 
I can confirm for you...

GTA V runs smooth @ 1440p (mine is 1600p) with very high settings on a GTX 980. You should get at least 50 FPS, unless you are doing some crazy amount of AA (Antialiasing), etc.

You would be better off running native 1440p rather than DSR. The performance cost is about the same, but you get more pixels to actually see, instead of downsampling to a lower-resolution monitor.

The GTX 980 loves G-SYNC; they work extremely well together. However, if you're getting G-SYNC, don't go cheap; get a high-refresh-rate model (144Hz/165Hz) with a good response time (1-4ms). That makes the most of it and gives you much more future-proofing, without restrictions.
mikel3113 May 3, 2016 @ 4:30pm 
Originally posted by Bad-Motha:
NVIDIA has stated clearly that use of DSR gives you 1:1 performance. So what you see running 1440p using DSR is the performance you can expect, with the same specs, running on an actual 1440p screen.

Please list your full hardware specs & OS and I can tell you settings to use, and what to expect, if the system is set up well enough.

Overall though, G-Sync monitors are not cheap by any means. [...]

I know what I'm in for on the prices and I've come to terms with paying $500-999; I just don't want to try to drive a 1440p display if I can't really run it at decent settings with a good FPS. I think in some games 1080p on ultra settings (not talking about turning AA up to crazy levels) looks better than 1440p at high settings, because DOF, lighting/shadows, and textures are reduced so much that I'm not sure the trade-off of moving to a 1440p monitor is worth it. However, I feel kind of dumb buying a 1080p monitor knowing 1440p will be the norm for gaming in, let's say, 2 years, and 4K will be the norm in 3-5 years, I would think. Here are my specs.

Also, this is hard to word, but how much better is this, if at all? What if I run GTA V at 45 FPS on a 144Hz G-Sync monitor vs. running at that same 45 FPS on a 60Hz non-G-Sync monitor? Will I see a better image on the higher-refresh-rate monitor, or if my FPS is that low, will the effect be null and void?

Intel Core i7-3770K 3.5GHz Quad-Core @4.2
Asus P8Z77-V LK ATX LGA1155
Crucial Ballistix Sport 16GB (2 x 8GB) DDR3-1600
Samsung 850 EVO-Series 120GB 2.5" SSD
Samsung 850 EVO-Series 500GB 2.5" SSD
EVGA GeForce GTX 980 4GB Superclocked ACX 2.0 @ 1499MHz boost / +400 memory
Corsair 500R Black ATX Mid Tower
Thermaltake 850W ATX12V / EPS12V
Microsoft Windows 7 Home Premium SP1 OEM (64-bit)


banzaigtv May 3, 2016 @ 5:32pm 
I always thought that 4K DSR is less demanding than playing on an actual 4K screen. It's baffling to see games like Mortal Kombat X running at 4K resolution on a 1080p screen at 60 fps maxed. GPU is a GTX 980.
Bad 💀 Motha May 3, 2016 @ 5:34pm 
With a GTX 980 you should have no problems running @ 1440p / 60 FPS.
Some settings will still need tweaking or turning down, but there are minimal differences in GTA V between a mix of High/Ultra and all Ultra anyway.
Azza ☠ May 3, 2016 @ 7:09pm 
The way DSR works...

The game itself is tricked into believing you have a higher-resolution monitor. So how well it performs will depend on the game and graphics card, the same as if it were running on a monitor with that native resolution.

The NVIDIA graphics card then downsamples each frame to the native monitor resolution on the fly, before it's displayed. The reason is that edges from a higher-resolution image can still appear sharper and more accurate when reduced in size, compared to the same image rendered at the lower resolution to begin with.
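
A rough way to picture that downsampling step, as a toy Python/NumPy sketch (just a box filter for illustration; NVIDIA's actual DSR filter is reportedly a 13-tap Gaussian tied to the smoothness slider):

import numpy as np

def render_scene(h, w):
    # Stand-in for the game's renderer: random pixel data for the demo.
    return np.random.rand(h, w, 3)

native_h, native_w = 1080, 1920
scale = 2                                   # 4x DSR = 2x on each axis

# The game is "tricked" into rendering a 3840x2160 frame...
hi_res = render_scene(native_h * scale, native_w * scale)

# ...then each 2x2 block is averaged down to one native pixel.
down = hi_res.reshape(native_h, scale, native_w, scale, 3).mean(axis=(1, 3))

print(hi_res.shape, "->", down.shape)       # (2160, 3840, 3) -> (1080, 1920, 3)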
Last edited by Azza ☠; May 3, 2016 @ 7:10pm
hawkeye May 4, 2016 @ 2:34am 
From what I have read, DSR 4x and 2.25x (per-axis scales of 2 and 1.5, squared) are the settings that give the best image improvement. 4x is the best.

1440p on a 1080p screen is 1.78x DSR, so DSR will be less effective from an image-improvement perspective. (I think this is because only 1 extra rendered pixel is added for every 3 existing pixels on each axis, so there is not much extra information on a drawn edge for the filter to work with.)

0% smoothness is the optimum setting; going above 33% is not recommended.

(4x DSR refers to the reduction, e.g. 3840x2160 down to 1920x1080. A quick check of the factor arithmetic is below.)
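
The factor arithmetic, as a quick sanity check (just my own Python snippet; DSR multipliers apply to the total pixel count, so the per-axis scale is the square root of the factor):

native_w, native_h = 1920, 1080

for factor in (4.0, 2.25, 16 / 9):          # 16/9 = 1.78x, the 1440p factor
    scale = factor ** 0.5
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"{factor:.2f}x DSR -> {w}x{h} (per-axis scale {scale:.3f})")

# 4.00x DSR -> 3840x2160 (per-axis scale 2.000): a clean 2x2 -> 1 reduction
# 2.25x DSR -> 2880x1620 (per-axis scale 1.500)
# 1.78x DSR -> 2560x1440 (per-axis scale 1.333): every 3 pixels become 4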

Nearly all games on native 4K versus 4K at 4x DSR have very similar fps.


mikel3113 May 7, 2016 @ 10:31pm 
Well, I'm glad I waited. No question I'm getting at least a 1440p monitor now, if not a 4K one. The GTX 1080 leaves 1080p behind.
Revelene May 7, 2016 @ 10:42pm 
Originally posted by mikel3113:
Well, I'm glad I waited. No question I'm getting at least a 1440p monitor now, if not a 4K one. The GTX 1080 leaves 1080p behind.

From what information we have access to, and ignoring Nvidia marketing, we have a good idea where these new cards stand.

If we judge based on the leaked synthetic benchmarks, the 1080 will be a little more powerful than the 980 Ti and Titan X, roughly a 20-30 percent increase.

While a single 1080 could handle 4K better than a 980 Ti or Titan X, I feel safe in saying that one would still need two to get decent settings and frame rates in some demanding titles.
DaRkCrAcKeR Jan 27, 2017 @ 11:42pm 
I was wondering about this as well, so I did a small experiment. I have a Titan X (Pascal), a 1440p monitor, and a 1080p monitor, so I jumped into The Witcher 3 on my 1080p monitor, all settings maxed (and some mods as well). The FPS was 125-127 (it fluctuates because FPS was unlocked). Then, with DSR to 1440p (2x multiplier and 0% smoothing) on that monitor, the FPS was 93-95. Then I switched over to my 1440p monitor at native resolution: 93-95 FPS.

This would indicate that there is no extra impact on your system when using DSR. To be sure, I tested in the same manner several times with the same results. The exact FPS varied with location in game, but the FPS at 1440p native and at 1440p DSR from 1080p was always the same.
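
For what those numbers are worth, here is the quick arithmetic on them (just my own back-of-the-envelope Python, using the figures reported above):

fps_1080 = 126                    # midpoint of the reported 125-127
fps_1440 = 94                     # midpoint of 93-95, same for DSR and native

pixels_1080 = 1920 * 1080
pixels_1440 = 2560 * 1440

print(f"pixel count increase: {pixels_1440 / pixels_1080:.2f}x")  # 1.78x
print(f"FPS cost of 1440p:    {1 - fps_1440 / fps_1080:.0%}")     # ~25%

So DSR 1440p and native 1440p cost the same ~25% here; the downsample itself is effectively free.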

Hope this helps. ^^
Last edited by DaRkCrAcKeR; Jan 27, 2017 @ 11:44pm
cronic0_0 Jan 28, 2017 @ 12:18am 
This is how DSR works:

https://youtu.be/_0v-wcOijsg
Defendor Apr 22, 2017 @ 5:38am 
Originally posted by aeronnav109:
I was wondering about this as well, so I did a small experiment... the FPS at 1440p native and at 1440p DSR from 1080p was always the same.
Awesome. Very helpful! Thanks! :golden: