So the performance you see running 1440p via DSR is the performance you can expect from the same specs on an actual 1440p screen.
Please list your full hardware specs & OS and I can tell you which settings to use and what to expect, assuming the system is set up well enough.
Overall though, G-Sync monitors are not cheap by any means. 1440p + 144Hz + G-Sync on a 27-inch display: http://www.amazon.com/dp/B0173PEX20
You can go with cheaper G-Sync at, say, 1080p on a 24-inch display.
All VSync, both in-game and in the Nvidia Control Panel, should be off when you run benchmarks, so you can better judge average FPS versus settings. Once you actually have a G-Sync display, the sync settings offer a G-Sync option, and that is what you would want to use. It is similar to Adaptive VSync, but without an actual G-Sync display, Adaptive only works to a certain degree. With a G-Sync display plus a decent GTX GPU, the display sees what FPS the GPU can output and syncs its refresh to match, working hand-in-hand with your GPU's output.
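To put rough numbers on that (this is just back-of-the-envelope arithmetic, not a measurement from any particular game or monitor): on a fixed 60Hz display with VSync, 45 FPS cannot line up with the refresh cycle, so frames get held on screen for uneven lengths of time, whereas a G-Sync display simply refreshes whenever each frame is ready.

# Rough frame-pacing arithmetic only; the numbers are illustrative, not benchmarks.
fps = 45.0
frame_time_ms = 1000.0 / fps        # the GPU delivers a frame every ~22.2 ms

# Fixed 60Hz + VSync: frames can only change on a ~16.7 ms refresh boundary,
# so they end up shown for a mix of 1 or 2 refresh cycles (judder).
refresh_ms = 1000.0 / 60.0
print(f"60Hz + VSync: frames held ~{refresh_ms:.1f} ms or ~{2 * refresh_ms:.1f} ms")

# G-Sync (variable refresh): the panel waits for each frame, so every frame
# is held for roughly the same ~22.2 ms, which looks smoother at the same FPS.
print(f"G-Sync: every frame held ~{frame_time_ms:.1f} ms")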
GTA V runs smoothly at 1440p (mine is 1600p) with very high settings on a GTX 980. You should get at least 50 FPS, unless you are using some crazy amount of AA (antialiasing), etc.
You would be better off running native 1440p rather than DSR. The performance cost is similar, but there are more pixels to see instead of downsampling to a lower-resolution monitor.
The GTX 980 loves G-Sync; they work extremely well together. However, if you're getting G-Sync, don't go cheap: get a high-refresh-rate panel (144Hz/165Hz) with a good response time (1-4ms). That lets you make the most of it and gives far more future-proofing, without restrictions.
I know what I'm in for on the prices and I've come to terms with paying $500-999, but I just don't want to try to drive a 1440p display if I can't really run it at decent settings with a good FPS. I think in some games 1080p on ultra settings (not talking about turning AA up to crazy levels) looks better than 1440p at high settings, because DOF, lighting/shadows, and textures are reduced so much that I'm not sure the trade-off of going to a 1440p monitor is worth it. However, I feel kind of dumb buying a 1080p monitor knowing 1440p will be the norm for gaming in, let's say, 2 years, and 4K will be the norm in 3-5 years I would think. Here are my specs.
Also, this is hard to word, but how much better is this, if at all? What if I run GTA V at 45 FPS on a 144Hz G-Sync monitor versus that same 45 FPS on a 60Hz non-G-Sync monitor? Will I see a better image on the higher-refresh-rate monitor, or if my FPS is that low, will the effect be null and void?
Intel Core i7-3770K 3.5GHz Quad-Core @ 4.2GHz
Asus P8Z77-V LK ATX LGA1155
Crucial Ballistix Sport 16GB (2 x 8GB) DDR3-1600
Samsung 850 EVO-Series 120GB 2.5" SSD
Samsung 850 EVO-Series 500GB 2.5" SSD
EVGA GeForce GTX 980 4GB Superclocked ACX 2.0 @ 1499MHz boost, +400 memory
Corsair 500R Black ATX Mid Tower
Thermaltake 850W ATX12V / EPS12V
Microsoft Windows 7 Home Premium SP1 OEM (64-bit)
Some settings still need to be tweaked or turned down, but there is minimal difference in GTA V between a mix of High/Ultra and everything on Ultra anyway.
With DSR, the game itself is tricked into believing you have a higher-resolution monitor, so how well it performs will depend on the game and graphics card, the same as if it were running on a monitor of that native resolution.
The Nvidia graphics card then downsamples each frame to the native monitor resolution on the fly before it's displayed. The reason is that edges from a higher-resolution image can still appear sharper and more accurate when reduced in size, compared with the same image rendered at the lower resolution to begin with.
1440p to 1080p is 1.78x DSR, so DSR will be less effective from an image-improvement perspective. (I think this is because only 1 extra pixel is added for every 3 existing pixels on each axis, so there is not much gain on a drawn edge for the sharpening algorithm to work with.)
0% smoothness is the optimum setting; going above 33% is not recommended.
(4x DSR is the reduction e.g. 3840x2160 to 1920x1080.)
Nearly all games on native 4K versus 4K at 4x DSR have very similar fps.
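For reference, the DSR factor Nvidia shows is just the ratio of total pixels between the render resolution and the native resolution, which is where the 1.78x and 4x figures above come from. A quick sketch of the arithmetic (nothing driver-specific, just pixel counts):

# DSR factor = rendered pixels / native (output) pixels.
def dsr_factor(render, native):
    return (render[0] * render[1]) / (native[0] * native[1])

print(dsr_factor((2560, 1440), (1920, 1080)))   # ~1.78x; per-axis scale is 2560/1920 = 1.33,
                                                # i.e. only 1 extra pixel per 3 existing ones
print(dsr_factor((3840, 2160), (1920, 1080)))   # 4.0x, i.e. 4K rendered, shown on a 1080p panel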
From what information we have access to, and ignoring Nvidia marketing, we have a good idea where these new cards stand.
If we judge by the leaked synthetic benchmarks, the 1080 will be a little more powerful than the 980 Ti & Titan X, roughly a 20-30 percent increase.
While a single 1080 could handle 4K better than a 980 Ti or Titan X, I feel safe saying that you would still need two to get decent settings and frame rates in some demanding titles.
This would indicate that there is no extra impact on your system when using DSR. To be sure, I tested the same way several times with the same results. The FPS differed depending on location in-game, but FPS at native 1440p and at 1440p via DSR on a 1080p monitor were the same.
Hope this helps. ^^
https://youtu.be/_0v-wcOijsg