Is there an FPS difference when playing at a higher resolution?
To elaborate: I'm looking at getting some new monitors for my setup, and I'm wondering whether I should get a couple with a 2K resolution. I currently have two 1920x1080 monitors, one of which I play games on; the other usually has Discord, Steam, or a web browser open. My main question: if I get two new monitors with a higher resolution than my current ones, will I notice an FPS difference?

My Specs:
GPU: EVGA GeForce GTX 970 4GB
CPU: Intel Core i5 6500 (Supports Resolutions up to 4096x2304)
RAM: G.Skill Ripjaws V Series 16GB DDR4 2400
Motherboard: MSI H110m Pro Gaming
Last edited by RiGGz; 29 Oct 2017 @ 5:46pm
Monk 30 Oct 2017 @ 8:38am
It's about 30% more pixels, so you'll suffer a proportional FPS drop, give or take a bit.
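To put rough numbers on that, here's a quick sketch assuming FPS falls inversely with pixel count (only a rule of thumb; real games vary, and the 60 FPS baseline is hypothetical):

```
# Back-of-the-envelope pixel math. Assumes FPS scales inversely
# with pixel count, which is only a rough rule of thumb.
resolutions = {
    "1920x1080": 1920 * 1080,
    "2560x1080": 2560 * 1080,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,
}

base = resolutions["1920x1080"]
base_fps = 60  # hypothetical FPS at 1080p, just for illustration

for name, pixels in resolutions.items():
    ratio = pixels / base
    print(f"{name}: {ratio:.2f}x pixels -> ~{base_fps / ratio:.0f} FPS")
```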
Originally posted by M_RiGGz:
Will I still suffer a significant performance difference if I were to use a 2560x1080 resolution monitor with my specs?
2560x1440 at 144Hz takes more to process than 2560x1440 at 60Hz. But honestly, unless you plan on playing games that you can continuously sustain at a minimum of 120 FPS without dropping below that, you really don't need a 144Hz monitor. A 144Hz monitor is nice, but only if your GPU has the ability to drive it.

Originally posted by Bad_Motha:
Originally posted by M_RiGGz:
Well, I'm still able to max out games at a high framerate. Destiny 2 just came out on PC recently, and I'm able to max it out at 60 FPS (I have VSync on, so I don't know my exact FPS). But if the FPS difference is only a 1-5 FPS loss, then I'm fine sacrificing it as long as I get a nicer picture and a better-looking monitor.

Well yeah, that's fine, if you're not getting input lag.
Just don't use plain VSync; use Fast or Adaptive, they're a heck of a lot better at doing VSync's job than the actual setting by that name.
Fast and Adaptive are usually not useful unless the frame rate constantly jumps significantly above or below the refresh rate. NVIDIA's VSync is double-buffered by default, so there is usually no perceivable input lag. The issue I have with not using some sort of frame lock or frame rate cap is that screen tearing can occur and ruin the immersion of a game. If you have a sustained frame rate above 80 FPS on a 60Hz monitor, you'll usually want to turn on VSync. If your frame rate hovers around 60 FPS and varies between roughly 55 and 70 FPS, use Adaptive or Fast VSync (if available). A much better solution for the latter case (though a more advanced one) is to set a frame rate cap with NVIDIA Profile Inspector (a route I always take for Microsoft Flight Simulator X).
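If you're curious what a frame rate cap is actually doing, here's a minimal sketch of the idea in Python (a real cap lives in the driver or game engine; update_and_render is just a hypothetical stand-in for one frame's work):

```
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # frame budget in seconds

def update_and_render():
    """Stand-in for one frame's worth of game work."""
    time.sleep(0.005)  # pretend rendering took 5 ms

def capped_loop(frames=120):
    for _ in range(frames):
        start = time.perf_counter()
        update_and_render()
        # Sleep off the rest of the frame budget so the loop never
        # runs faster than TARGET_FPS -- the essence of a frame cap.
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

capped_loop()
```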

Now, as for the input lag issues. When we talk about 4K on a monitor, we usually mean 3840x2160. The problem is that 3840x2160 is an interlaced resolution in digital film standards, so your GPU outputs it as such while your screen tries to display it in progressive scan mode. You can get the progressive-scan version of that resolution with 4096x2304; you'll have to add it as a custom resolution in NVIDIA Control Panel, but it may reduce input lag. So what's the big deal? Interlaced resolutions usually run at half the monitor's refresh rate, because each scan line is displayed one at a time.

Last edited by TehSpoopyKitteh; 30 Oct 2017 @ 12:44pm
Originally posted by TehSpoopyKitteh:
Now, as for the input lag issues. When we talk about 4K on a monitor, we usually mean 3840x2160. The problem is that 3840x2160 is an interlaced resolution in digital film standards, so your GPU outputs it as such while your screen tries to display it in progressive scan mode. You can get the progressive-scan version of that resolution with 4096x2304; you'll have to add it as a custom resolution in NVIDIA Control Panel, but it may reduce input lag. So what's the big deal? Interlaced resolutions usually run at half the monitor's refresh rate, because each scan line is displayed one at a time.

What in the world are you going on about? Where do you get your information?

4K UHD is not interlaced... and changing the pixel output doesn't change that fact. The numbers you grabbed for that second resolution come out of nowhere; it isn't a native resolution. Perhaps you're confusing it with cinema 4K (4096x2160)? That is a change in aspect ratio, not a change in scanning method.
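The aspect-ratio part is easy to sanity-check with a couple of lines of arithmetic:

```
# Aspect ratios of the resolutions in question.
for name, w, h in [("3840x2160 (UHD 4K)", 3840, 2160),
                   ("4096x2160 (cinema/DCI 4K)", 4096, 2160),
                   ("4096x2304", 4096, 2304)]:
    print(f"{name}: {w / h:.3f}  (16:9 = {16 / 9:.3f})")
```

UHD and 4096x2304 both come out to 1.778 (16:9); only DCI 4K differs, at roughly 1.896 (~17:9).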
RiGGz 30 Oct 2017 @ 3:45pm
Originally posted by M_RiGGz:
Will I still suffer a significant performance difference if I were to use a 2560x1080 resolution monitor with my specs?
2560x1440 at 144Hz takes more to process than 2560x1440 at 60Hz. But honestly, unless you plan on playing games that you can continuously sustain at a minimum of 120 FPS without dropping below that, you really don't need a 144Hz monitor. A 144Hz monitor is nice, but only if your GPU has the ability to drive it.
That still didn't exactly answer my question.
Originally posted by M_RiGGz:
Originally posted by Monk:
It will be lower; how much worse will depend on a lot of things, and on how easily you spot FPS changes.
Well, I'm still able to max out games at a high framerate. Destiny 2 just came out on PC recently, and I'm able to max it out at 60 FPS (I have VSync on, so I don't know my exact FPS). But if the FPS difference is only a 1-5 FPS loss, then I'm fine sacrificing it as long as I get a nicer picture and a better-looking monitor.

It's usually proportional to the resolution percentage, like Monk put it, but it also depends on many factors, like VRAM. The higher the resolution, the higher the potential VRAM usage. The easiest example is Wolfenstein II: using the ultra (not max) preset at 1080p, the GTX 1060 6GB is 71% faster than the GTX 970 3.5GB. Keep the same settings and step up to 1440p or 4K, and the GTX 1060 6GB is up to 100% faster than the GTX 970 3.5GB. The 970's 3.5GB of VRAM is likely one of the factors here.
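As a very rough sketch of why resolution pushes VRAM use up: the on-screen color buffers scale with pixel count, though in real games textures and intermediate render targets dominate, so treat these numbers strictly as a floor:

```
# Color-buffer VRAM per resolution: 4 bytes per pixel (RGBA8),
# triple buffered. Real games use far more VRAM for textures,
# geometry, and render targets; this is only a lower bound.
BYTES_PER_PIXEL = 4
BUFFER_COUNT = 3

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    mb = w * h * BYTES_PER_PIXEL * BUFFER_COUNT / 1024**2
    print(f"{w}x{h}: ~{mb:.0f} MB in color buffers alone")
```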
RiGGz 30 Oct 2017 @ 3:55pm
Originally posted by Big Boom Boom:
Originally posted by M_RiGGz:
Well, I'm still able to max out games at a high framerate. Destiny 2 just came out on PC recently, and I'm able to max it out at 60 FPS (I have VSync on, so I don't know my exact FPS). But if the FPS difference is only a 1-5 FPS loss, then I'm fine sacrificing it as long as I get a nicer picture and a better-looking monitor.

It's usually proportional to the resolution percentage, like Monk put it, but it also depends on many factors, like VRAM. The higher the resolution, the higher the potential VRAM usage. The easiest example is Wolfenstein II: using the ultra (not max) preset at 1080p, the GTX 1060 6GB is 71% faster than the GTX 970 3.5GB. Keep the same settings and step up to 1440p or 4K, and the GTX 1060 6GB is up to 100% faster than the GTX 970 3.5GB. The 970's 3.5GB of VRAM is likely one of the factors here.
I just wish there was a benchmark to see the difference better. I kind of see what you're saying, but at the same time I don't lol.
Omega 30 Oct 2017 @ 4:02pm
You can usually choose even non-native resolutions when running benchmarks.

And you could always use virtual super resolution and play your games at that super resolution to see what FPS you get.

You can set up a virtual super resolution in your GPU control panel.
Last edited by Omega; 30 Oct 2017 @ 4:04pm
RiGGz 30 Oct 2017 @ 4:14pm
Originally posted by Omega:
You can usually choose even non-native resolutions when running benchmarks.

And you could always use virtual super resolution and play your games at that super resolution to see what FPS you get.

You can set up a virtual super resolution in your GPU control panel.
Oh sweet thanks for the tip.
It's called DSR on NVIDIA. You can set it up in the NVIDIA Control Panel.
RiGGz 30 Oct 2017 @ 4:17pm
Originally posted by Big Boom Boom:
It's called DSR on NVIDIA. You can set it up in the NVIDIA Control Panel.
I found "Change Resolution" under the "Display" tab, and it says I can create a custom resolution. So I'm assuming that's the same thing?
No. It's under 3D Settings: DSR.
RiGGz 30 Oct 2017 @ 4:32pm
Originally posted by Big Boom Boom:
No. It's under 3D Settings: DSR.
So if I have a 1920x1080 monitor, what multiplier would I have to set to equal 2560x1080?
Omega 30 Oct 2017 @ 4:38pm
Originally posted by M_RiGGz:
Originally posted by Big Boom Boom:
No. It's under 3D Settings: DSR.
So if I have a 1920x1080 monitor, what multiplier would I have to set to equal 2560x1080?
There isn't an exact factor for that one: 2560x1080 is 1.33x the pixels of 1920x1080, and DSR keeps your monitor's 16:9 aspect ratio anyway, so 1.20x is just the closest option.

And that resolution is not 1440p. It is ultra-wide 1080p.

1440p (2560x1440) would be the 1.78x factor.
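The factor arithmetic, if you want to check other targets (a DSR factor multiplies the native pixel count; the per-axis scale is its square root, which is why the rendered image keeps the native aspect ratio):

```
import math

NATIVE_W, NATIVE_H = 1920, 1080

def dsr_factor(w, h):
    """Total-pixel multiplier a resolution represents over native."""
    return (w * h) / (NATIVE_W * NATIVE_H)

for w, h in [(2560, 1080), (2560, 1440), (3840, 2160)]:
    f = dsr_factor(w, h)
    print(f"{w}x{h}: {f:.2f}x pixels ({math.sqrt(f):.2f}x per axis)")
```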
RiGGz 30 Oct 2017 @ 4:39pm
Originally posted by Omega:
Originally posted by M_RiGGz:
So if I have a 1920x1080 monitor, what multiplier would I have to set to equal 2560x1080?
There isn't an exact factor for that one: 2560x1080 is 1.33x the pixels of 1920x1080, and DSR keeps your monitor's 16:9 aspect ratio anyway, so 1.20x is just the closest option.

And that resolution is not 1440p. It is ultra-wide 1080p.

1440p (2560x1440) would be the 1.78x factor.
And ultra-wide 1080p would be considered a 21:9 aspect ratio, correct?

Date Posted: 29 Oct 2017 @ 5:40pm
Posts: 99