Kiandore Jun 16, 2020 @ 8:14pm
FPS difference between 1080 and 1440
I just ordered a 1080p 25 inch, 250 hertz, IPS monitor and it is great. I am not noticing such a giant difference in game smoothness once I am above about 120fps or so.

So I’m considering sending it back and going for a 27 inch 1440p with a lower refresh rate. My question is: if every game setting is the same, is there a way to estimate how many fewer fps I will get if I am running at a resolution of 1440p instead of 1080p?

Here are the fps I’m getting now
Borderlands 2 150-200fps
Borderlands 3 90-100
Remnant: From the Ashes 170-200
I’m hoping to get between 60-1000fps at 1440p

I have 16GB of RAM
i7
GTX 1070

What could I expect to get with FPS? Is there a way to test this before I buy the monitor?
Showing 1-10 of 10 comments
The author of this thread has indicated that this post answers the original topic.
Supafly Jun 17, 2020 @ 6:19am 
Originally posted by Kiandore:
I’m hoping to get between 60-1000fps at 1440p
You can get more than 1000fps in certain things, but that'd be old 2D stuff and title menus etc. I assume you meant 100fps? :steamhappy:

More on topic

2560x1440 = 3,686,400 pixels
1920x1080 = 2,073,600 pixels

3,686,400 - 2,073,600 = 1,612,800 pixels more

3,686,400 / 2,073,600 = 1.77777777

That means 1440p has around 70% more pixels than 1080p

So whatever fps you get now, I'd expect you'd get around 30% of it. I'd actually say a bit more depending on games and settings, as increasing resolution isn't purely a pixel count calculation. So I'd multiply each game's fps by 0.3 and use that as a baseline for the lowest you may expect to see.
So
100fps *0.3 = 30fps
150fps *0.3 = 45fps
200fps *0.3 = 60fps

The best way to actually test it is to use Nvidia's DSR. It'll essentially render the game at a higher resolution and then downscale it to the 1080p display. It's the best you can get without a higher resolution display.

It's in the Nvidia Control Panel under DSR Factors. I'm not sure which would be closest, so enable a few and apply them. Then go to Change Resolution and you'll see new options available. Select the closest one and apply. If none are exact, choose the one slightly higher, as it'll give you a worst-case experience.

Load the game and select the newly listed resolution option.


J4MESOX4D Jun 17, 2020 @ 6:50am 
27'' 1440p 144Hz is the general sweet spot. A 1070 is total overkill for 1080p gaming even maxed out, but 1440p is a bit of a jump, so you may have to lower some settings to maintain a strong framerate. Researching each individual game's benchmarks is a good start, but expect to make some sacrifices, especially on the latest games; 60 should be doable without drastic reductions in graphical quality.
_I_ Jun 17, 2020 @ 7:11am 
higher fps takes more cpu and gpu
higher res only needs more gpu
nullable Jun 17, 2020 @ 7:41am 
I think Suicidal Monkey covers most of the general facts about the differences, although I'm not too confident in the math. I don't think the performance scales quite as linearly as his napkin math would suggest, not universally. Although I would agree that expecting 70% more pixels to take 70% more work sounds reasonable.

Originally posted by Kiandore:
I just ordered a 1080p 25 inch, 250 hertz, IPS monitor and it is great. I am not noticing such a giant difference in game smoothness once I am above about 120fps or so.

See the law of diminishing returns.


Originally posted by Kiandore:
So I’m considering sending it back and going for a 27 inch 1440p with a lower refresh rate. My question is: if every game setting is the same, is there a way to estimate how many fewer fps I will get if I am running at a resolution of 1440p instead of 1080p?

Here are the fps I’m getting now
Borderlands 2 150-200fps
Borderlands 3 90-100
Remnant: From the Ashes 170-200
I’m hoping to get between 60-1000fps at 1440p

I have 16GB of RAM
i7
GTX 1070

What could I expect to get with FPS? Is there a way to test this before I buy the monitor?

With Borderlands 2 and Remnant: From the Ashes you'll probably be able to get 100FPS pretty easily.

Borderlands 3 will be closer to 60.

But these are rough guesses on my part. I can tell you that the wife was running a GTX 1070 with a 1440p 144Hz monitor, and most of the games she played ran fine. But she's a bit less sensitive to sub-60FPS. Final Fantasy XV and Red Dead Redemption 2 will definitely bring the card to its knees on high settings at 1440p. Given your criteria, I think you'll find after running 1440p for a bit that you'll be looking to buy a 3060/3070 when the next generation of GPUs comes out.

If you plan on using your 1070 for a while, you might stick with the 1080p monitor. Running a 1070 at 1440p is probably like running a 1050 Ti at 1080p. It works, but you'll probably wish for more power.


Last edited by nullable; Jun 17, 2020 @ 7:42am
_I_ Jun 17, 2020 @ 8:01am 
monkey's math is off
77% more pixels (1.77x)

1/1.77 = 58% not .30

100fps at 1080 = 58fps at 1440
120 to 70
150 to 87
200 to 116

for simplicity i say it's just under double the pixel count, so fps at 1440 is just over half of what you'd get at 1080 *when not cpu limited
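The pixel-ratio estimate above can be sketched in a few lines of Python. This is only the thread's rule of thumb, assuming the game is fully GPU-bound; real results vary per game and per settings:

```python
# Estimate 1440p fps from a measured 1080p fps by scaling with the
# pixel-count ratio. A rough lower-bound guess, not a benchmark:
# games are rarely perfectly fill-rate bound.

def pixels(width: int, height: int) -> int:
    return width * height

# 3,686,400 / 2,073,600 = 1.777... (~78% more pixels at 1440p)
RATIO = pixels(2560, 1440) / pixels(1920, 1080)

def estimate_1440p_fps(fps_1080p: float) -> float:
    """Divide the 1080p framerate by the pixel ratio (1/1.78 ~= 56%)."""
    return fps_1080p / RATIO

for fps in (100, 120, 150, 200):
    print(f"{fps}fps at 1080p -> ~{estimate_1440p_fps(fps):.0f}fps at 1440p")
```

The numbers land slightly below the ones in the post above because this uses the exact ratio (56.25%) rather than the rounded 58%.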
Wichtelman Jun 17, 2020 @ 8:20am 
Well, you could look for the same specs on YouTube playing the games you want to play and check how many fps they got...
upcoast Jun 17, 2020 @ 8:35am 
Tbh, I don't believe the GTX 1070 and 1440p should go together; it's simply too weak to cover 1440p well.

I have played Borderlands 2 and 3 very smoothly on a GTX 1070 at 1080p.
Supafly Jun 17, 2020 @ 9:20am 
Originally posted by _I_:
monkey's math is off
77% more pixels (1.77x)

1/1.77 = 58% not .30

100fps at 1080 = 58fps at 1440
120 to 70
150 to 87
200 to 116

for simplicity i say it's just under double the pixel count, so fps at 1440 is just over half of what you'd get at 1080 *when not cpu limited

Thanks for the correction. Math isn't always my strong suit; I don't use it for much beyond basic addition and subtraction most of the time.

EDIT: Your math is a little wrong too.
1/1.77 = 0.5649, so 56% :)

Originally posted by Brockenstein:
I think Suicidal Monkey covers most of the general facts about the differences, although I'm not too confident in the math. I don't think the performance scales quite as linearly as his napkin math would suggest, not universally. Although I would agree that expecting 70% more pixels to take 70% more work sounds reasonable.
Math was a little off, though I did say it wouldn't be 100% in parallel with the pixel difference, as there's more to fps than that. That's why I suggested OP use Nvidia DSR, as they'd get to see what fps they're likely to get at a higher resolution.
Last edited by Supafly; Jun 17, 2020 @ 9:23am
Kiandore Jun 17, 2020 @ 6:04pm 
Originally posted by Suicidal Monkey:
Originally posted by Kiandore:
I’m hoping to get between 60-1000fps at 1440p
You can get more than 1000fps in certain things, but that'd be old 2D stuff and title menus etc. I assume you meant 100fps? :steamhappy:

More on topic

2560x1440 = 3,686,400 pixels
1920x1080 = 2,073,600 pixels

3,686,400 - 2,073,600 = 1,612,800 pixels more

3,686,400 / 2,073,600 = 1.77777777

That means 1440p has around 70% more pixels than 1080p

So whatever fps you get now, I'd expect you'd get around 30% of it. I'd actually say a bit more depending on games and settings, as increasing resolution isn't purely a pixel count calculation. So I'd multiply each game's fps by 0.3 and use that as a baseline for the lowest you may expect to see.
So
100fps *0.3 = 30fps
150fps *0.3 = 45fps
200fps *0.3 = 60fps

The best way to actually test it is to use Nvidia's DSR. It'll essentially render the game at a higher resolution and then downscale it to the 1080p display. It's the best you can get without a higher resolution display.

It's in the Nvidia Control Panel under DSR Factors. I'm not sure which would be closest, so enable a few and apply them. Then go to Change Resolution and you'll see new options available. Select the closest one and apply. If none are exact, choose the one slightly higher, as it'll give you a worst-case experience.

Load the game and select the newly listed resolution option.


Thanks so much!
I tried the DSR trick and ran my games at 2K resolution, and it didn't seem to make a huge difference. Maybe 20fps less, but not bad. Hopefully it runs this well when I get my new monitor.
SoldierScar Jun 17, 2020 @ 11:05pm 
From my experience it's anywhere from a 30 to 60 fps difference depending on the game, but usually around 40 fps. A GTX 1070 is weak for a 1440p/144Hz panel; you need a new GPU. I'd get at least a 2060S/5700 XT, or wait for the RTX 3000 series; maybe the 3060 will have a good jump in performance for a reasonable price.
Last edited by SoldierScar; Jun 18, 2020 @ 12:38am

Date Posted: Jun 16, 2020 @ 8:14pm
Posts: 10