Trondheim Oct 11, 2024 @ 2:26pm
Is going above 60 fps worth it for singleplayer games?
I am planning to buy a 1440p PC for the first time. I only play singleplayer RPGs, many of which are old too. Think like 10+ years old games.

Is going above 60 fps still worth it? Or will it be overkill?

I'm partially afraid of going past 60 fps because I will never be able to go back
Showing 31-45 of 46 comments
Trondheim Oct 13, 2024 @ 1:19pm 
Originally posted by Electric Cupcake:
Originally posted by Bing Chilling:
it's worth it imo.
especially if you have gsync or freesync

60hz is fine but 144hz-165hz is the sweet spot
anything above that and you start to see very diminishing returns.

OP said frame rate, not refresh rate. Totally different metric.

What is the difference between refresh rate and frame rate? Shouldn't both be the same?
_I_ Oct 13, 2024 @ 1:38pm 
Originally posted by Decline:
Originally posted by Electric Cupcake:

OP said frame rate, not refresh rate. Totally different metric.

What is the difference between refresh rate and frame rate? Shouldn't both be the same?
no

With vsync on, and not limited by the CPU/GPU, each refresh will show a new frame.

If fps is lower than the refresh rate, some refreshes will show a duplicated frame (vsync on) or parts of two frames (vsync off), or gsync/freesync will lower the refresh rate to match the GPU's output.

If fps is higher than the refresh rate:
fast vsync / old vsync will drop frames and simply not show some of them;
newer vsync will put the GPU to idle, and not start creating the next frame until the newest completed frame is being sent to the display.

With vsync off, as soon as a frame is complete the GPU begins sending data from that frame, so you get tear lines:
if fps is lower than the refresh rate, 0 or 1 tear lines, depending on how low the fps is and where the last tear line was;
if fps is higher, then it's 1 or more tear lines on every refresh.
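A toy simulation makes the vsync-on case concrete (this is just an illustration of the timing, not any real driver's algorithm): a 60Hz display showing a game rendering at 40fps has to repeat a frame on some refreshes.

```python
# Toy simulation of vsync-on behaviour: a 60Hz display, a 40fps game.
# Each refresh shows the newest completed frame; when no new frame is
# ready, the previous one is shown again (a "duplicated frame").

REFRESH_HZ = 60
FPS = 40
SECONDS = 1

refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ * SECONDS)]
frame_done_times = [i / FPS for i in range(FPS * SECONDS)]

shown = []
for t in refresh_times:
    # index of the newest frame completed at or before this refresh
    ready = [f for f in frame_done_times if f <= t]
    shown.append(len(ready) - 1 if ready else None)

duplicates = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(f"{duplicates} of {len(shown)} refreshes repeated the previous frame")
```

With these numbers, 20 of the 60 refreshes repeat a frame, which is exactly the 60 − 40 gap between the two rates.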
r.linder Oct 13, 2024 @ 9:18pm 
Originally posted by Decline:
Originally posted by Electric Cupcake:

OP said frame rate, not refresh rate. Totally different metric.

What is the difference between refresh rate and frame rate? Shouldn't both be the same?
Refresh rate is how often the monitor refreshes the image, measured in hertz: the number of refreshes per second. So a 60Hz monitor refreshes 60 times per second, a 240Hz monitor 240 times per second.

Frame rate is an entirely different metric: it's the number of frames rendered by the system per second. You only really see that metric used in regard to video game performance.
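Since both are "per second" figures, you can compare them directly by converting each to a per-frame (or per-refresh) time; a quick sketch:

```python
# Both rates boil down to a time budget: 1000 ms divided by the rate.
def interval_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

for rate in (30, 60, 120, 144, 240):
    print(f"{rate:>3} Hz or fps -> {interval_ms(rate):.2f} ms per refresh/frame")
```

A 60fps game on a 144Hz monitor simply means the GPU finishes a frame every ~16.67 ms while the panel refreshes every ~6.94 ms, which is why the two numbers can differ.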
antoniobennett72 Oct 15, 2024 @ 8:21am 
I think we each have our own individual sensitivity.

I'm probably comfortable somewhere in the 50s for an RPG, and maybe even a locked 30 or 45 is acceptable, but I can tell the difference up to around 100 when there is a lot of movement.

If a game is twitchy, though, so far the higher the better (up to 144, which is as fast as I've been hands-on with). Not so much that I can consciously tell much of a difference over 100, but I notice less eye strain and I can play longer without getting a headache or feeling tired or even bored.

Some may always be way more sensitive than me while a few others could not tell the difference between a nice smooth 30 and a locked 144.
C1REX Oct 15, 2024 @ 8:51am 
The original Final Fantasy 7 has a 15fps lock during fights. Not 60 or even 30.
And it's kinda OK for a static camera.

Some 2D animated movies are done at 12fps.

In that context, 60 can be plenty for such games.
Rin Oct 15, 2024 @ 11:34am 
The CNS/eyes react in milliseconds, and the brain in seconds.
Last edited by Rin; Oct 15, 2024 @ 11:34am
Zefar Oct 15, 2024 @ 1:06pm 
60 FPS is not that smooth. When you've played at higher rates and are then locked to 60 FPS, you start to notice it more.

Even going up to 75 FPS made a difference for me.

So yes, going higher will be better. But I'd probably cap it around 120 FPS. Past that, the bonus is far too small compared to how much harder you'd have to make your GPU work.
r.linder Oct 15, 2024 @ 2:05pm 
You'll notice more of a difference every time you double the frame and refresh rates.
Tonepoet Oct 15, 2024 @ 4:13pm 
Something that has to be taken into consideration is that many games are frame-capped at 30 or 60, and in some games the logic might be tied to the frame rate. It's also been noted that the system recommendations for Dragon's Dogma 2 only seem to target 30 F.P.S. regardless of the setting.[x.com]

The Simpsons Hit 'n Run's physics engine just breaks if you try to run it above 60 frames per second. In Touhou Project, gameplay speed is tied to the frame rate. The intended play speed is 60 F.P.S., so if you up the frame rate to 120 F.P.S. the game moves twice as fast, making it harder to play, and at 1000 F.P.S. you'll have difficulty even getting past Rumia.

Yeah, I know those aren't R.P.G.s, but the point is you have to be aware of how the games you want to play behave when the frame rate is uncapped. If a game plays poorly, then going in excess is not only pointless but detrimental, and you'll find yourself applying a frame limiter anyway.
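The "game speed tied to frame rate" problem above is easy to show in a sketch (this is a generic illustration, not the code of any of the games mentioned): moving a fixed amount per frame makes distance scale with fps, while scaling by the frame's delta time keeps speed constant.

```python
# Frame-tied logic vs delta-time logic: same movement code run for one
# second at different frame rates.

def simulate(fps, seconds, per_frame_step=None, speed=None):
    x = 0.0
    dt = 1.0 / fps  # duration of one frame in seconds
    for _ in range(int(fps * seconds)):
        if per_frame_step is not None:
            x += per_frame_step      # frame-tied: fixed step per frame
        else:
            x += speed * dt          # frame-independent: units per second
    return x

print(simulate(60, 1, per_frame_step=1.0))   # 60.0 units in one second
print(simulate(120, 1, per_frame_step=1.0))  # 120.0 units: twice as fast
print(simulate(60, 1, speed=60.0))           # ~60 units regardless of fps
print(simulate(120, 1, speed=60.0))          # ~60 units again
```

The first pair is the Touhou situation: double the frame rate, double the game speed. Engines that multiply by delta time (the second pair) don't have that problem.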

I'm not sure how your specific library of games will respond, but it's something to keep in mind, especially when playing older games. High refresh rate monitors weren't really standard issue prior to 2013, which is when the 120hz Asus VG236H hit the market, and even then it took some time for the market to adapt.

I mean, yeah, you could trade off the resolution fed into a C.R.T. to increase the refresh rate. The Iiyama Vision Master Pro 512[www.tweaktown.com] can pull off 500hz if you're willing to play at 320p. But that's beside the point, since I doubt this was especially common practice. I think most people were content with 60hz, with some targeting 75hz or 80hz if they were sensitive to flicker. Some hardcore competitive versus players might've targeted higher refresh rates to gain an edge over more casual players, but we're talking about single player here.

But also, 10+ year old games have softball system requirements, so even a low-cost system should be able to hit 120+ F.P.S. pretty easily.
Last edited by Tonepoet; Oct 15, 2024 @ 4:26pm
Bravo Phantom Oct 16, 2024 @ 12:06am 
I would suggest everyone stop looking at the fps meter and just turn it off.
Crank all settings to Ultra or Extra. If the game feels a bit laggy (don't look at the fps meter), then turn the graphics down to High or Medium.

That way the game looks amazing as well as smooth; it's largely human psychology that makes something look smooth or great in quality :bfpac:
Sigma957 Oct 16, 2024 @ 12:59am 
Blur or afterimage becomes less noticeable once you pass 30 FPS. At 60 FPS, when you play back video filmed at 60 FPS, the image looks more "present". Going even higher, all video, even if shot on film, no longer looks like film. Does that make sense? It has a presence almost as if it were shot on videotape. At higher framerates it has a videotape studio-set feeling to it.
Zefar Oct 16, 2024 @ 7:29am 
Originally posted by Bravo Phantom:
I would suggest everyone stop looking at the fps meter and just turn it off.
Crank all settings to Ultra or Extra. If the game feels a bit laggy (don't look at the fps meter), then turn the graphics down to High or Medium.

That way the game looks amazing as well as smooth; it's largely human psychology that makes something look smooth or great in quality :bfpac:

No, it's literally the FPS counter that shows whether the game will perform badly.

If you have a steady, capped FPS you'll notice how smooth it is. But if the FPS jumps around anywhere from 20 to 60, you will have a bad time.
Bravo Phantom Oct 16, 2024 @ 7:41am 
Originally posted by Zefar:
Originally posted by Bravo Phantom:
I would suggest everyone stop looking at the fps meter and just turn it off.
Crank all settings to Ultra or Extra. If the game feels a bit laggy (don't look at the fps meter), then turn the graphics down to High or Medium.

That way the game looks amazing as well as smooth; it's largely human psychology that makes something look smooth or great in quality :bfpac:

No, it's literally the FPS counter that shows whether the game will perform badly.

If you have a steady, capped FPS you'll notice how smooth it is. But if the FPS jumps around anywhere from 20 to 60, you will have a bad time.
You're right. But that's only a case for debugging and testing. I was referring to keeping the FPS bar always on. Once all our settings are set and we're happy, there's no need to ever turn the fps bar on again :bfpac:
r.linder Oct 16, 2024 @ 7:53pm 
You only need the FPS counter when you actually have a use for it; there's no point worrying about it when your performance is fine.

By itself it's also of limited use, because it won't tell you why your performance is bad; for that you need utilisation stats, frametimes, 1% and 0.1% lows, temperatures, etc.
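A small sketch with made-up frame times shows why the plain counter hides stutter: two runs with the same average fps can have very different 1% lows.

```python
# Made-up frame times (milliseconds) illustrating average fps vs 1% lows.

def one_percent_low_fps(frame_times_ms):
    # fps of the slowest 1% of frames (the single worst frame
    # when there are fewer than 200 samples)
    n = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n:]
    return 1000.0 / (sum(worst) / len(worst))

steady = [16.7] * 100                # locked at roughly 60 fps
stutter = [15.0] * 95 + [49.0] * 5   # same average, with big spikes

for name, times in (("steady", steady), ("stutter", stutter)):
    avg_fps = 1000.0 / (sum(times) / len(times))
    print(f"{name}: avg {avg_fps:.0f} fps, "
          f"1% low {one_percent_low_fps(times):.0f} fps")
```

Both runs average about 60 fps, but the stuttery one has a 1% low around 20 fps, which is exactly the kind of difference an fps counter alone won't show you.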
Last edited by r.linder; Oct 16, 2024 @ 7:54pm
Originally posted by Decline:
I am planning to buy a 1440p PC for the first time. I only play singleplayer RPGs, many of which are old too. Think like 10+ years old games.

Is going above 60 fps still worth it? Or will it be overkill?

I'm partially afraid of going past 60 fps because I will never be able to go back
Yes it is, but you need to get a monitor that supports higher refresh rates for it to really matter and give a perceivable difference. JUST KNOW THIS: ONCE YOU EXPERIENCE THE DIFFERENCE YOU CAN NEVER GO BACK... For me this is a curse, because after getting 240hz and a constant 240fps I CRAVE IT but can't reach those framerates in most games.
Last edited by 🍕🍟🍔 🍕🍟🍔; Oct 17, 2024 @ 1:10am

Date Posted: Oct 11, 2024 @ 2:26pm
Posts: 46