Frames per second
I'd like a little guidance on how to lock my frame rate. I understand the basic 30 and 60 FPS, but I've learned that 36 and 45 are multiples that can be used if desired. According to one author, 36 is a 50% increase over the cinematic 24, and 45 is a 50% increase over 30, which in turn is a 25% increase over the cinematic 24.

Resident Evil Village, for example, runs at 45fps on PS4, a figure the developers thought would be best for that particular console. More advanced consoles run it at 60fps. Classic Doom ran at 35fps.


I understand the equations, but not how or why they are used, or how they factor into gameplay. Can you explain them to me, please?
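For what it's worth, the percentages above are just simple ratios, and the corresponding frame times are just 1000 ms divided by the FPS. A quick sketch (plain arithmetic, nothing engine-specific):

```python
# Frame times for the rates mentioned above: 1000 ms divided by FPS.
rates = [24, 30, 36, 45, 60]
for fps in rates:
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# The ratios from the post, as plain arithmetic:
assert 36 == 24 * 1.5    # 36 is a 50% increase over cinematic 24
assert 45 == 30 * 1.5    # 45 is a 50% increase over 30
assert 30 == 24 * 1.25   # 30 is a 25% increase over 24
```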
Showing comments 16-30 of 114
Originally posted by pauldiazberrio:
was capped at 63 FPS, capped at 62.5, capped at 62
They thought 60 was enough.
Originally posted by pauldiazberrio:
Originally posted by Snakub Plissken:
Yeah, I think you're assigning weird importance to arbitrary FPS values.

I mean, let's start with cinematic 24FPS. What's so special about it? Is 24FPS some optimal viewing value they researched and arrived at? So why 24FPS?


  1. Film is expensive. Lower FPS directly translates into less film.
  2. When presented just so, 24FPS can look pretty good to human eyes. Persistence of vision and all that.
  3. But in other formats, like video games and LCD screens, 24FPS doesn't translate quite as well.

And then in the case of consoles, they tend to run at sub-60FPS or sub-30FPS not because of some inspired decision, and not because cinematic FPS is a great target. It's still just a compromise between what the user can tolerate, what the budget hardware can manage, and what the developers think is important: the trade-off between graphical quality, screen resolution, and FPS. They can obfuscate things as much as they want. If a PS4 could run Resident Evil Village at 60FPS at max settings, they'd do that. 45FPS gives them pretty good performance and pretty good visuals given the PS4's limitations.



There isn't one. Your eye isn't a camera; different parts of your eye see at different effective rates. There are a lot of variables involved. And then it's all wired up to a brain that's doing a thousand hacky things to efficiently process and approximate the data. It's why optical illusions are a thing: evidence of your brain trying its best but falling over a little bit.

I will say that more FPS is generally better. However, the concept of diminishing returns applies. The higher you go, sure, there's still some benefit, and you may be able to see it, but at some point it's just good enough. Doubling something that's already excellent doesn't have the same impact anymore. Going from 30 to 60 FPS is huge. Going from 90 to 120 FPS is a lot smaller. Because 30FPS is just barely tolerable and 90FPS is a pretty stellar experience for most games, more is nice, but you'll still be OK with only 90FPS.

And at 90FPS, if you dip into the 80s or the 70s you'll still be having a good experience. But dipping from 30FPS into the 20s or teens is a pretty mediocre experience, because you're already near the floor.
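The diminishing returns are easy to see if you look at frame times instead of frame rates: each step up shaves less absolute time off every frame. A quick back-of-the-envelope sketch:

```python
# Time saved per frame for each FPS jump: 1000/low - 1000/high, in ms.
steps = [(30, 60), (60, 90), (90, 120), (120, 240)]
for low, high in steps:
    saved = 1000 / low - 1000 / high
    print(f"{low} -> {high} fps saves {saved:.1f} ms per frame")
```

Going 30 to 60 saves about 16.7 ms per frame; going 90 to 120 saves under 3 ms, which is why the jump feels so much smaller.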

In general, I'd argue a consistent experience is best. A solid 80FPS is probably better than bouncing around between 70 and 99FPS.

At the end of the day it's all subjective and whatever feels best to you is the right answer.

A very insightful reply. Thank you for taking the time to write it. I can see that FPS in gaming is more complex than I thought.

Taking into account the need for games to be rendered at a suitable frame rate, a number of games have made a common denominator a de facto setting. Mafia (2002) was capped at 63 FPS, Doom 3 was capped at 62.5, the Batman: Arkham series up until Knight was capped at 62, and so was the first Chivalry. All these games used engines that determined that two frames over sixty offered the best balance of smoothness and graphical quality; that much is easy to see. But what was the need for those two extra frames over an already decent amount? No game that I know of in recent years has had it. Did they do something to enhance the frame rate?
Do note a couple of things that are paramount here and far overrule these finicky examples you're giving.

One, framerate is entirely subjective as to what's smoothest, acceptable, pleasing, or whatever. One person is fine with 30 FPS, another isn't. So you CANNOT put a definite figure on it except by experimenting yourself. There is NO answer there.

Two, as others have said, it doesn't matter what the maths are behind this. The simple rule is that unless you choose a framerate that divides evenly into what your GPU and monitor like, you can get tearing and other artifacts.

So find what that framerate is and find a multiple that pleases you without issues.
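If I've understood that rule correctly, the caps that pace evenly are the ones where the refresh rate is an exact multiple of the frame rate. A small sketch (the function name is mine, purely for illustration):

```python
def even_caps(refresh_hz: int) -> list[int]:
    """FPS caps where each frame is held for a whole number of refreshes."""
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]

print(even_caps(60))   # caps that pace evenly on a 60 Hz monitor
print(even_caps(144))  # caps that pace evenly on a 144 Hz monitor
```

Note that 72 shows up for a 144 Hz monitor (each frame held for exactly two refreshes), while something like 45 does not divide evenly into 60 and will pace unevenly there.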
Originally posted by crunchyfrog:
Originally posted by pauldiazberrio:

A very insightful reply. Thank you for taking the time to write it. I can see that FPS in gaming is more complex than I thought.

Taking into account the need for games to be rendered at a suitable frame rate, a number of games have made a common denominator a de facto setting. Mafia (2002) was capped at 63 FPS, Doom 3 was capped at 62.5, the Batman: Arkham series up until Knight was capped at 62, and so was the first Chivalry. All these games used engines that determined that two frames over sixty offered the best balance of smoothness and graphical quality; that much is easy to see. But what was the need for those two extra frames over an already decent amount? No game that I know of in recent years has had it. Did they do something to enhance the frame rate?
Do note a couple of things that are paramount here and far overrule these finicky examples you're giving.

One, framerate is entirely subjective as to what's smoothest, acceptable, pleasing, or whatever. One person is fine with 30 FPS, another isn't. So you CANNOT put a definite figure on it except by experimenting yourself. There is NO answer there.

Two, as others have said, it doesn't matter what the maths are behind this. The simple rule is that unless you choose a framerate that divides evenly into what your GPU and monitor like, you can get tearing and other artifacts.

So find what that framerate is and find a multiple that pleases you without issues.

The point is well taken. I have done some experiments with Afterburner and have determined that Halo Reach at 1920x1080, for example, runs at a pretty stable 70 fps, with small dips to 69 during intense action. I tried 72, which is half the refresh rate of my monitor, and while it was still good, the dips went a little lower, to 67. At 1366x768, I achieved a stable 83 FPS. I am trying to use frame rates whose frame time in ms comes as close to a round integer as possible.

Sniper Elite V2 Remastered runs great at 1080p, 91 FPS locked. I don't know what was different about that game, but it was one of the few that remained constant.

I have tested many games and my results vary wildly. For an i-4470k, GTX 1060 3GB, 16 GB RAM rig at 1080p, what would you suggest is generally the sweet spot for a high, dynamic frame rate?
Last edited by pauldiazberrio; 14 Jan 2022 @ 21:29
Originally posted by pauldiazberrio:
Originally posted by crunchyfrog:
Do note a couple of things that are paramount here and far overrule these finicky examples you're giving.

One, framerate is entirely subjective as to what's smoothest, acceptable, pleasing, or whatever. One person is fine with 30 FPS, another isn't. So you CANNOT put a definite figure on it except by experimenting yourself. There is NO answer there.

Two, as others have said, it doesn't matter what the maths are behind this. The simple rule is that unless you choose a framerate that divides evenly into what your GPU and monitor like, you can get tearing and other artifacts.

So find what that framerate is and find a multiple that pleases you without issues.

The point is well taken. I have done some experiments with Afterburner and have determined that Halo Reach at 1920x1080, for example, runs at a pretty stable 70 fps, with small dips to 69 during intense action. I tried 72, which is half the refresh rate of my monitor, and while it was still good, the dips went a little lower, to 67. At 1366x768, I achieved a stable 83 FPS. I am trying to use frame rates whose frame time in ms comes as close to a round integer as possible.

Sniper Elite V2 Remastered runs great at 1080p, 91 FPS locked. I don't know what was different about that game, but it was one of the few that remained constant.

I have tested many games and my results vary wildly. For an i-4470k, GTX 1060 3GB, 16 GB RAM rig at 1080p, what would you suggest is generally the sweet spot for a high, dynamic frame rate?


Simple - what works for you.

There is NO definite number; that's kind of the point. And we don't know your system and the software on it, so any number is only pertinent to YOUR situation.

The refresh rate of your monitor is what you should aim for, pure and simple. Worrying about one or two FPS here and there, as you're doing, is utterly pointless.

Just set it to the same as your monitor refresh rate and leave it at that.
The few FPS you claim are varying wildly simply aren't.

Are you seeing massive screen tearing at all? If not, then leave it alone. You're getting too hung up on numbers and not focusing on what matters: how it looks to YOU.
Originally posted by pauldiazberrio:
I am trying to use frame rates whose frame time in ms comes as close to a round integer as possible.
And I played the games instead... this whole time...
Originally posted by crunchyfrog:
Originally posted by pauldiazberrio:

The point is well taken. I have done some experiments with Afterburner and have determined that Halo Reach at 1920x1080, for example, runs at a pretty stable 70 fps, with small dips to 69 during intense action. I tried 72, which is half the refresh rate of my monitor, and while it was still good, the dips went a little lower, to 67. At 1366x768, I achieved a stable 83 FPS. I am trying to use frame rates whose frame time in ms comes as close to a round integer as possible.

Sniper Elite V2 Remastered runs great at 1080p, 91 FPS locked. I don't know what was different about that game, but it was one of the few that remained constant.

I have tested many games and my results vary wildly. For an i-4470k, GTX 1060 3GB, 16 GB RAM rig at 1080p, what would you suggest is generally the sweet spot for a high, dynamic frame rate?


Simple - what works for you.

There is NO definite number; that's kind of the point. And we don't know your system and the software on it, so any number is only pertinent to YOUR situation.

The refresh rate of your monitor is what you should aim for, pure and simple. Worrying about one or two FPS here and there, as you're doing, is utterly pointless.

Just set it to the same as your monitor refresh rate and leave it at that.
The few FPS you claim are varying wildly simply aren't.

Are you seeing massive screen tearing at all? If not, then leave it alone. You're getting too hung up on numbers and not focusing on what matters: how it looks to YOU.

To me, 72fps looks pretty solid. It may be the framerate I like the most, and I'll stick with it.

On a separate note, I would like to understand the process by which various developers rounded the millisecond figure. I have seen two methods: some round down to the nearest tenth, e.g. 62 FPS = 16.1 ms and 90 FPS = 11.1 ms; others, like my 144 Hz monitor, round 6.9 up to 7 milliseconds.

I am not too familiar with frame rate calculations. Between rounding the figure up and rounding it down, which process is more efficient these days?
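For what it's worth, the millisecond figures quoted in the thread are just 1000 / FPS shown with different display rounding; neither direction changes performance, it's purely how the readout truncates or rounds. A quick sketch:

```python
import math

def frame_time_ms(fps: float) -> float:
    """Exact frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

for fps in (62, 62.5, 90, 144):
    exact = frame_time_ms(fps)
    truncated = math.floor(exact * 10) / 10  # drop extra digits: 16.12 -> 16.1
    rounded = round(exact)                   # to whole ms: 6.94 -> 7
    print(f"{fps} fps: {exact:.3f} ms (truncated {truncated}, rounded {rounded})")
```

So 62 FPS really is ~16.13 ms (shown as 16.1), 62.5 FPS is exactly 16.0 ms, and 144 Hz is ~6.94 ms, which rounds up to 7.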
Last edited by pauldiazberrio; 15 Jan 2022 @ 5:47
Also, "the human eye can only see 24FPS" is a myth. I don't know why anyone would bring it up here and claim that it's real.
It's a myth, it's been tested about 45 times, and it's well known that it's a myth.
UGH.

Anything above 60FPS, your eyes might not perfectly see, but that also ties into how good your eyesight is. :P
Originally posted by Muppet among Puppets:
Originally posted by pauldiazberrio:
I am trying to use frame rates whose frame time in ms comes as close to a round integer as possible.
And I played the games instead... this whole time...

Yeah, that's what I do too. I don't even know the hotkeys/functions/whatever to show FPS...

For myself, I also have a rather old monitor -- no variable refresh rate or anything; I have no idea what the rules are for those.

For a standard 60fps monitor, 60fps will do fine. Anything above that is a complete waste of time; as far as I'm concerned, I don't care whether it's actually 60 or lower. I ran one benchmark a while back, for Tomb Raider 2013, to figure out settings for my Radeon 5770 card; nowadays I'm sporting a GTX 1060, so I don't need benchmarks like that anymore.

Technically, and if you can actually spot such differences, if you can't hit 60fps on such a monitor you'd want to run at 30fps. Why is that? Because your monitor displays 60 images per second no matter what; if your game delivers 30 images per second, the monitor will show every game image twice, giving a smoother experience than randomly showing some images once and other images twice.

The alternative is flipping images as soon as they are ready (i.e. running without VSync -- a big no-no if you want a decent experience), as that will display the old image in the upper part of the monitor and a different image in the lower part, causing horizontal tearing of the displayed image. It might increase the fps number shown, but it reduces overall output quality very noticeably... so you'd trade animation quality for a useless benchmark number.
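Kargor's point about even vs. uneven frame pacing can be sketched with a tiny toy model (my own, assuming a fixed 60 Hz display with VSync, where each refresh shows the most recent completed frame):

```python
def refreshes_per_frame(fps: int, refresh_hz: int = 60, refreshes: int = 12) -> list[int]:
    """Count how many consecutive refresh cycles each game frame stays on screen."""
    counts: dict[int, int] = {}
    for tick in range(refreshes):
        frame = tick * fps // refresh_hz   # latest frame completed by this refresh
        counts[frame] = counts.get(frame, 0) + 1
    return list(counts.values())

print(refreshes_per_frame(30))  # every frame held exactly 2 refreshes: even pacing
print(refreshes_per_frame(45))  # frames held 2,1,1,2,1,1,...: uneven, reads as judder
```

At 30fps the hold times are perfectly uniform; at 45fps on the same 60 Hz panel, some frames stay up twice as long as others, which is exactly the "some images once and other images twice" effect described above.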
Originally posted by Kargor:
Technically, and if you can actually spot such differences, if you can't hit 60fps on such a monitor you'd want to run at 30fps. Why is that? Because your monitor displays 60 images per second no matter what; if your game delivers 30 images per second, the monitor will show every game image twice, giving a smoother experience than randomly showing some images once and other images twice.
With a controller, fewer frames might be OK. But with a mouse, you want as much FPS as needed so that things on screen aren't zapping around.
It's not a universal number.

One game I played fine at 45fps. Another plays fine at 75 (and I guess 60), but is unplayable at 45.
What I've never found is a game that "needs" more than that.
Originally posted by pauldiazberrio:
Originally posted by crunchyfrog:


Simple - what works for you.

There is NO definite number; that's kind of the point. And we don't know your system and the software on it, so any number is only pertinent to YOUR situation.

The refresh rate of your monitor is what you should aim for, pure and simple. Worrying about one or two FPS here and there, as you're doing, is utterly pointless.

Just set it to the same as your monitor refresh rate and leave it at that.
The few FPS you claim are varying wildly simply aren't.

Are you seeing massive screen tearing at all? If not, then leave it alone. You're getting too hung up on numbers and not focusing on what matters: how it looks to YOU.

To me, 72fps looks pretty solid. It may be the framerate I like the most, and I'll stick with it.

On a separate note, I would like to understand the process by which various developers rounded the millisecond figure. I have seen two methods: some round down to the nearest tenth, e.g. 62 FPS = 16.1 ms and 90 FPS = 11.1 ms; others, like my 144 Hz monitor, round 6.9 up to 7 milliseconds.

I am not too familiar with frame rate calculations. Between rounding the figure up and rounding it down, which process is more efficient these days?

You'd have to ask each dev for their own personal approach to this and what their standards are.

But you will NOT get a definitive answer, because as I said before, it differs; it's not quite what you think it is. As long as it looks reasonable and plays well, that's it.

But if you want those details, hunt some people down and ask them.
I would also add to the discussion here that it definitely isn't a fixed thing, and what works for individuals is really part of it; plus each game differs, as does the hardware, creating a truly unique situation for each user.

Muppet's point about some games being good at a certain framerate, while the same rate is yuck in another, is something I've experienced too. It obviously boils down to how they code and what they tie to that framerate. I've never liked the idea of tying physics to framerate, as that can make a bloody mess (or hilarious messes, if you watch speedruns).

For myself, I've also had issues with the old screen tearing that plagued certain games around the Unreal Engine 3 era. The PS3 especially.

As I have about 7 TVs in my house (laughably, only 1 for watching TV), I've played around with this a fair bit, and I can play one game on the PS3 on a certain TV and get awful screen tearing, then none at all on another. It's obviously down to framerates and, more importantly, circuits and trickery, but it's still worth mentioning: even though things may stay the same, the results can still vary just by being on a different device.
Well, I've done a lot of experiments and I have decided that 62-125 is best for my games, depending on their resolutions and taking into account my hardware specs. I've been at it for several weeks and now I am wrapping up. It's been a tough but rewarding journey.

One last thing I would like to know is how to choose frame approximation, or frame rounding, in milliseconds. At 62 FPS the millisecond counter reads 16.1, and at 62.5 it reads 16.0. The same happens at 90 and 91, or 100 and 101: the frame time lands either just under or just over a round number, or, in the case of 100 FPS, exactly on 10.0.

Which of these two millisecond models offers better performance in modern games?
Last edited by pauldiazberrio; 2 Feb 2022 @ 14:45
Originally posted by pauldiazberrio:
Well, I've done a lot of experiments and I have decided that 62-125 is best for my games, depending on their resolutions and taking into account my hardware specs. I've been at it for several weeks and now I am wrapping up. It's been a tough but rewarding journey.

One last thing I would like to know is how to choose frame approximation, or frame rounding, in milliseconds. At 62 FPS the millisecond counter reads 16.1, and at 62.5 it reads 16.0. The same happens at 90 and 91, or 100 and 101: the frame time lands either just under or just over a round number, or, in the case of 100 FPS, exactly on 10.0.

Which of these two millisecond models offers better performance in modern games?
You should be able to tell US what is better...... after all this testing.
Originally posted by Muppet among Puppets:
Originally posted by pauldiazberrio:
Well, I've done a lot of experiments and I have decided that 62-125 is best for my games, depending on their resolutions and taking into account my hardware specs. I've been at it for several weeks and now I am wrapping up. It's been a tough but rewarding journey.

One last thing I would like to know is how to choose frame approximation, or frame rounding, in milliseconds. At 62 FPS the millisecond counter reads 16.1, and at 62.5 it reads 16.0. The same happens at 90 and 91, or 100 and 101: the frame time lands either just under or just over a round number, or, in the case of 100 FPS, exactly on 10.0.

Which of these two millisecond models offers better performance in modern games?
You should be able to tell US what is better... after all this testing.

Well, Activision believed 91 FPS at 10.9 ms was best for older games at the peak of the classic games era, to correct any stutters or spikes during heavy image processing. Infinity Ward and Treyarch were of the same opinion for some of their games too.

id Software thought Doom 3 nailed it at 62.5 FPS, 16.0 ms; other companies then contested it with 62 or even 63.

The 144Hz crowd, at 6.9 ms, follows the same logic. All this is contrasted with the 240Hz model at 4.1 ms, or 120Hz at 8.3 ms.

The new Chinese 500Hz monitor, at 2 ms, brings us to a round figure.

I understand the nature of the beast, no problem there. What I would like to know, though, is which millisecond rounding is more reliable for current, highly dynamic games.
All these numbers are just measurements of time passing.
They are not fractions of something real, just lengths of time until the next image, shorter or "longer".

Look, if it really mattered, you would know by now.

Posted: 20 Jun 2021 @ 9:43
Posts: 114