Starfield

WeelieTired Feb 19, 2024 @ 10:00am
High Frame Rates: why?
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second...
then, why do gamers always want super high frame rates?
Like is Starfield capped at 32 FPS? Is this the reason?
What exactly are you guys seeing that "requires" spending thousand$ on gear to get higher frames?
If it's related to the Hz refresh rate of your screen, shouldn't setting one at a multiple of the other sync them up and solve any problems, even on less expensive rigs?
Like, does 32 FPS on a 64 or 128 Hz screen = no stuttering?
If the game in question bogs down and doesn't send an FPS that matches up to your screen's refresh rate, is that really such a big deal as to inspire online rants at the game company? (Tarkov, lol)
I'm looking for helpful techy answers so maybe I can tweak a setting and see the best I can humanly see anyway, for free, thanks.
Or, help me justify the easy thousand it will cost to get my next rig, plus RAM sets and a video card...
I once had a ridiculous, maxed-out rig, a "super computer," best in my gamer guild, for which I could have bought a car. I really don't want to do that again if my eyes can't see any faster anyway.
Mephitic Feb 19, 2024 @ 10:10am 
Starfield is not capped at 32 fps on PCs. Mine runs at a nominal 144 fps (the refresh rate of my display).

You may not see a flicker at 30 fps but I find it completely unacceptable. Anything less than 60 fps is not really playable for me. I can tell the difference between 60 and 120 fps too. It is different from person to person.

If you don't see the difference, then the expense of high end hardware may not work for you. Again, it is very subjective and personal. You must decide for yourself.
Liquid Inc Feb 19, 2024 @ 10:33am 
Originally posted by WeelieTired:
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second max...

It's actually at around ~70 fps that visuals look smooth to the "average" human. Higher might give a little extra smoothness, but you get progressively less reward the higher you go. Above 144 fps, you'll see little if any difference from what you saw at 70.

Some people have higher perception and can see a difference at higher framerates. They might see "choppiness" where others would not, and closer to 100 is where they can no longer tell the difference.
And likewise, some see no benefit to anything above 60 fps.

Edit: Of course, if your computer can't keep the framerate steady, you'll always have choppiness as it fluctuates about. That's not an eye problem though.

Originally posted by WeelieTired:
Like is Starfield capped at 32 FPS? Is this the reason?

Starfield is capped at 144 fps by default; if you uncap it, it's free to run at whatever manual cap you set instead.

Originally posted by WeelieTired:
What exactly are you guys possibly seeing that "requires" spending thousand$ on gear to get higher frames?

Framerates that are too low or choppy can give people eye strain. Eyes are most certainly worth the pennies, IMHO.
Last edited by Liquid Inc; Feb 19, 2024 @ 10:35am
Shalanor Feb 19, 2024 @ 10:39am 
Originally posted by Mephitic:
Starfield is not capped at 32 fps on PCs. Mine runs at a nominal 144 fps (the refresh rate of my display).

You may not see a flicker at 30 fps but I find it completely unacceptable. Anything less than 60 fps is not really playable for me. I can tell the difference between 60 and 120 fps too. It is different from person to person.

If you don't see the difference, then the expense of high end hardware may not work for you. Again, it is very subjective and personal. You must decide for yourself.


Everyone says it: "Yes, I can see it!" It's not even possible, but they can see it...
When I take a game, limit it to 40 fps and tell you it's 130, you believe me, because you see what you want to see. And you'd see a difference from another test where I say it's 30, because you want to. That's how our brains work :). Everyone tastes the cheap beer... it's all the same :)
GoldInfinit7 Feb 19, 2024 @ 11:11am 
I want a stable frame rate at the highest graphics settings I can get. A stable, locked 60 FPS is perfectly fine for me. When I'm running at 60 FPS and it drops to 40 FPS, I notice it.

Perhaps if you're running at 30 FPS you can get used to it. But if you're running at 60 FPS and it drops to 30 FPS, you will notice it.
Boris Feb 19, 2024 @ 11:17am 
My first 4K TV only gave me 30 Hz/FPS at 3840x2160, but it had a gaming mode. I noticed no blurriness while moving, but the people in Starfield walking across in front of me showed ghosting, like seeing 2-3 faces of one person at the same time.
Now I'm at 3840x2160 and 100 Hz/FPS and it's much better, but far from perfect. My graphics card now has to deliver many more frames per second, and I don't know whether the power consumption is really worth it.
The 30 FPS I got years ago with Oblivion came down to my "bad" PC, and it felt a bit like a slide show, while the 30 Hz on my first 4K TV felt fluid, with enough PC power behind it.
At more than 100 Hz the colours on my new TV got a bit worse, so I settled on 100 Hz, but is it worth it... hmm.
Shawn Feb 19, 2024 @ 11:50am 
Originally posted by WeelieTired:
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second max...
then, why do gamers always want super high frame rates?
Like is Starfield capped at 32 FPS? Is this the reason?

Well, first, humans can see up to around 60 FPS, so lots of us (me included) can't play at 30 FPS; everything just looks like a slide show. The difference between 30 and 60 is most certainly noticeable to most humans. From 60 to 120 is where it starts to drop off, and most people don't actually see a difference above 120; only a very small percentage will claim they can. These are the same people who claim they can see all the detail in "8K", which is just BS. The difference between 4K and 8K is great on paper, but can we actually see it? At best you'd notice the very tiniest, most subtle difference.

You wouldn't believe how many people I hear scream that they can see 244 FPS or something silly, and then later find out their monitor is locked at a 60 Hz refresh rate (meaning it's literally impossible for the monitor to show anything higher than 60 fps, lol).


TLDR:
30 to 60 FPS is the most noticeable difference humans can see.




"help me justify the easy thousand it will cost to get my next rig, plus ram sets and video card..."

ya the "start up" cost can cost you about $1,300. but remember you don't need to replace the whole machine ever x years like you do consoles so ten years down the line you might want a new GPU you can spend about $300 (or less if you shop around) and just upgrade that and carry on as opposed to having spend about $700 on new a console) add bonus you don;t lose access to your games! you will never have to worry about "oh i hope this console is backwards compatible"

alot of console player will tell you you have to update your computer ever few years... no you really don't if you always want the best of the best then yes.. but really you don't need to, a decent built computer from the get go can go a long way, so basically only update/upgrade when you have

and if you aren't doing "pre-builts" (and you really shouldn't) you can save hundreds of dollars on building your rig, avoid anything that says "gamer" and things with glass or leds congrats you just save about $600, example my computer case costs about $120 for the same exact case just with a "glass" door its $300. products marked with "gamer" is usually just normal middle range items slapped with 100s of led flashing lights to upsell for an extra $50 or so..

you could get away with a decent $1,100 build to run this game 60+ fps you most certainly won't be able to run it in 4k, but unless your part of the %1 that cares about that or have a monitor that even support 4k then who cares.
Jᴧgᴧ (Banned) Feb 19, 2024 @ 11:12pm 
Originally posted by WeelieTired:
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second max...
then, why do gamers always want super high frame rates?

Your assumption that the human eye/brain can only perceive 30 FPS is the problem, in truth. Movies and TV have a traditional limit of 24 FPS - that's older technology simply carrying forward. What they do to make it appear smooth is build blurring directly into the images. Pause a movie with a lot of action and you'll see the blur clearly.

PCs and games on digital monitors, by comparison, use full frames, which also rely on refresh rate. Over the last ~20 years that's been 60 FPS (30 for consoles, due to more limited hardware). In just the last ~7 years, more powerful systems have gone up to 144 Hz refresh rates, which are noticeably smoother than 60 fps for anyone using them. A lot of phones have that higher refresh rate too, which is what makes their UI seem so responsive and clean.

With non-blurred images (i.e. games on a PC), 60 frames per second is a sweet spot between smooth motion and exorbitant cost, which is probably why it still persists as the norm today.

The human eye can actually see beyond a 144 Hz refresh rate, but people able to discern a difference above ~100-120 frames per second are a lot rarer.
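For a sense of scale, here's a rough back-of-the-envelope sketch (plain Python, nothing Starfield-specific, using only the frame rates mentioned above) of how long each frame sits on screen: the jump from 30 to 60 fps saves about 17 ms per frame, while going from 120 to 144 fps saves under 2 ms, which is one way to see the diminishing returns.

```python
# Frame time (how long a single frame is displayed) at common frame rates.
for fps in (24, 30, 60, 120, 144):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
# 24 -> 41.7 ms, 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 144 -> 6.9 ms
```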

Hope that helps some.
KrisG Feb 19, 2024 @ 11:31pm 
Your eyes only "sample" the world at a certain framerate. You would only perceive 30fps as smooth if your eyes were able to scan the screen at exactly 30fps. But it's never exactly 30fps, in fact, we don't know what the sampling framerate of human eye is. Due to the stroboscopic effect...

https://en.wikipedia.org/wiki/Stroboscopic_effect

...we will perceive the 30fps as choppy. The higher the fps the less noticeable the effect becomes and your eye isn't able to distinguish between framerates at higher levels.
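One way to put rough numbers on that choppiness: the judder you notice scales with how far a moving object jumps between consecutive frames. A minimal sketch, assuming (purely for illustration) an object crossing a 1920-pixel-wide screen in one second:

```python
# Per-frame displacement of an object crossing a 1920 px wide screen in 1 s.
# Bigger jumps between frames read as judder; smaller jumps read as smooth motion.
SCREEN_WIDTH_PX = 1920             # assumed screen width
SPEED_PX_PER_S = SCREEN_WIDTH_PX   # assumed speed: one screen width per second

for fps in (30, 60, 120, 144):
    step_px = SPEED_PX_PER_S / fps
    print(f"{fps:>3} fps -> jumps {step_px:5.1f} px per frame")
# 30 fps -> 64 px jumps, 60 -> 32 px, 120 -> 16 px, 144 -> ~13.3 px
```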
Emphoise Feb 20, 2024 @ 7:34am 
TL;DR: high framerates DO noticeably change the quality and smoothness of what you see on screen.

Ask a neuroscientist for an in-depth explanation.
Last edited by Emphoise; Feb 20, 2024 @ 7:39am
Emphoise Feb 20, 2024 @ 7:35am 
Originally posted by Shalanor:
Originally posted by Mephitic:
Starfield is not capped at 32 fps on PCs. Mine runs at a nominal 144 fps (the refresh rate of my display).

You may not see a flicker at 30 fps but I find it completely unacceptable. Anything less than 60 fps is not really playable for me. I can tell the difference between 60 and 120 fps too. It is different from person to person.

If you don't see the difference, then the expense of high end hardware may not work for you. Again, it is very subjective and personal. You must decide for yourself.


Everyone says it. Yes i can this! It's not even possible but i can see it..
When i take a Game and limit it to 40 and say it's 130..... You believe me. Cause you wanna see what you want to see! And you see a difference to another test where i say its 30..... Cause you want. This is like our brain works :). Everyone taste the cheap beer.... It's all the same :)

I can definitely tell whether, for example, I have vsync on or FPS uncapped in certain games, without checking the settings or displaying the FPS.
Last edited by Emphoise; Feb 20, 2024 @ 7:35am
Lazarus {FATE} Feb 20, 2024 @ 8:54am 
Originally posted by WeelieTired:
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second max...
then, why do gamers always want super high frame rates?
Like is Starfield capped at 32 FPS? Is this the reason?
What exactly are you guys possibly seeing that "requires" spending thousand$ on gear to get higher frames?
If it's related to the Hrz refresh rate of your screen, should setting the one at a multiple of the other synch that up and solve any problems? Even on less expensive rigs?
Like, does 32 FPS and 64 or 128 Hrz = no stuttering?
If the game in question bogs down and doesnt send an FPS rate that matches up to the refresh rate of your screen, is that such a big deal as to inspire online rants at the game company? (tarkov, lol)
I'm looking for helpful techy answers so maybe I can tweak a setting and see the best I can humanly see anyway, for free, thanks.
Or, help me justify the easy thousand it will cost to get my next rig, plus ram sets and video card...
I once had a ridiculous maxed out rig, a "super computer," best in my gamer guild, for which I could have bought a car. I really don't want to do that again if my eyes can't see any faster anyway.

https://www.linkedin.com/advice/0/what-pros-cons-high-fps-vs-low-gaming-skills-pc-building

Not the best article but might help.
Last edited by Lazarus {FATE}; Feb 20, 2024 @ 8:54am
WeelieTired Feb 20, 2024 @ 11:12am 
Originally posted by Talking with Tards:
https://www.linkedin.com/advice/0/what-pros-cons-high-fps-vs-low-gaming-skills-pc-building
Not the best article but might help.

Thanks. I think you're on to something.
This really answers my questions about the human side...
https://www.healthline.com/health/human-eye-fps

Looks like Hollywood propaganda (lol) about why they film and show ~30 FPS is behind my original thinking.
I recall the tales of Bruce Lee being too fast to film in The Green Hornet. It looked like he kinda flinched and thugs fell over around him, because he moved so fast the 24 FPS cameras/film couldn't really show it. So they went to 30 FPS and yeah, he's still a blur.
Well, film still isn't much beyond that. The camera mechanisms need that much time to work and the film has to roll through the machine... so it's no faster, still.
Digital cameras, sure, higher is possible, but they need digital projectors in the theater, which most now have.

As to our eyes, it seems we top out at 60-ish, with maybe some people reaching 75 FPS in their ability to process images.
Perceiving flicker when something is shown at higher or desynced rates, if you're one of the few who can notice it, is possible...
But since your eyes and brain can't resolve and process images at that rate, it's a comfort thing. Ending flicker doesn't mean you're suddenly able to see 144 individual frames just because the game and system can show them; no, that flicker you might see is all there is.
Maybe if I lock in 60 FPS with V-sync, everything on screen stays rock solid? The rare ones with expensive rigs might get something out of matching the screen refresh rate to the FPS, just no flicker, but it's really not possible to see and process that any better than the maybe 60-75 FPS humans can manage. Avoiding the headache (or whatever) you get from flicker is a worthwhile goal, sure.
I'm just not willing to triple the cost of my rig for that when I can just lock rates and use V-sync.

Conclusion:
Short: I'm going to focus on setting 60 FPS everywhere, plus V-sync, and call it good.
Long: Build a system I can upgrade in the future if I ever have issues perceiving flicker.

Thanks guys!
Last edited by WeelieTired; Feb 20, 2024 @ 11:30am
WeelieTired Feb 20, 2024 @ 11:21am 
Thanks to everyone! I appreciate all the different perspectives, each a bit of helpful advice and knowledge on this. I learned a lot, and I can work on my current settings and build a future rig off this advice. S-tier thread, guys! GG, thanks.
I shot you all some points for each positive, helpful reply. Thanks again.
Tahnval Feb 20, 2024 @ 11:59am 
Originally posted by WeelieTired:
Can someone Gamersplain to me why, when the human eye/brain has a max "frame rate" or "shutter speed" of perception of about 30 frames per second max...
then, why do gamers always want super high frame rates? [..]

It's been proven by experiment that humans are capable of reliably identifying images seen for 1/240th of a second. Not just seeing that an image was there, but seeing it in sufficient detail to identify it. The earliest testing was done by the USAF. Test subjects were able to identify the model of plane, the insignia showing which country it was from, etc.

I'm not saying that 240 fps is needed for gaming, just that the claim that anything above 30 fps makes no difference is untrue. It's a myth. It's a myth that was debunked decades ago, but it still persists. The first video directly showing it that I recall was late 1990s, with two cubes being rendered side by side, one shown at 30fps and one at 60fps. The difference was noticeable and that was with the relatively crude 3D graphics of the late 1990s.

30 fps is more of a minimum than a maximum. The human brain is extremely good at visual processing and can do a remarkable job of stitching individual static images together to interpret them as a moving image, but there are limits. At 30 fps the illusion of movement is noticeably imperfect but mostly tolerable, generally. Not good, but probably playable. Especially for someone who hasn't played at a higher framerate.

Then there's the issue of effective update speed of the gameworld. The higher the framerate, the earlier the player will see any changes in the gameworld. That's extremely important in high speed games played competitively at a high level, which is why people who do that will trade even resolution for framerate.

I find that I can tell the difference between 75 fps (which is the framerate I'm used to) and 60 fps. But only just. Maybe if I was used to a framerate higher than 75 fps the difference between that higher framerate and 60 fps would be more noticeable to me.



Originally posted by WeelieTired:
[..]
So I'm going to focus on that. 60 FPS selected everywhere and Vsync and call it good.[..]

Variable refresh rate is better than vsync unless you can guarantee your framerate will never go below the maximum refresh rate of your monitor or your monitor doesn't support variable refresh rate (it almost certainly will - VRR has been standard for a fair few years now). Also, locking your framerate to 60 fps and turning vsync on will be worse if your monitor has a refresh rate of anything other than 60 Hz.
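To put rough numbers on that last point: with plain vsync, each frame has to be held for a whole number of refresh cycles, so a 60 fps cap on, say, a 144 Hz panel can't be displayed evenly; frames end up held for two or three refreshes in a mix, which is exactly the uneven pacing described above and which variable refresh rate avoids by refreshing whenever a frame is ready. A minimal sketch of that arithmetic (plain Python; the 144 Hz figure is just an example):

```python
# Frame pacing with vsync: each frame is held for a whole number of refresh
# cycles, so a 60 fps cap on a 144 Hz panel cannot be displayed evenly.
REFRESH_HZ = 144   # example monitor refresh rate
CAP_FPS = 60       # example framerate cap with vsync on
refresh_ms = 1000.0 / REFRESH_HZ

# Refresh (vblank) index at which frame i is first displayed: frame i is
# ready at i/CAP_FPS seconds, i.e. after i*REFRESH_HZ/CAP_FPS refreshes,
# rounded up to the next whole refresh (integer ceiling division).
shown_at = [(i * REFRESH_HZ + CAP_FPS - 1) // CAP_FPS for i in range(9)]

# How long each frame actually stays on screen, in milliseconds.
hold_ms = [(b - a) * refresh_ms for a, b in zip(shown_at, shown_at[1:])]
print([round(t, 2) for t in hold_ms])
# -> a mix of 13.89 ms and 20.83 ms holds instead of a steady 16.67 ms
```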


Variable refresh rate is given different names by different sources:

Adaptive sync is the name given by the relevant standards body for the implementation of VRR over the DisplayPort connection standard. It's also often used for VRR as a whole.

Nvidia has a proprietary version they call G-Sync.

AMD has an open version they call FreeSync.

That's now the dominant standard, and Nvidia supports it too, but Nvidia refuses to use the name and instead calls it "G-Sync Compatible", which is a straight-up misnomer because it's a different standard, not G-Sync at all. Nvidia created G-Sync to be proprietary, so they could control it and thus increase their power in the industry if it became the dominant standard. Truth doesn't matter in business.

Intel calls it adaptive sync.

Here's a link to a handy summary guide to VRR:

https://www.howtogeek.com/793199/what-is-displayport-adaptive-sync/
Lowe0 Feb 20, 2024 @ 12:40pm 
Part of the problem is how many different metrics there are for "how fast can humans see". At the low end, you have the 24 fps number, which is for motion - below that, you start to perceive the individual frame instead. The higher numbers are for recognition - for example, they show a fighter pilot a single frame of a silhouette, and ask what they saw.

Human vision is a mix of complex processes performing different tasks, and not all of them have the same "rate".