You may not see a flicker at 30 fps but I find it completely unacceptable. Anything less than 60 fps is not really playable for me. I can tell the difference between 60 and 120 fps too. It is different from person to person.
If you don't see the difference, then the expense of high-end hardware may not be worth it for you. Again, it is very subjective and personal. You must decide for yourself.
It's actually around ~70 fps where the "average" human perceives visuals as smooth. For the average person, going higher might give a little extra smoothness, but you'll get progressively less reward the higher you go. Above 144 fps, you'll see little if any difference from what you saw at 70.
Some people have higher perception and can see a difference at higher framerates. They might see "choppiness" where others would not, and closer to 100 fps is where they can no longer tell the difference.
And likewise, some see no benefits to anything above 60fps.
Edit: Of course, if your computer can't hold a steady rate, you'll always have choppiness as the framerate fluctuates. That's not an eye problem, though.
Starfield is capped at 144, unless you uncap it, and then it's free to hit whatever manual cap you set.
Too low/choppy gameplay can give people eye strain. Eyes are most certainly worth the pennies IMHO.
Everyone says it: "Yes, I can see it!" It's not even possible, but they can see it...
If I take a game, limit it to 40 fps and tell you it's 130... you'll believe me, because you see what you want to see. And you'd see a difference from another test where I say it's 30... because you want to. That's how our brains work :). Everyone claims to taste the difference in cheap beer... it's all the same :)
Perhaps if you're running 30 FPS, you can get used to it. But if you're running 60 FPS, and then it drops to 30 FPS, you will notice it.
Now I'm at 3840x2160 with 100 Hz/FPS and it's much better, but far from perfect. My graphics card has to deliver many more frames per second now, and I don't know/feel that the power consumption is really worth it.
The 30 FPS I got years ago with Oblivion was down to my "bad" PC, and the picture felt a bit like a slide show, while the 30 Hz on my first 4K TV felt fluid, with enough PC power behind it.
At more than 100 Hz on my new TV the colours got a bit worse, so I settled on 100 Hz, but is it worth it... hmm.
Well, first: humans can see up to around 60 FPS, so lots of us (me included) can't play at 30 FPS; everything just looks like a slide show. The difference between 30 and 60 is most certainly noticeable to most humans. 60 to 120 is where it starts to drop off, and most people don't actually see a difference above 120; only a very small percentage of humans will claim they can see a noticeable difference. These are the same people who claim they can see all the detail in "8K", which is just BS. The difference between 4K and 8K is great on paper, but what can we actually see? At best you'd notice the very tiniest, most subtle difference.
You wouldn't believe how many people I hear scream that they can see 244 FPS or something silly, then later find out their monitor is locked at a 60 Hz refresh rate (meaning it's literally impossible for the monitor to show anything higher than 60 FPS, lol).
TLDR:
30 to 60 FPS is the most noticeable difference humans can see
"help me justify the easy thousand it will cost to get my next rig, plus ram sets and video card..."
ya the "start up" cost can cost you about $1,300. but remember you don't need to replace the whole machine ever x years like you do consoles so ten years down the line you might want a new GPU you can spend about $300 (or less if you shop around) and just upgrade that and carry on as opposed to having spend about $700 on new a console) add bonus you don;t lose access to your games! you will never have to worry about "oh i hope this console is backwards compatible"
alot of console player will tell you you have to update your computer ever few years... no you really don't if you always want the best of the best then yes.. but really you don't need to, a decent built computer from the get go can go a long way, so basically only update/upgrade when you have
and if you aren't doing "pre-builts" (and you really shouldn't) you can save hundreds of dollars on building your rig, avoid anything that says "gamer" and things with glass or leds congrats you just save about $600, example my computer case costs about $120 for the same exact case just with a "glass" door its $300. products marked with "gamer" is usually just normal middle range items slapped with 100s of led flashing lights to upsell for an extra $50 or so..
you could get away with a decent $1,100 build to run this game 60+ fps you most certainly won't be able to run it in 4k, but unless your part of the %1 that cares about that or have a monitor that even support 4k then who cares.
Your assumption about the human eye/brain only being able to perceive 30 FPS is the problem, in truth. Movies and TV have a traditional limit of 24 FPS - that's backwards technology simply carrying forward. What they do to make it appear smooth, is build blurring directly into the images. Pause a movie with a lot of action and you'll see the blur clearly.
PCs and games on digital monitors, by comparison, use full frames, which also depend on refresh rate. Over the last ~20 years that's been 60 FPS (30 for consoles due to hardware limitations). More powerful systems, just in the last ~7 years, go up to a 144 Hz refresh rate and are noticeably smoother than 60 FPS for anyone using them. A lot of phones have that higher refresh rate, which is what makes their UI feel so responsive and clean.
With non-blurred images (ie games on a PC), 60 frames per second is a sweet spot between smooth motion, and exorbitant cost, which is why it probably still persists today as the norm.
The human eye can actually see beyond a 144 Hz refresh rate, but the number of people actually able to discern the difference above ~100-120 frames per second is much smaller.
Hope that helps some.
https://en.wikipedia.org/wiki/Stroboscopic_effect
...we will perceive the 30fps as choppy. The higher the fps the less noticeable the effect becomes and your eye isn't able to distinguish between framerates at higher levels.
Ask a neuro scientist for in depth explanations.
I can definitely tell whether, for example, I have vsync on or uncapped FPS in certain games, without checking the settings or displaying the FPS.
https://www.linkedin.com/advice/0/what-pros-cons-high-fps-vs-low-gaming-skills-pc-building
Not the best article but might help.
Thanks. I think you're on to something.
This really answers my questions about the human side...
https://www.healthline.com/health/human-eye-fps
Looks like Hollywood propaganda, lol, about why they film and show 30 FPS was behind my original thinking.
I recall the tales of Bruce Lee being too fast to film in The Green Hornet. It looked like he kind of flinched and thugs fell over around him, because he moved so fast the 24 FPS cameras/film couldn't really show it. So they went to 30 FPS, and yeah, he's still a blur.
Well, they're not beyond that on film. The camera mechanisms need that much time to work and the film has to roll through the machine... so film is still no faster.
Digital cameras, sure. Higher is possible, but they need digital projectors in the theater, which most do also have now.
As for our eyes, it seems we top out at 60-ish? Maybe some people reach 75 FPS max in their ability to process images.
Perceiving flicker when something is shown to us at higher or desynced rates is possible, if we're one of the few who can perceive that flicker at a higher rate...
But since your eyes and brain are not able to resolve and process images at that rate, it's a comfort thing. Ending flicker doesn't mean you'd suddenly be able to see frames at 144 if only the game and system could show them to you... no. That flicker you might see is all there is.
Maybe by locking in 60 FPS with V-sync, everything on screen stays rock solid? The rare ones with expensive rigs might get something out of locking the screen refresh rate to the FPS, just no flickering, but it's really not possible to SEE and process that any better than the maybe 60-75 FPS human ability to process images. Avoiding the headache or whatever you get from flickering is a goal, sure.
I'm just not willing to triple the cost of my rig for that when I can just lock rates and use Vsync.
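For anyone curious what "locking rates" actually does under the hood, here's a minimal sketch of a software frame cap, assuming a simple single-threaded render loop (the `render_frame` callback and the numbers are placeholders for illustration, not taken from any real engine):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame at 60 fps

def run_capped(render_frame, num_frames=600):
    """Call render_frame repeatedly, sleeping so we never exceed TARGET_FPS."""
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        next_deadline += FRAME_TIME
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            # Sleep off whatever is left of this frame's time budget.
            time.sleep(remaining)
        else:
            # Frame took too long; resync so we don't "catch up" in a burst.
            next_deadline = time.perf_counter()
```

Vsync achieves a similar cadence in hardware by holding each finished frame until the monitor's next refresh, so on a 60 Hz panel you end up with the same ~16.7 ms rhythm without the sleep bookkeeping.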
Conclusion:
Short: I'm going to set 60 FPS everywhere, turn on Vsync, and call it good.
Long: Build a system I can upgrade in the future if I have any issues with perceiving flicker.
Thanks guys!
I shot you all some points for each positive helpful reply. Thanks again.
It's been proven by experiment that humans are capable of reliably identifying images seen for 1/240th of a second. Not just seeing that the image was there, but seeing it in sufficient detail to identify it. The earliest testing was done by the USAF: test subjects were able to identify the model of plane, the insignia showing which country it was from, and so on.
I'm not saying that 240 fps is needed for gaming, just that the claim that anything above 30 fps makes no difference is untrue. It's a myth. It's a myth that was debunked decades ago, but it still persists. The first video directly showing it that I recall was late 1990s, with two cubes being rendered side by side, one shown at 30fps and one at 60fps. The difference was noticeable and that was with the relatively crude 3D graphics of the late 1990s.
30 fps is more of a minimum than a maximum. The human brain is extremely good at visual processing and can do a remarkable job of stitching individual static images together to interpret them as a moving image, but there are limits. At 30 fps the illusion of movement is noticeably imperfect but mostly tolerable, generally. Not good, but probably playable. Especially for someone who hasn't played at a higher framerate.
Then there's the issue of effective update speed of the gameworld. The higher the framerate, the earlier the player will see any changes in the gameworld. That's extremely important in high speed games played competitively at a high level, which is why people who do that will trade even resolution for framerate.
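To put rough numbers on that (a back-of-the-envelope illustration only, ignoring input, render, and display latency, which add on top), the gap between visible gameworld updates is simply 1000/fps milliseconds:

```python
# Worst-case delay before a change in the gameworld can appear on screen,
# considering only the frame interval itself (other latencies not included).
for fps in (30, 60, 75, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms between frames")
```

So 30 fps leaves up to ~33 ms between visible updates while 144 fps leaves about 7 ms, which is why competitive players will trade resolution for framerate.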
I find that I can tell the difference between 75 fps (which is the framerate I'm used to) and 60 fps. But only just. Maybe if I was used to a framerate higher than 75 fps the difference between that higher framerate and 60 fps would be more noticeable to me.
Variable refresh rate is better than vsync unless you can guarantee your framerate will never go below the maximum refresh rate of your monitor or your monitor doesn't support variable refresh rate (it almost certainly will - VRR has been standard for a fair few years now). Also, locking your framerate to 60 fps and turning vsync on will be worse if your monitor has a refresh rate of anything other than 60 Hz.
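A rough sketch of that last point (illustrative numbers, not measurements): with vsync, a finished frame is held until the next refresh boundary, so a 60 fps cap on, say, a 144 Hz panel can't land on an even cadence.

```python
import math

refresh = 1000 / 144  # ms between refreshes on a 144 Hz panel
frame = 1000 / 60     # ms between frames with a 60 fps cap

last_shown = 0.0
for i in range(1, 7):
    ready = i * frame
    # With vsync, the frame waits for the next refresh boundary after it's ready.
    shown = math.ceil(ready / refresh) * refresh
    print(f"frame {i}: on screen for {shown - last_shown:5.2f} ms")
    last_shown = shown
```

Frames end up alternating between two and three refresh intervals (~13.9 ms vs ~20.8 ms), which shows up as judder; VRR sidesteps this by refreshing when the frame is actually ready.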
Variable refresh rate is given different names by different sources:
Adaptive sync is the name given by the relevant standards body for the implementation of VRR over the DisplayPort connection standard. It's also often used for VRR as a whole.
Nvidia has a proprietary version they call G-Sync.
AMD has an open version they call FreeSync.
That's now the dominant standard and Nvidia uses it too, but Nvidia refuses to use the name and instead calls it "G-Sync compatible", which is a straight-up lie because it's a different standard and not at all compatible with G-Sync. Nvidia created G-Sync to not be compatible, so they could keep it proprietary and thus increase their power in the industry if it became the dominant standard. Truth doesn't matter in business.
Intel calls it adaptive sync.
Here's a link to a handy summary guide to VRR:
https://www.howtogeek.com/793199/what-is-displayport-adaptive-sync/
Human vision is a mix of complex processes performing different tasks, and not all of them have the same "rate".