Then there is also the matter of preference. Someone who is really into graphics might prefer a higher fps. People who don't care as much generally don't have an issue with a lower fps.
Also, there are different results when it comes to "scientifically proven". Most experts say the human eye can see somewhere between 30 and 60 fps.
You can always use the UFO test to see the difference in fps and how your eyes perceive it.
https://m.youtube.com/watch?v=OX31kZbAXsA
It's roughly 14 frames per second at which most people stop distinguishing individual frames and instead perceive continuous motion. For most, that motion starts feeling more and more natural the higher you go, roughly until you hit 90~120Hz. Within that range it becomes a game of "red wine is red wine."
Yes; fighter pilots have supposedly been able to see things for as short as 1/250th of a second, i.e. were able to perceive 250Hz. Except that's a load of crock. There's a principle in human vision called Bloch's law, which states that for any flash of light shorter than about 100ms, human vision cannot accurately distinguish between a short, bright flash and a longer, dimmer one. What fighter pilots actually see is a very bright flash, like muzzle flash, which (if you look up the specs of the weaponry or record it with high-speed cameras) indeed lasts 1/250th of a second, but they perceive it as dimmer than it actually is and spread over a longer period of time. They experience a kind of temporal low-pass filter and fusion of information. They still saw a flash, but there is no way they accurately saw it for the 1/250th of a second it actually lasted.
Bloch's law is a hard, unsurpassable biological limit. So that's that myth busted right there.
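For anyone who wants the law itself: Bloch's law says that below a critical duration of roughly 100ms, perceived brightness tracks the product of intensity and duration rather than either one alone. A minimal sketch of that trade-off, with purely illustrative numbers:

```python
# Bloch's law (temporal summation): below the critical duration (~100 ms),
# a flash is judged by intensity * duration, not by duration alone.
CRITICAL_DURATION_MS = 100  # approximate; varies with viewing conditions

def perceived_energy(intensity, duration_ms):
    """Effective 'energy' of a flash shorter than the critical duration."""
    assert duration_ms <= CRITICAL_DURATION_MS
    return intensity * duration_ms

# A 4 ms muzzle flash (1/250 s) at an arbitrary intensity of 1000 is
# perceptually equivalent to a 100 ms flash at intensity 40:
print(perceived_energy(1000, 4))   # 4000
print(perceived_energy(40, 100))   # 4000
```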
Moving on: as it relates to video gaming, and in particular to action games and first-person shooters, higher frame rates only help in that motion feels more comfortable. They do not actually make you more accurate. That is hogwash as well.
So any claim to the contrary is wrong.
For purposes of target acquisition and motion tracking of an acquired target, we use a very specific part of the human brain which works steadily between 7 and 13 Hz. This activity is so steady that it actually shows up on EEG readings as a steady wave. At a maximum of 13 times per second, our brain samples whatever is there visually and extrapolates and predicts target movement based on the old and new state.
Anything above ~20Hz adds no fidelity for target acquisition and tracking.
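Taking those figures at face value, here is a quick back-of-the-envelope check of how many display frames fall inside each brain "sample" (the 7~13 Hz numbers are the ones claimed above, not something measured here):

```python
# How many display frames arrive within one 7-13 Hz "sample" of the
# brain's tracking circuitry, per the figures claimed above.
for brain_hz in (7, 13):
    sample_ms = 1000 / brain_hz
    for display_hz in (20, 30, 60, 120):
        frames_per_sample = sample_ms / (1000 / display_hz)
        print(f"{brain_hz:>2} Hz sample ({sample_ms:.0f} ms) spans "
              f"{frames_per_sample:.1f} frames at {display_hz} Hz")
# Even a 20 Hz display already delivers at least one fresh frame per sample.
```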
There have also been repeated experiments on the minimal motor response time to a given visual stimulus. For the average healthy person, that response time is 200~220ms.
And that is for subjects anticipating a known object appearing in a particular fixed position. Factor in our brain actually having to identify the thing flying in on screen and having to make a rational decision about how to handle it, and you're closer to 400~500ms.
At 30Hz you get a frame update once every 33.3ms.
At 120Hz you get a frame update once every 8.3ms.
So 30Hz lags behind 120Hz by 25ms, which is about one-eighth of the average human's absolute minimal reaction time, i.e. statistically irrelevant.
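The same arithmetic, spelled out as a quick sanity check (the 200ms reaction time is the figure cited above):

```python
# Frame-interval arithmetic from the post above.
REACTION_TIME_MS = 200  # average minimal reaction time cited above

def frame_interval_ms(hz):
    return 1000 / hz

delta = frame_interval_ms(30) - frame_interval_ms(120)
print(f"30 Hz:  {frame_interval_ms(30):5.1f} ms per frame")   # 33.3 ms
print(f"120 Hz: {frame_interval_ms(120):5.1f} ms per frame")  #  8.3 ms
print(f"gap: {delta:.0f} ms, about 1/{round(REACTION_TIME_MS / delta)} "
      f"of a {REACTION_TIME_MS} ms reaction time")             # 25 ms, 1/8
```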
Yes. CRT-era 30 frames per second was actually 60Hz with 60 fields per second: one field update for the even lines, the next for the odd lines, and so on. This, along with the way a CRT has a sort of 'afterglow', gave a much smoother and more blended impression to the eye. In contrast, LCD technology uses sample-and-hold, which makes judder (discrete frame steps) more noticeable.
Especially for those whose eyes are more capable of distinguishing high frame rates (i.e. who are closer to the 120Hz end than to the 90Hz end, where everything becomes a mash).
What's actually been proven is that the human eye can detect movement into thousands of FPS.
I'm old and no fighter pilot, and I can easily see the difference between 60 and 100, or 90 and 144.
Ya, the higher up you go the less perceptible it is, but to say no one can see/detect those changes is laughable.
I can def see graphical changes when some of these devs strive for higher framerates, and it seems to dumb down the graphics.
I have Ace Combat 6 and tried the new Ace Combat, and the older one looks better than the newer, because I think the devs have to dumb down lighting and atmospheric effects to hit the desired framerate.
I'd much rather have cinematic-like graphics than dumbed-down ones at a higher frame rate.
It is mostly a "you-thing".
You can perceive one frame rate is higher than the other, yes. But experiments have shown that people are very bad at actually quantifying the difference, especially if it is minor.
That's likely because quantifying the difference relies to a degree on being able to discern individual events, and below 100ms everything gets chucked through a kind of temporal low-pass filter where it gets fused and muddied together.
This temporal fusion has even been tested with experiments that would show people a red and a green disc in quick succession. Above certain frame rates, people would stop seeing two individual differently colored discs and would actually see one yellow disc.
If one shows an offset vernier alternated with its anti-vernier (mirrored orientation) in rapid succession, people perceive a single straight line.
These experiments were done as early as the late 60s and were redone across the 70s and 80s, each time leading to the same type of result: our brain's image perception has a temporal low-pass filter which just blends things together.
That's the gist of it really.
The "bigger number, so must be better" mentality of the modern gamer-consumer plant is what drives these frame rate 'innovations' -- not the actual improvement they bring.
Actually true. The biology of the photo-receptors in the human eye guarantees that we're only receiving new input at roughly 60 Hz maximum. (Can be lower if the eyes are tired.)
The bulk of what we see is what the brain makes of things. It can interpolate and extrapolate additional data which it thinks would have to have been there. (This plays a part in a magician's sleight of hand as well, wouldn't you know...)
Actually, we'd probably not be far off if we'd consider the brain to run its own version of DLSS 3.0 ...
The tests ran well over 100 FPS back then, very expensive too.
So I think the whole "eyes can't tell the difference" between Hz/FPS levels is a fallacy. Even I can see the difference in motion and tracking when I go to 240 FPS vs 120 when gaming. I just lock it to 120 FPS in RTSS to reduce heat on the GPU, so there's less fan noise.
However, the returns diminish quickly, and in some cases higher frame rates can even have negative effects, since they remove or reduce some of the artifacting we're used to.
The jump from 15 to 30 is immediately noticeable. From 30 to 60 is less noticeable, and the jump from 60 to 120 is even less noticeable. Anything above that likely won't register at all.
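One way to see why the returns diminish: each doubling halves the frame time, but the absolute time saved per frame shrinks every step. A quick sketch of that arithmetic:

```python
# Milliseconds of frame time saved at each doubling of the frame rate.
steps = [15, 30, 60, 120, 240]
for lo, hi in zip(steps, steps[1:]):
    saved = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} fps: {saved:4.1f} ms less per frame")
# 15 -> 30: 33.3 ms, 30 -> 60: 16.7 ms, 60 -> 120: 8.3 ms, 120 -> 240: 4.2 ms
```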
It also depends on how the animations are done. If the game is designed around, say, 30fps, you're likely not going to notice any difference between it and a game running at 60fps.
So with 60 fps everything is pretty slow and any extra speed is useful, especially if you play a shooter. In a shooter, depending on the game, a moving enemy can be up to 20 head sizes away from where you actually click the head when it's moving from left to right across your screen. That's why professional gamers stuck to CRT screens until faster flat screens came out.
As a former CRT user, I immediately noticed the difference on any flatscreen PC, to the point that it seems to me like 3D games only start turning around once I'm already done making my mouse movement. (Not to mention the color difference: I used several blue tones for websites, but they all looked the same on a flat screen ;) )
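For a feel of where a number like "20 head sizes" could come from, here is a rough, hypothetical calculation; the target speed, head size and panel lag figures below are assumptions for illustration only, not measurements from any particular game:

```python
# Hypothetical numbers, only to show the shape of the calculation:
# how far a strafing target travels between a frame being shown and
# your click landing on the old position.
TARGET_SPEED_PX_PER_S = 1500   # assumed strafing speed across a 1080p screen
HEAD_SIZE_PX = 20              # assumed on-screen head size
REACTION_MS = 220              # average reaction time cited earlier in the thread

def offset_in_head_sizes(display_hz, extra_display_lag_ms=0):
    frame_ms = 1000 / display_hz
    total_ms = frame_ms + extra_display_lag_ms + REACTION_MS
    return TARGET_SPEED_PX_PER_S * total_ms / 1000 / HEAD_SIZE_PX

print(f"60 Hz, no extra lag:    {offset_in_head_sizes(60):.1f} head sizes")
print(f"60 Hz, 30 ms panel lag: {offset_in_head_sizes(60, 30):.1f} head sizes")
print(f"240 Hz, no extra lag:   {offset_in_head_sizes(240):.1f} head sizes")
```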
I'd say from 30 to 60 is still a big difference. When I recently played Wo Long: Fallen Dynasty, which has performance issues, it dropped to 30 FPS. Boy, was that a night and day difference.
Even going past 60 FPS is a noticeable difference. But after 120 it might be harder to detect.