A movie can stay at the same frame rate continuously.
PC and console games, however, don't: depending on processing and graphics card performance, the frame rate rises and drops.
There is a limit, though, below which the human eye notices the motion isn't right. At lower FPS, especially when the image edges are sharper / higher quality, the eye detects the jumping.
It's also like wearing glasses. If the human eye has something to compare against, it will be much more noticeable. Going back looks worse, but sticking with the same thing, the eyes adapt and accept it as normal.
But a low FPS such as 24 FPS is chosen to put stress on the human brain. It can detect that things are missing, and therefore the brain actually fills in the missing data with its own imagination. This is why it's called the movie magic FPS, and some PC games even try to do it, annoyingly, with their in-game videos.
On the other hand, the higher the FPS, the less this jumping is noticed and the more the eye relaxes and accepts it. This is why when a user sees a 120Hz/144Hz monitor running with a high-end graphics card (high FPS), they swear by it and won't go back. 60Hz looks like rubbish to them (due to the comparison).
Therefore:
Movies can get away with 24 FPS
Consoles on TVs try to get away with a locked 30 FPS
PCs have 3x the monitor quality and aim for 60 FPS (varies) or higher
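To put rough numbers on that, here's a quick back-of-the-envelope sketch (plain arithmetic, not taken from any particular game or engine): each frame stays on screen for 1000 / FPS milliseconds, so the gap the eye has to bridge shrinks as FPS goes up.

# Frame-time arithmetic: how long each frame stays on screen.
# 1000 ms divided by the FPS gives the per-frame gap the eye has to bridge.
for fps in (24, 30, 60, 120, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")

# Output:
#  24 FPS ->  41.7 ms per frame   (movies, hidden by motion blur)
#  30 FPS ->  33.3 ms per frame   (typical console lock)
#  60 FPS ->  16.7 ms per frame   (common PC target)
# 120 FPS ->   8.3 ms per frame
# 144 FPS ->   6.9 ms per frame   (high-refresh monitors)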
In a way, yes... It's like your vision getting bad: the human brain doesn't really notice until you get tested and given glasses or something that makes it better. Suddenly it has something to compare against. Remove those glasses and the human brain will make the original vision appear much more blurry/worse for a while, until it readjusts or you put the glasses back on. If you get a 144Hz monitor and compare it side-by-side against a 60Hz monitor, you will clearly notice the difference, compared to just working with 60Hz all your life.
There are, however, various ways to cheat the FPS.
The human eye will detect the flipping of frames, especially at lower FPS, but learns to ignore it. It causes strain, which is why Thomas Edison said that 46 frames per second was the minimum needed by the visual cortex: "Anything less will strain the eye." When looking at a lit display, people begin to notice a brief interruption of darkness if it lasts about 16 milliseconds or longer. Monitors and TVs these days have backlighting to help prevent this; a PC LCD monitor might have an LED backlight, for example, making it less straining on the eyes.
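As a rough sanity check on those figures (just arithmetic on the numbers quoted above, not a vision-science model):

dark_gap_ms = 16   # interruption of darkness people start to notice
edison_fps = 46    # Edison's claimed minimum for the visual cortex

print(f"{1000 / dark_gap_ms:.1f} Hz")  # 62.5 Hz, close to the common 60 Hz refresh
print(f"{1000 / edison_fps:.1f} ms")   # 21.7 ms per frame at 46 FPS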
A sharper, cleaner-edged image requires more FPS to appear smoothly animated; otherwise it really annoys the eye (PC monitors have higher-quality graphics, anti-aliasing, etc., and therefore need higher FPS). A blurry-edged image, however, can use much lower FPS and still appear smooth. An object moving across the screen is probably where it's most noticeable: you will see the jumping at lower FPS. TVs (game consoles) use motion blur to hide this fact.
24 vs 60 FPS:
http://i.imgur.com/xnGbDts.gif
Usually. I think the differences are smaller now that consoles and televisions are progressing and becoming basically PCs and monitors.