I guess you embraced that "motion flo" video interpolation garbage on your tv too;)
Not understanding why things are the way they are just leads to false conclusions about why things are supposedly "better" when they actually just feel "gross" to many people; something genuinely better shouldn't be off-putting like that. Film isn't about reproducing reality, which is why colors are deliberately skewed in film using color grading.
https://www.youtube.com/watch?v=pla_pd1uatg
http://www.digitalcinemafoundry.com/2010/04/02/why-the-so-called-blockbuster-look-color-grading-explained/
It's not about a slavish attempt to reproduce reality but an artistic creation which feels real, even when it's clearly not.
When you watched the 60fps version, you could clearly see the objects going by even though the camera was moving at high speed. It feels "hyper-real" in a strange way, since in reality, when objects pass by that quickly they blur, because our eyes receive light at a certain fps or something like that (I forget the science behind it so don't quote me, just look into it yourself).
Anyway, the idea is that when they record these movies, they record them in a way that objects moving that quickly get blurred in a natural way, and the 24fps lock is part of that (or the cause of it; I forget if there are other elements involved in producing this effect... I should probably read up on that myself. Hell, I should probably be looking this ♥♥♥♥ up and posting it at the same time so there's a "mutual learning experience", but I'm also a lazy SOB.)
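The "natural blur" the post is gesturing at is usually described as the 180-degree shutter rule: each frame is exposed for half the frame interval, so faster-moving objects smear more within a frame. Here's a small sketch of the arithmetic (my own illustration, not something from the thread; the function names and the 2000 px/s speed are made up for the example):

```python
# Sketch of the 180-degree shutter rule (hypothetical illustration).
# A 180-degree shutter exposes each frame for half the frame interval,
# which is what produces film's characteristic motion blur at 24fps.

def exposure_time(fps, shutter_angle=180.0):
    """Seconds each frame is exposed for a given shutter angle."""
    return (shutter_angle / 360.0) / fps

def blur_length(speed_px_per_s, fps, shutter_angle=180.0):
    """How far (in pixels) a moving object smears within one frame."""
    return speed_px_per_s * exposure_time(fps, shutter_angle)

# Classic film: 24fps with a 180-degree shutter -> 1/48 s exposure per frame.
print(round(exposure_time(24), 5))      # 0.02083

# An object crossing the screen at 2000 px/s smears ~41.7 px per frame
# at 24fps but only ~16.7 px at 60fps, which is one reason 60fps footage
# can read as sharper and "hyper-real".
print(round(blur_length(2000, 24), 1))  # 41.7
print(round(blur_length(2000, 60), 1))  # 16.7
```

The point of the sketch: the per-frame blur shrinks as the frame rate rises, so higher-fps capture genuinely carries less motion blur per frame, not just more frames.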
Skimming through the other posts I know this one is pretty blah and not quite as detailed/pro, but I just wanted to share something I ran into online that is very much related and may clear the air a bit more about your question. I am in no way a "film expert" of any kind, nor am I one of the masses of people trying to become youtube-gamer-famous, so...yeah. Just going by something I read/looked up a while ago while aimlessly learning things on the internet.
I never said I preferred the higher frame rate because it is more "realistic", and I'm fully aware of the tension between reproduction and representation in photography and cinema:
http://www.jstor.org/discover/10.2307/1343119?uid=3738032&uid=2&uid=4&sid=21104336399687
I prefer the faster frame rate because fast-moving scenes at 24fps are notably choppy for me, languid-looking, and the flicker really takes me out of the immersion.
And if you really don't think there is a difference...
http://boallen.com/fps-compare.html
You can see that 30 fps is OK, but when you scroll down and compare it to 60 fps, there is an obvious difference.
This is also pretty cool: http://frames-per-second.appspot.com/
this is why i abandoned this thread :P because people can't read.
i know the difference between fps, i wasn't asking for that. i was asking about the fps in FILMS AND TV
Lower than 24FPS appears laggy because of the visible swap between frames, though it depends on each image's sharpness. If the image edges were blurred a bit, it wouldn't matter. With sharper image quality, though, you can get a flickering effect: the eye can then tell it's not real movement and gets annoyed with it.
30FPS appears fine to the human eye, but only if it stays continuous.
60FPS in games might vary a little due to the PC and graphics card processing, but the higher the FPS, the less the human eye detects major frame rate changes.
Say it's fluctuating between 24-30FPS: laggy to the eye.
However, 48-60FPS appears less laggy, though still detectable. The younger the human eye, the more it tends to notice and be annoyed by it.
89-120FPS goes unnoticed by the human eye, even with a larger range of frame change.
So the higher the rate, the more it's allowed to change and still appear smooth.
How many frames per second can the human eye see?
It's a trick question, and there's a lot of confusion around it. Remember that the eye doesn't work in FPS.
It's more a question of "How many frames per second do I need to make motion look fluid?", "How many frames per second make the movie stop flickering?", and "What is the shortest frame a human eye would notice?"
Blurring simulates some fluidity; sharpness accentuates stuttering. A TV that blurs therefore needs fewer frames than a razor-sharp PC monitor, so the situation actually changes depending on the device.
Then you have monitor/screen refresh rates: 60Hz is ideal for 60FPS. At 30FPS, it will be showing the same image twice, which is still fine, so long as it stays at that rate all the time. However, that is most likely going to change, such as during fast-paced action, and then the eye gets annoyed and spots the difference.
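The frame-doubling described above can be sketched numerically (my own illustration, not from the thread): mapping each 60Hz refresh to the source frame it displays shows that 30fps content repeats every frame exactly twice, while 24fps content falls into the uneven 3:2 cadence that causes pulldown judder.

```python
# Sketch of how source frames map onto display refreshes (hypothetical).
# Each 60Hz refresh shows whichever source frame is current at that instant.

from collections import Counter

def repeat_pattern(source_fps, refresh_hz, refreshes=20):
    """Source frame shown at each refresh, plus how often each frame repeats."""
    shown = [int(t * source_fps / refresh_hz) for t in range(refreshes)]
    counts = Counter(shown)
    return shown, [counts[f] for f in sorted(counts)]

_, repeats_30 = repeat_pattern(30, 60)
_, repeats_24 = repeat_pattern(24, 60)
print(repeats_30)  # every 30fps frame held for exactly 2 refreshes
print(repeats_24)  # 24fps frames alternate 3 and 2 refreshes: the 3:2 cadence
```

The even 2-2-2 pattern at 30fps is why it can look acceptable on a 60Hz display, while the alternating 3-2 hold times of 24fps content are what some eyes pick up as judder.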
The human eye can detect a light change within 1/200th of a second or less, so your subconscious is still aware of the blackness/flicker between the frames but disregards it. To not see the blackness between the frames, at least about 70-100 fps is required. This is why most people who move to 120Hz monitors with a high-end graphics card producing 120FPS find it so much better subconsciously; when they look back at lower rates, everything appears wrong and slow to them. But the other guy who hasn't seen 120Hz/120FPS doesn't mind it at all.
For example: if you need glasses but have never worn them, you will be unaware of and completely at ease with your current vision, not knowing how bad your eyesight is until you wear those glasses. Then when you remove the glasses again, your vision appears a lot blurrier. It's the same type of effect: the human eye adapts to what it's most used to.
Confused yet?