This is what I was going to say.
It could also be more footage (I think), because with 60 FPS you get 30 more frames every second than you do with 30 FPS. (I may or may not be talking out of my own ass)
Question: when does 30 work better?
They use it as an excuse for the weak-as-♥♥♥♥ console hardware that they'd love to say is weak-as-♥♥♥♥ but they can't because they'd stop getting all the $$$ from Sony/Microsoft.
A 1 litre container holds 1 litre of content and not more, right?
So how is it that footage of the same thing, starting at the same time and with the same duration, ends up with additional content?
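The arithmetic behind the disagreement can be sketched in a few lines (the 10-second clip length is just an assumed value for illustration): the duration of the footage is the same either way, 60 FPS just samples that same time span with twice as many frames.

```python
# Hypothetical sketch: same clip duration, different frame rates.
# The "content" (the recorded time span) is identical; only the number
# of samples of that span changes with the frame rate.

def total_frames(fps: int, duration_s: float) -> int:
    """Number of frames in a clip of the given length."""
    return int(fps * duration_s)

clip_seconds = 10.0  # assumed clip length, for illustration only

print(total_frames(30, clip_seconds))  # -> 300 frames
print(total_frames(60, clip_seconds))  # -> 600 frames, same 10 seconds
```

So a 60 FPS clip is not "more footage" in the sense of extra duration; it is the same duration described by more frames.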
For me yes, but the threshold of how noticeable it is differs from person to person. And even if you don't notice it, the game responds better when played at 60 FPS.
Would be better in 60 FPS, I can guarantee it.
More responsive, smoother gameplay, more immersive experience.
Below 30 FPS - the difference in frame rates is huge.
30 FPS to 60 FPS - the difference is still quite large, especially for younger viewers, and especially in image sharpness when the camera rotates.
60 FPS to 120 FPS - the difference is smaller, but still noticeable.
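The diminishing returns described in the steps above follow directly from per-frame time: each doubling of the frame rate halves the frame time, so the absolute milliseconds saved shrink at every step. A minimal sketch:

```python
# Why each doubling of frame rate feels like a smaller jump: the
# absolute reduction in time-per-frame shrinks with every step.

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for low, high in [(15, 30), (30, 60), (60, 120)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} FPS saves {saved:.1f} ms per frame")
# 15 -> 30 saves 33.3 ms, 30 -> 60 saves 16.7 ms, 60 -> 120 saves 8.3 ms
```

This is one plausible reading of why 30 to 60 is far more visible than 60 to 120, even though both are doublings.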
At this point, people using 120/144 Hz monitors have had their eyes opened: they didn't realise how bad lower frame rates were until they had something to compare against. Smooth vs. juddery. It's hard for them to go back and accept lower rates without getting eye strain.
It's like growing up: over time your eyes grow used to what they see and accept that quality. However, when you put on glasses, everything suddenly becomes clearer. Then you remove those glasses and everything appears blurrier than before, because the eyes now have something to compare against.
Eyes don't see FPS; they adjust to what they consider the best. Persistence of vision notices the flicker between each frame, even for that split second. The brain then calculates it out of the equation or gets annoyed by it. The greater flickering in close-up viewing is due to more of the screen being in the viewer's peripheral vision, which is more sensitive to flickering. LCD/LED monitors have reduced this a lot with a backlight, but it's still there and noticeable to many.
I'm tired of answering you because you have absolutely no grasp of logic or how video recording works. In other words, you can't be saved. Have someone else explain the obvious logic to you.
Well, that is also what I am asking for: for someone to actually explain the obvious instead of repeating the same thing about how I am just wrong. It is easy to simply say "Hurr I am right, you are wrong." But there is a reason that should follow, the "why".
I already told you why: it's because the gameplay footage is recorded twice at different times.
It's probably because I am
The reason why is most likely that it's not the same video, but a recording of the same action done again with a different game setting (30 FPS increased to 60 FPS). I've counted the frames between them and everything, and gave you all the stats from my VLC video player. I'll agree they are off by a bit, but that shouldn't affect the end result too much.
There we go. So, finally you agree on that.
Now, while the change in FPS may not be that clear to my eyes, such differences are. You claim it is negligible, but the resulting change in the "end result" is very much noticeable to me.