This topic has been locked
Help me understand the dumb "human eyes can only see..." claim
What's the point of this debate? We're treating the eye as if it were a camera, when it's so much more complex than that. Saying the human eye can only see 24 FPS, and that therefore 120Hz is useless, is the most idiotic statement. Truthfully? It's ♥♥♥♥♥♥♥♥ to even say humans can see FPS. We do not perceive things in FPS. It goes much deeper than that, right?
DVD, for example, runs at maybe 24 Hz. There is no motion blur effect... there is MPEG-2. The thing with MPEG-2 is that it works on a digitized TV signal: only the things that change in the picture are encoded, so whatever doesn't change is carried over. That's digital stuff for getting fewer megabytes per second on the DVD: longer movies on one disc.
It is false; there is no evidence that the human eye can only see a limited FPS.
MPEG-2 uses lots of tricks to save space in the files; it has nothing to do with motion blur.
There are still key frames with full data, and the decoder is told to copy/move sections to new locations to create the next frames. No blur is involved unless it's already baked into the key frame and copied along.
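To picture that key-frame idea in the abstract, here is a toy sketch of the general "keep a full frame, then apply only the changed pixels" approach (this is not actual MPEG-2, just an illustration of inter-frame deltas; the frame contents are made up):

#include <array>
#include <cstdio>

// Toy "inter-frame" reconstruction: a key frame holds full pixel data,
// and the next frame is described only by the pixels that changed.
struct PixelUpdate {
    int index;   // position in the new frame
    int value;   // new pixel value for that position
};

int main() {
    std::array<int, 8> keyFrame = {10, 10, 10, 50, 50, 10, 10, 10};

    // "Delta" for the next frame: the bright pixels shifted one step right,
    // so only two positions actually changed.
    PixelUpdate updates[] = {{3, 10}, {5, 50}};

    std::array<int, 8> nextFrame = keyFrame;            // start from the key frame
    for (const auto& u : updates) nextFrame[u.index] = u.value;  // apply only the changes

    for (int px : nextFrame) printf("%d ", px);          // prints: 10 10 10 10 50 50 10 10
    printf("\n");
    return 0;
}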
Originally posted by _I_:
Originally posted by Talby:
I think we can safely say, 24 FPS would be the minimum rate where the eye sees motion as fluent, and anything lower would be detected more easily.
Even that's a stretch.
With motion blur effects 24 FPS may look smooth.
Blink an LED at 24 Hz and you can see it's blinking.
exactly what I was saying, sort of

https://www.youtube.com/watch?v=aU7RTluN69Q

It's not the same watching a video of it compared to seeing it live, and poor wording on my part - maybe "anything lower would be much more obvious and painful" lol
lol this was on the playlist

(**loud annoying music warning**)
https://www.youtube.com/watch?v=Fkzl_FwUtds

He made the tutorial way too long; all it needs is the code.
You can see the 'L' LED blinking (it's already tied to D13, with an onboard resistor; you could also use LED_BUILTIN if the Arduino's LED is wired differently).
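Something like this minimal sketch would do it (assuming a standard board where the onboard 'L' LED is LED_BUILTIN / pin 13; the delay is just 1000 ms / 24 Hz / 2, rounded):

// Blink the onboard LED at roughly 24 Hz: one full on/off cycle is ~41.7 ms,
// so hold each state for about 21 ms.
const unsigned long HALF_PERIOD_MS = 21;  // 1000 ms / 24 / 2 ≈ 20.8 ms

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);   // the 'L' LED, tied to pin 13 with an onboard resistor
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(HALF_PERIOD_MS);
  digitalWrite(LED_BUILTIN, LOW);
  delay(HALF_PERIOD_MS);
}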
/facepalm
Last edited by _I_; 26 Feb 2020, 8:03
Originally posted by perfect15:
DVD, for example, runs at maybe 24 Hz. There is no motion blur effect...
Yes there is, because the blur is in the actual footage. Any boy above a certain age has discovered that, ever since we've been able to pause Betamax/VHS/DVD/TV/Blu-ray.
Originally posted by Washell:
Yes there is, because the blur is in the actual footage. Any boy above a certain age has discovered that, ever since we've been able to pause Betamax/VHS/DVD/TV/Blu-ray.

Not to mention Hz != FPS. The TV will still be running at its Hz regardless of how many FPS the media is rendering. That 24 FPS DVD is still being displayed across 60 screen refreshes a second on a 60 Hz TV.

There are also differences in how things are displayed across screen types. A 60 Hz CRT provides a different experience than a 60 Hz LCD: on one you can perceive flicker, while the other has none.

The Hz value doesn't imply visual consistency across all possible variations of all display types.
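As a toy illustration of that 24-on-60 point (just naive integer math over one second of refreshes, using the 24 FPS / 60 Hz numbers from this post, not any specific TV's pulldown logic):

#include <cstdio>

int main() {
    const int sourceFps = 24;   // DVD film rate
    const int refreshHz = 60;   // TV refresh rate

    // For each refresh in one second, figure out which source frame is on screen,
    // then count how many refreshes each source frame was held for.
    int held[24] = {0};
    for (int refresh = 0; refresh < refreshHz; ++refresh) {
        int frame = refresh * sourceFps / refreshHz;  // integer division
        ++held[frame];
    }
    for (int frame = 0; frame < sourceFps; ++frame)
        printf("frame %2d shown for %d refreshes\n", frame, held[frame]);
    // Prints an alternating 3,2,3,2,... pattern: each film frame sits on screen
    // for two or three 60 Hz refreshes, which is exactly why Hz != FPS.
    return 0;
}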
My experience is that you can see the difference a lot better when you go to a higher FPS or resolution and then go back to what you were used to. You can tell it is smoother or sharper, but it's a lot easier to see how much difference there is if you look at the lower FPS or resolution after you've gotten used to the new one.

I played on console all my life, so we mostly jumped in resolution with each generation. When I came to PC I could see how insanely better 4K graphics are than 1080p or 1440p. I can also see the difference between 30 and 60 FPS, much more so when I play my PS4 or PS4 Pro and then jump on my PC. I don't personally care for anything more than 4K@60 FPS; resolution is much more important to me than having super high FPS. All I need is 60.

Since, on console, FPS wasn't improving for me as much as resolution was with each new generation, that's what I got used to, and 60 makes things smooth enough for me. I can still see the soap opera effect when playing games at 4K@60 FPS, so more than that is a waste of money to me. It's a personal preference thing: which one you find more beneficial to your gaming experience.

I see FPS and resolution like this: playing at lower resolutions and higher FPS would be like running 480p at 120-240 FPS as compared to 1080p@60 FPS. I would rather have 4K@60 than 1080p@120-240 FPS.
Originally posted by Red™:
It goes much deeper than that, right?
Yep. Eyes are analog devices, not digital; they can't see in FPS. The correct question is something like "how many FPS can the human eyes and brain make sense of?", but the answer depends on a lot of things. The eye can easily perceive something appearing for 1 ms, but most of what we see gets filtered out by the brain as unimportant. If one's life depends on seeing lots of small details in a very short period of time, the brain works dozens of times harder than usual and does what is not possible under ordinary circumstances. One can also be trained to see more than others.
Originally posted by Brockenstein:
...visual consistency across all possible variations of all display types.
Exactly my point with the LED clock: when it is starting up / going slow, the suspended LED image is not as "fluent", i.e. you can see the sweep before the image is completed.

As it is going faster, it appears to be a solid suspended image.

Is it 24 FPS? ¯\_(ツ)_/¯
Originally posted by perfect15:
Originally posted by Aliquis Freedom & Ethnopluralism:
Human eyes can definitely see more than just 24 or 60 FPS.

The reason for 24 FPS originally was that 48 FPS was decided to be enough for film, but nobody wanted to use that much film stock, so it became 24 FPS. Supposedly. The 50 and 60 FPS for TV come from the AC electricity running at 50 and 60 Hz and being used as a clock.
Film has motion blur. The shutter isn't open for an infinitely short time; it's open for a while, and if things move they become blurry, which helps connect the images.

IMHO motion blur in games just seems unclear and laggy.
I would say it is a kind of codec thing... MPEG-2 is for movies, and PC gaming is kind of/sort of rendering with polygons... totally different technique, IMO.
MPEG-2 is a garbage video codec, and you don't have to render polygons in games; you could just place pixels/sprites, or even draw vertices if some display drew such things. Pretty irrelevant.

With film you have to expose the film to light to actually get a result. That means that if there is any motion in the scene or subjects you are filming, or of the camera or the film itself, while the film is exposed to light, it will get blurry. They may film digitally nowadays, at least for TV, but you still let light hit the sensor for some amount of time, resulting in blur.

On a computer, if you just draw a scene without any blur effect it will of course be perfectly sharp, and then the transitions between images may have less of a connected feel.

If your comment was regarding motion blur, then yes, lossy video compression formats have a harder time delivering a sharp, clean image when there are lots of changes in the scene, so it will look blurrier / more like ♥♥♥♥ in such situations than when things are still. The polygon thing still doesn't make any sense, unless your point is that simply drawing the polygons will result in a sharp image, whereas with added motion blur it will be less sharp.
Last edited by Aliquis Freedom & Ethnopluralism; 26 Feb 2020, 13:28
I have a 240 Hz 1080p monitor, and many times people ask me why, saying I could be running 1440p, it looks much better, and you can't see more than 60 FPS anyway. I tell them I have tried both, and yes, 1440p looks sharper, but there is more blur when you suddenly do a 180 spin because you hear a noise behind you, and thus less image clarity. It's really just personal: I prefer more FPS over sharpness, as to me the game feels smoother and my eyes hurt less. If I try to play at 60 FPS I get headaches really quickly; at anything over 120 FPS, no headache at all, and the higher I go in FPS the longer I can play with less fatigue. So in the end it's all personal to each individual. Don't get me wrong, if I could play at 1440p at 240 FPS I would, lol.
I've mentioned this before in another similar topic, but will repeat myself...

The human eye doesn't see in FPS (Frames Per Second). That is entirely a myth.

However, at the same time, a healthy young human eye can perceive and detect drops below 48 FPS, and can even notice changes up to 120 FPS and beyond. Why?

Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.

It's the flickering effect that annoys the human eye as one frame flips to the next. Mostly it's ignored by the human brain; cats and dogs, for example, would notice it more. Depending on how smooth the edges of the animation are, the brain will still register the previous few frames along with the one it currently sees, calculating differences and ignoring slight variations. This is why monitors now all come with backlights, which greatly reduce this flickering effect.

You'll find that movies and console games can run at a lower 24 FPS and get away with it unnoticed, because of the viewing distance and edge blur. A PC, however, has much higher image quality and is viewed at closer range, so the brain can pick out the edge changes a lot more. It entirely depends on what animation you are viewing and what device you're viewing it on. For a standard PC, it's ideal to stay above 48 FPS at all times, so younger eyes aren't so distracted by the changes.

FPS also varies over time, so 30 FPS won't be constant; rather it rises and falls (for example, between 24 and 48 FPS). It's those changes that are even more distracting at lower FPS levels. When you get up to 120 FPS+, they become much less noticeable.

Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal... until you see better with glasses; then, when you remove the glasses, your vision suddenly appears a lot more blurry. The same applies to monitors. People running at 60Hz will be happy until they see a 120Hz/144Hz monitor to compare against. The brain will then register the 60Hz as lower quality than what it first judged it to be.

---

Let's say you have a 1920x1080 monitor. Animate an object crossing 1080 pixels of the screen (say, from top to bottom) at a speed of 200,000 pixels per second.

1080 px / 200,000 px/s = 5.4 ms on screen
1000 ms / 5.4 ms ≈ 185 FPS

You need roughly 185 FPS to guarantee it lands on at least one displayed frame.

However, if you only had a 60Hz monitor (supporting up to 60 FPS), then about two-thirds of the time you won't even notice that object flash across the screen. If your monitor could fully display those ~185 FPS, then 100% of the time your eyes will pick up on that object. The same goes if you doubled or tripled that speed: the faster the object, the more FPS is required, but your eye still detects it, because it doesn't care in the slightest about FPS; you simply require more frames, depending on the animation speed, for the motion to be fully captured and processed.
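A quick sanity check of that arithmetic (a throwaway sketch using the example numbers from this post; the 1080 px crossing, 200,000 px/s speed and 60 Hz refresh are just the values above):

#include <cstdio>

int main() {
    const double distance_px     = 1080.0;     // pixels the object crosses
    const double speed_px_per_s  = 200000.0;   // object speed
    const double refresh_hz      = 60.0;       // monitor refresh rate

    const double on_screen_ms = distance_px / speed_px_per_s * 1000.0;  // ~5.4 ms
    const double min_fps      = 1000.0 / on_screen_ms;                  // ~185 FPS
    const double frame_ms     = 1000.0 / refresh_hz;                    // ~16.7 ms per frame
    // Chance the object never lands on a displayed frame at this refresh rate
    // (only meaningful while it is on screen for less than one frame interval).
    const double miss_chance  = 1.0 - (on_screen_ms / frame_ms);        // ~0.68

    printf("On screen for:               %.1f ms\n", on_screen_ms);
    printf("Min FPS to always catch it:  %.0f\n", min_fps);
    printf("Miss chance at %.0f Hz:       %.0f%%\n", refresh_hz, miss_chance * 100.0);
    return 0;
}

With these inputs it prints roughly a 68% miss chance at 60 Hz, in the same ballpark as the two-thirds figure above.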

Leaving lots of frames out, on the other hand, leads to what is known by some as "cinematic" FPS. This is where the brain realizes there are missing details and fills them in with imagination. It forces your own brain to make it up or just completely ignore and discard it, which some consider to feel more realistic, as the brain can conjure images much better than a screen ever could. This is why people suggest the low-ball 24/30 FPS locks. Personally, my brain just gets ♥♥♥♥♥♥ off and annoyed by noticing it too much.

Hence why younger gamers with healthy eyesight want higher refresh rates.
Last edited by Azza ☠; 26 Feb 2020, 14:39
Originally posted by Azza ☠:
Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal...

How does the eye consider what it sees as normal? You mean the brain?
Originally posted by emoticorpse:
How does the eye consider what it sees as normal? You mean the brain?

Yeah, well, the eye-to-brain signal.

If each eye is given a slightly different image, the brain can assume it's seeing 3D as well. The brain works it out from the limited data it's provided by each eye.
Last edited by Azza ☠; 26 Feb 2020, 14:44

Posted: 25 Feb 2020, 15:24
Comments: 63