there are still keyframes with full data, and it's told to copy/move sections to new locations to create the next frames; no blur is involved unless it's built into the keyframe and copied
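If it helps, here's a toy sketch of that idea (plain C++; the names and the tiny 1-D "frames" are made up purely for illustration): a keyframe carries full data, and the following frame is just copy instructions against it, with no blur added by the codec itself.

```cpp
// Toy illustration: a keyframe stores full pixel data; the next frame is mostly
// "copy this block from the previous frame to here" instructions (motion
// compensation). Pixels not covered by a copy would be coded fresh; '.' stands
// in for that here.
#include <cstdio>
#include <vector>

struct BlockCopy { int src, dst, len; };   // copy `len` pixels from src to dst

std::vector<char> nextFrame(const std::vector<char>& prev,
                            const std::vector<BlockCopy>& ops) {
    std::vector<char> out(prev.size(), '.');      // uncovered areas: freshly coded data
    for (const auto& op : ops)
        for (int i = 0; i < op.len; ++i)
            out[op.dst + i] = prev[op.src + i];   // reuse pixels from the keyframe
    return out;
}

int main() {
    std::vector<char> key = {'.', '#', '#', '.', '.', '.', '.', '.'};  // keyframe
    std::vector<BlockCopy> ops = { {1, 4, 2} };   // "object" moved 3 pixels right
    std::vector<char> frame2 = nextFrame(key, ops);

    auto print = [](const std::vector<char>& f) {
        for (char c : f) std::putchar(c);
        std::putchar('\n');
    };
    print(key);      // .##.....
    print(frame2);   // ....##..
    return 0;
}
```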
https://www.youtube.com/watch?v=aU7RTluN69Q
not the same watching a video of it compared to seeing it live, and that was poor wording on my part - maybe "anything lower would be much more obvious and painful" lol
(**loud annoying music warning**)
https://www.youtube.com/watch?v=Fkzl_FwUtds
he made the tutorial way too long; all it needs is the code
you can see the 'L' LED is blinking (it's already tied to D13, with an onboard resistor; you could also use LED_BUILTIN if the Arduino's LED is wired differently)
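For reference, the whole thing really is just a few lines. A minimal blink sketch, assuming the stock board where the 'L' LED sits on pin 13 / LED_BUILTIN (the half-second delay is arbitrary):

```cpp
// Minimal Arduino blink sketch for the onboard 'L' LED.
// Assumes a board where that LED is wired to pin 13 (LED_BUILTIN covers it).

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);     // drive the pin tied to the onboard LED
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);  // LED on
  delay(500);                       // wait half a second
  digitalWrite(LED_BUILTIN, LOW);   // LED off
  delay(500);
}
```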
/facepalm
Not to mention Hz != FPS. The TV will still be running at its Hz regardless of how many FPS the media is rendered at. That 24 FPS DVD is still being displayed across 60 screen refreshes a second on a 60 Hz TV.
There are also differences in how things are displayed across screen types. A 60 Hz CRT provides a different experience than a 60 Hz LCD: on one you can perceive flicker, on the other there is none.
The Hz value doesn't imply visual consistency across all possible variations of all display types.
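To make the 24-on-60 point concrete, here's a tiny sketch (plain C++; the numbers are just the ones from the DVD example, ignoring the 23.976/59.94 detail) of how a 24 fps source gets spread across 60 refreshes, the classic 3:2 pulldown:

```cpp
// A 24 fps source still occupies every refresh of a 60 Hz display; each film
// frame just gets held for 2 or 3 refreshes (3:2 pulldown). Illustration only.
#include <cstdio>

int main() {
    const int refreshesPerSecond = 60;   // 60 Hz display
    const int sourceFps          = 24;   // film / DVD frame rate

    for (int r = 0; r < refreshesPerSecond; ++r) {
        // which source frame covers this refresh: floor(r * 24 / 60)
        int filmFrame = r * sourceFps / refreshesPerSecond;
        std::printf("refresh %2d shows film frame %2d\n", r, filmFrame);
    }
    // Output shows each film frame repeated 3,2,3,2,... times across the second.
    return 0;
}
```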
I played on console all my life, so we mostly jumped in resolution quality; when I came to PC I could see how insanely better 4K graphics are than 1080p or 1440p. I can also see the difference between 30 and 60 FPS, much more so when I played my PS4 or PS4 Pro and then jumped on my PC. I don't personally care for anything more than 4K@60 FPS; resolution is much more important to me than having super high FPS. All I need is 60.
Since, on console, the FPS wasn't improving for me as much as the graphics resolution was with each new generation, that's what I got used to, and 60 makes things so smooth for me. I can still see the soap opera effect when playing games at 4K@60 FPS, so anything more than that is a waste of money to me. It's a personal preference thing: which one you find more beneficial to your gaming experience.
I see FPS and resolution like this: playing at lower resolutions and higher FPS would be like running 480p at 120-240 FPS compared to 1080p@60 FPS. I would rather have 4K@60 than 1080p@120-240 FPS.
As it is going faster, it appears to be a solid suspended image.
Is it 24 FPS? ¯\_(ツ)_/¯
With film you have to expose the film to light to actually get a result. That means that if there's any motion of the scene or subjects you're filming, or of the camera, while the film is exposed to light, the image will get blurry. These days they may film digitally, at least for TV, but you're still letting light hit the sensor for some amount of time, which results in blur.
On a computer, if you just draw a scene without any blur effect, it will of course be perfectly sharp, and then the transitions between images may have less of a connected feel.
If your comment was about motion blur, then yes, lossy video compression formats have a harder time delivering a sharp, clean image when there's a lot of change in the scene, so it will look blurrier / more like ♥♥♥♥ in those situations than when things are still. The polygon thing still doesn't make any sense, unless your point is that simply drawing the polygons results in a sharp image, whereas with added motion blur it would be less sharp.
The human eye doesn't see in FPS (Frames Per Second). That is entirely a myth.
However, at the same time, a healthy young human eye can perceive and detect drops below 48 FPS, and even notice changes up to 120 FPS and beyond. Why?
Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.
It's the flickering effect that annoys the human eye as the frame flips to the next. Mostly it's ignored by the human brain; cats and dogs, for example, would notice it more. Depending on how smooth the edges of the animation are, the human brain will still register the previous few frames along with the one it sees, calculating differences and ignoring slight variations. This is why monitors now all come with backlights, which greatly reduces this flickering effect.
You'll find that movies and console games can run at a lower 24 FPS and get away with it unnoticed, because of the viewing distance and edge blur. A PC, however, has much higher quality and is viewed at closer range, so the brain can pick out the edge changes a lot more. It entirely depends on what animation you are viewing and what device you're viewing it on. For a standard PC, it's ideal to keep it above 48 FPS at all times, so younger eyes aren't so distracted by the changes.
FPS changes and varies, so 30 FPS won't be continuous; rather it rises and falls (for example, between 24 and 48 FPS). It's those changes that are even more distracting at lower FPS levels. When you get up to 120+ FPS, it becomes much less noticeable.
Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal... until you see better with glasses; then when you remove the glasses, your vision suddenly appears a lot more blurry. The same factor applies to monitors. People running at 60 Hz will be happy until they see a 120 Hz/144 Hz monitor to compare against. The brain will then register the 60 Hz as lower quality than it first judged it to be.
---
Let's say you have a 1920x1080 monitor. Animate an object moving from the top of the screen to the bottom at 200,000 pixels per second.
1080 px / 200,000 px/s = 5.4 ms on screen
1000 ms / 5.4 ms ≈ 185 frames per second
You need roughly 185 FPS for every frame interval to catch it on your screen.
However, if you only had a 60 Hz monitor (supporting up to 60 FPS), then roughly two-thirds of the time you won't even notice that object flash by. If your monitor could fully display those ~185 FPS, then 100% of the time your eyes will pick up on that object. Even if you doubled or tripled that speed, your eye would still detect it; the faster the object, the more FPS required, because the eye doesn't care in the slightest about FPS. You just need more frames, depending on the animation speed, for the motion to be fully captured and processed.
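Here's the same arithmetic as a runnable check (plain C++, using the example's numbers; the "two-thirds" figure falls out of the same ratio):

```cpp
// Quick check of the numbers above: an object crossing 1080 px at 200,000 px/s
// is only on screen for 1080 / 200,000 seconds, so you need roughly the
// reciprocal of that in frames per second for every frame interval to contain it.
#include <cstdio>

int main() {
    const double pixelsToCross   = 1080.0;      // height of a 1080p screen
    const double speedPxPerSec   = 200000.0;    // animation speed from the example
    const double onScreenSeconds = pixelsToCross / speedPxPerSec;  // ~0.0054 s
    const double fpsNeeded       = 1.0 / onScreenSeconds;          // ~185 fps

    const double refreshHz  = 60.0;
    const double missChance = 1.0 - onScreenSeconds * refreshHz;   // ~0.68

    std::printf("on screen for %.1f ms\n", onScreenSeconds * 1000.0);
    std::printf("need roughly %.0f fps to always catch it\n", fpsNeeded);
    std::printf("at %.0f Hz you miss it roughly %.0f%% of the time\n",
                refreshHz, missChance * 100.0);
    return 0;
}
```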
Leaving lots of frames out, on the other hand, leads to what is known by some as cinematic FPS. This is where the brain starts to understand there are missing details and actually fills them in with imagination. It's forcing your own brain to make it up, or just completely ignore and discard it, which some consider to feel more realistic, as the brain can conjure images much better than a screen ever could. This is why people suggest the low-ball 24/30 FPS locks. Personally, my brain just gets ♥♥♥♥♥♥ off and annoyed by noticing it too much.
Hence why younger gamers with healthy eyesight want higher refresh rates.
How does the eye consider what it sees as normal? You mean the brain?
Yeah, well, the eye-to-brain signal.
If the two eyes are given slightly different images, the brain can assume it's 3D as well. It works that out from the limited data it's provided by each eye.