If you run a game at a higher FPS than your monitor can actually handle, you'll get frame lag and screen tearing. Even though the counter might show you're running higher FPS on, say, a 75Hz monitor, you actually aren't seeing it. The monitor will only render one frame per Hz cycle, so 75Hz = 75 FPS at most on screen without pixelation or a tearing effect. If your graphics card is pushing out more frames than the monitor can handle, the monitor will attempt to show the next frame before the previous frame has finished its cycle, and this is where the tearing effect comes from.
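To put rough numbers on that (a quick sketch - the 120 FPS figure is just a made-up example, the 75Hz matches the monitor above):

```python
# Rough sketch with made-up numbers: compare how long the monitor takes to complete
# one redraw with how long the graphics card takes to finish one frame.

monitor_hz = 75            # the panel redraws 75 times per second
gpu_fps = 120              # hypothetical graphics card output, faster than the panel

refresh_interval_ms = 1000 / monitor_hz   # ~13.3 ms per screen redraw
frame_time_ms = 1000 / gpu_fps            # ~8.3 ms per rendered frame

print(f"monitor redraws every {refresh_interval_ms:.1f} ms")
print(f"graphics card finishes a frame every {frame_time_ms:.1f} ms")

# A new frame finishing before the current redraw is done means the panel switches
# to it partway down the screen - that visible seam is the tear, and the panel still
# only shows 75 distinct refreshes per second no matter what the FPS counter says.
if frame_time_ms < refresh_interval_ms:
    print("frames arrive mid-redraw -> tearing without V-Sync")
```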
It's the variation in FPS and the sharpness of moving edges that's the real problem.
A healthy young human eye can perceive drops below 48 FPS and even notice changes up to 120 FPS. Why?
Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.
It's the flickering effect that annoys the human eye as one frame flips to the next. Mostly it's ignored by the human brain; cats and dogs, for example, would notice it more. Depending on how smooth the edges of the animation are, the brain will still register the previous few frames along with the one it currently sees, calculating the differences and ignoring slight variations. This is also why monitors now all come with backlights: it greatly reduces this flickering effect.
You'll find that movies and console games can run at a lower 24 FPS and get away with it unnoticed, because of the viewing distance and edge blur. However, a PC image is much sharper and viewed at closer range, so the brain can pick out the edge changes a lot more. It entirely depends on what animation you are viewing and what device you're viewing it on. For a standard PC, it's ideal to keep it above 48 FPS at all times, so younger eyes aren't so distracted by the changes.
The sharper the image edges are, the more the eye will come to notice them and be annoyed by the changes.
FPS changes and varies, so 30 FPS won't be continuous; rather it rises and falls (for example, between 24 and 48 FPS). It's those changes which are even more distracting at lower FPS levels. When getting up to 120 FPS and above, they become much less noticeable.
Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal... until you see better with glasses; then, when you remove the glasses, your vision suddenly appears a lot blurrier. The same factor applies to monitors. People running at 60Hz will be happy until they see a 120Hz/144Hz monitor to compare it against. The brain will then register 60Hz as lower quality than it first determined it to be.
---
Now getting down to the monitor - while the graphics card's FPS varies depending on action/idle, the monitor itself displays frames at a fixed rate.
Note: Hz and FPS are two different things, coming from different devices...
Hz - how many times your screen draws per second. This is purely a function of your monitor.
FPS - how many times per second that your computer is building a frame (a picture for your monitor to draw). This comes from your graphics card.
So the monitor is limited by its refresh rate (Hz) in how many frames it can display.
Even if you're running at 60 FPS on the graphics card, a 120Hz or 144Hz monitor will just hold each frame for the same time a 60Hz monitor would. If V-Sync is enabled, it can cap this to an even flow. You would want to cap it to either 30 FPS, 60 FPS, or 120 FPS (depending on what the graphics card can handle without dropping below). If it was 30 FPS on a 60Hz monitor, each frame would just be held for a split second longer and the human eye won't be so annoyed.
144Hz can handle showing up to 144 FPS every second. Perhaps your graphics card is running around 60-89 FPS? It would therefore look better than on a 60Hz monitor and be more relaxing/smoother on the eyes. The rest will be filled in by holding those frames just slightly longer (still not as long as a 60Hz monitor would).
The difference is that FPS varies. Therefore some might lock (V-Sync) the FPS to a set limit. Say your graphics card is producing 60-89 FPS - it could lock it to a set 72 FPS, where each frame is then held on the monitor for two refresh cycles (72 x 2 = 144). Understand? That's still better than what a 60Hz monitor can handle.
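Here's that frame-cap arithmetic spelled out (just a sketch - the FPS/Hz pairs are example numbers, including the 30-on-60 case from above):

```python
# Sketch of the frame-cap arithmetic: when the capped FPS divides evenly into the
# monitor's refresh rate, every frame is shown for the same whole number of refresh
# cycles, which is what makes the motion look even.

def hold_per_frame(cap_fps, monitor_hz):
    """Return (refresh cycles each frame is held, on-screen time per frame in ms)."""
    assert monitor_hz % cap_fps == 0, "pick a cap that divides the refresh rate evenly"
    cycles = monitor_hz // cap_fps
    return cycles, cycles * 1000 / monitor_hz

for cap, hz in [(30, 60), (60, 60), (72, 144), (60, 120)]:
    cycles, ms = hold_per_frame(cap, hz)
    print(f"{cap} FPS capped on a {hz}Hz monitor: each frame held for "
          f"{cycles} refresh cycle(s), about {ms:.1f} ms on screen")
```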
A few gaming monitors also have a technology known as G-Sync built in, which syncs an Nvidia graphics card's FPS with the monitor's Hz, making it as smooth as ever - without that frame lock (see the sketch after the link below).
(Note: this is Nvidia-only technology)
http://www.geforce.com/hardware/technology/g-sync/videos
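To see why that helps, here's a conceptual sketch only (made-up per-frame render times, and not Nvidia's actual implementation): with a fixed refresh, a finished frame has to wait for the next scheduled redraw, while a variable-refresh monitor redraws the moment the frame is ready.

```python
# Conceptual sketch (made-up render times, not Nvidia's real implementation):
# with a fixed 60Hz refresh, a finished frame waits for the next 16.7 ms tick;
# a variable-refresh panel (G-Sync style) redraws as soon as the frame is ready.
import math

fixed_hz = 60
tick_ms = 1000 / fixed_hz                   # 16.7 ms between fixed refreshes

render_times_ms = [12.0, 20.0, 14.5, 17.0]  # hypothetical per-frame render times

elapsed = 0.0
for i, t in enumerate(render_times_ms, start=1):
    elapsed += t                            # assume frames are rendered back to back
    next_tick = math.ceil(elapsed / tick_ms) * tick_ms
    wait = next_tick - elapsed              # extra delay imposed by the fixed schedule
    print(f"frame {i} ready at {elapsed:5.1f} ms: fixed 60Hz shows it {wait:4.1f} ms "
          f"later, variable refresh shows it right away")
```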
---
So to answer the question: 75Hz vs 144Hz... yes and no... If your graphics card(s) can generate more than 75 FPS, then a 144Hz monitor would appear smoother to the razor-sharp eye of a youth, especially if it had G-SYNC. Otherwise, most of the difference will go unnoticed. It comes down to what your graphics card can handle... the 144Hz will just allow it to display more if it can manage it.