Polyphemus Nov 26, 2016 @ 5:37pm
I Can't See The Difference Between 40 FPS And 60 FPS
When playing games, or watching movies, if my FPS drops to 15, it's obvious. If it drops to 30, it's just about (maybe) noticeable. Beyond 40 FPS, I can't see any difference. I've tried 144 Hz monitors, and they look no different from my 60 Hz.

For me, the same is true of 4K. Sitting 2 feet away from a 27 inch monitor, I can't see any difference between 4K and 1920 x 1080.

I really do think the whole increase in refresh rates and screen resolution is a con job by the industry to force us to buy new gear. There are real limits to human perception, and we've gone beyond them.
Ryuu. Nov 26, 2016 @ 6:44pm 
Same for me, guess we are part of a small group who are blessed with not giving a ♥♥♥♥ :)
Darren Nov 26, 2016 @ 6:46pm 
There are people who can tell the difference, but the vast majority of people either can't, or can but don't care.
Teksura Nov 26, 2016 @ 7:05pm 
Originally posted by Toadflax:
I really do think the whole increase in refresh rates and screen resolution is a con job by the industry to force us to buy new gear. There are real limits to human perception, and we've gone beyond them.
You are correct for the vast majority of cases. This is the same discussion point I had back when I was selling HDTVs. At a normal viewing distance, you really can't distinguish any difference between a 720p and a 1080p set unless you get into the really big sets. You have to get really close to see the difference, or you need a larger screen. HOWEVER, there ARE cases where the extra resolution does matter. For example: I use my TV as my computer display, and whenever I take my computer elsewhere and plug it into a 720p set, I notice the difference, and it actually bothers me.

So is the whole thing about bigger and better numbers a con job? Well, in many cases, yes. But like my example above, there are always going to be some fringe exceptions to the rule.

It should also be noted that this is not something exclusive to the computer industry. You see a lot of this in other industries, like the auto industry. The big thing with trucks, for example, is often to talk about how much power they have and how much they can haul. Many of them do this just for marketing, when the reality is that all that power is WAY excessive for most cases.

This is how a company upsells you. They present something which is better on paper and depend on people not asking how much is actually necessary.
The Ronnie Rager Nov 26, 2016 @ 8:46pm 
I disagree
The Rock God Nov 26, 2016 @ 9:19pm 
I have a 120Hz monitor, and can clearly notice when it drops below 90fps in first-person shooters.
cinedine Nov 26, 2016 @ 9:56pm 
It's perfectly normal. It's a perception kind of thing and each person will experience it differently. I can't hear the slightest difference between 192 kbps MP3 and FLAC, for example, unless I really try and crank up the volume.
The most likely thing you'll notice is the game acting a bit more responsive, but that also depends highly on the kind of games you play. Fast-paced action titles obviously benefit more from it than turn-based strategy.

For resolution, it depends entirely on screen size and viewing distance. If you do a little googling, you can find quite a lot of charts showing when a higher resolution monitor is advantageous. But even then, perceived visual quality may differ from person to person and also by game. A game on low graphics settings might still look like ♥♥♥♥ in 4K. ;)

(If you have an outlet store nearby, see if they have a 240 Hz monitor. There is one by Eizo, for example. Ask for a demonstration on it. If you still can't see a difference compared to 40 Hz, there might be something seriously wrong with your eyes. :P While it actually runs at 120 Hz, it inserts a black frame every other picture, and it's amazing how smooth it looks. Our brain pretty much only works by processing differences/movement, and this black frame really gets to it.)

Originally posted by SquirrlyNuts:
snip
We have a saying here: if you don't know what you're talking about, just shut up. It's far from the first post of yours I've seen that either contains patently false information or completely misses the point.
Last edited by cinedine; Nov 26, 2016 @ 10:00pm
Darren Nov 26, 2016 @ 11:40pm 
Perhaps it's best if I explain how it actually works.

Basically you create a series of textured triangles which make up the objects. GPUs can process X number of triangles per second. This number is restricted by:
- Memory Bandwidth from the CPU to the GPU (usage of this bandwidth is reduced by loading the textures into the GPU memory in advance)
- Processing of the lighting and shadows in the scene
- Post-Processing steps that are enabled (bump mapping, anti-aliasing, etc)
There is usually an important number that GPU manufacturers tout, particularly when it's higher than a competitor's card.

The network architecture of a game has no impact on the GPU side of the equation (it can still process the same number of triangles regardless); there are, however, other limits on the framerate you actually get, which come down to when the game decides to tell the GPU that it has all the data necessary to render the current scene. A poorly designed game could tie that to the netcode (so it tries to update on every packet from the server), but that would be silly.

Any effectively programmed game estimates what has probably happened and renders that; then, if it gets corrected by the netcode, it adjusts the rendered scene. For small corrections it interpolates the difference to keep things smooth; for large differences (when you have a lot of lag, for example) it just makes people jump to their "correct" location.
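
To make that concrete, here's a minimal C++ sketch of that "interpolate small errors, snap on big ones" correction. The names and thresholds are made up purely for illustration; this isn't code from any particular engine.

    #include <algorithm>
    #include <cmath>

    // Hypothetical sketch: blending a client-side prediction toward the
    // server's authoritative position. Names/thresholds are illustrative.
    struct Vec3 { float x, y, z; };

    static float Distance(const Vec3& a, const Vec3& b) {
        const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    static Vec3 Lerp(const Vec3& a, const Vec3& b, float t) {
        return { a.x + (b.x - a.x) * t,
                 a.y + (b.y - a.y) * t,
                 a.z + (b.z - a.z) * t };
    }

    // Called once per rendered frame. 'predicted' is where the client guessed
    // the entity is; 'serverPos' is the last authoritative position from the
    // netcode.
    Vec3 CorrectPosition(const Vec3& predicted, const Vec3& serverPos, float dt) {
        const float kSnapThreshold  = 2.0f;  // units; beyond this, just teleport
        const float kBlendPerSecond = 10.0f; // how fast small errors are smoothed

        if (Distance(predicted, serverPos) > kSnapThreshold) {
            return serverPos;  // huge correction (heavy lag): snap to the server
        }
        const float t = std::min(1.0f, kBlendPerSecond * dt);
        return Lerp(predicted, serverPos, t);  // small correction: ease toward it
    }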
Last edited by Darren; Nov 26, 2016 @ 11:41pm
僕の名前 (仮) Nov 27, 2016 @ 12:41am 
mouse and keyboard lag
Originally posted by Darren:
Perhaps it's best if I explain how it actually works.

Basically you create a series of textured triangles which make up the objects. GPUs can process X number of triangles per second. This number is restricted by:
- Memory Bandwidth from the CPU to the GPU (usage of this bandwidth is reduced by loading the textures into the GPU memory in advance)
- Processing of the lighting and shadows in the scene
- Post-Processing steps that are enabled (bump mapping, anti-aliasing, etc)
There is usually an important number that GPU manufacturers tout, particularly when it's higher than a competitor's card.

The network architecture of a game has no impact on the GPU side of the equation (it can still process the same number of triangles regardless); there are, however, other limits on the framerate you actually get, which come down to when the game decides to tell the GPU that it has all the data necessary to render the current scene. A poorly designed game could tie that to the netcode (so it tries to update on every packet from the server), but that would be silly.

Any effectively programmed game estimates what has probably happened and renders that; then, if it gets corrected by the netcode, it adjusts the rendered scene. For small corrections it interpolates the difference to keep things smooth; for large differences (when you have a lot of lag, for example) it just makes people jump to their "correct" location.

I'd like to point out that bump mapping is not a post-process effect (though I suppose you could use some kind of deferred shading technique for it, and then it would be kind of post-processed). Also, AA is not always post-processed. The simplest kind of AA is just rendering to a bigger render target than the output window; when the render target is presented on swap to the output window, it is automatically downscaled to fit that window's resolution. That's supersampling (SSAA); MSAA is the related hardware technique that only supersamples around triangle edges. FXAA and many other anti-aliasing techniques are post-processed, though. Also, the triangles themselves usually do not carry the actual texture; the vertices have UV coordinates, position, normal, texture channel, and maybe something else like color. Or, if you want to run well on AMD hardware, you keep the model data in a constant array and then each vertex has nothing but an index value.

It is technically possible (but impractical) to make your entire game run completely on the GPU, with the CPU only sending data to it (data that could be obtained from a network). By completely on the GPU, I mean not only the drawing but the actual game processing and content running fully on the GPU. All of your other points were well made.
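
To illustrate the point above about what the vertices actually carry, here's a rough C++ sketch of a typical vertex layout. The field names are made up for illustration and aren't tied to any specific API.

    #include <cstdint>

    // Rough sketch of a typical vertex layout. The triangle carries texture
    // *coordinates* and a channel index, not the texture data itself.
    struct Vertex {
        float         position[3];    // x, y, z in model space
        float         normal[3];      // used for lighting
        float         uv[2];          // coordinates into a separately bound texture
        std::uint32_t textureChannel; // which bound texture/material slot to sample
    };

    // The indexed variant mentioned above: the per-vertex data shrinks to a
    // single index into a constant buffer of full vertex records on the GPU.
    using VertexIndex = std::uint32_t;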
Azza ☠ Nov 27, 2016 @ 5:10pm 
The human eye doesn't see in FPS (Frames Per Second). That is entirely a myth.

However, at the same time, a healthy young human eye can perceive drops below 48 FPS and even notice changes up to 120 FPS. Why?

Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.

It's the flickering effect which annoys the human eye as each frame flips to the next. Mostly it's ignored by the human brain; cats and dogs, for example, would notice it more. Depending on how smooth the edges of the animation are, the human brain will still register the previous few frames along with the one it sees, calculating differences and ignoring slight variations. This is why monitors now all come with backlights; it greatly reduces this flickering effect.

You'll find that movies and console games can run as low as 24 FPS and get away without it being noticed, because of the viewing distance and edge blur. However, a PC has much higher image quality and is viewed at closer range, so the brain can pick out the edge changes a lot more. It entirely depends on what animation you are viewing and what device you're viewing it on. For a standard PC, it's ideal to keep it at least above 48 FPS at all times, so that younger eyes aren't so distracted by the changes.

The human mind can actually detect and correct to a certain degree. However, the higher the resolution and the sharper the image edges, the more frames are required to make it appear as smooth movement, without the brain straining to fill in the missing gaps.

FPS changes and varies, so 30 FPS won't be continuous; rather, it rises and falls (for example, between 24 and 48 FPS). It's those changes which are even more distracting at lower FPS levels. When you get up to 120 FPS+, it becomes much less noticeable.
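
As a back-of-the-envelope illustration of why that fluctuation is what you notice, here's a tiny C++ snippet that just converts frame rates into per-frame times (plain arithmetic, nothing game-specific):

    #include <cstdio>

    // Frame time in milliseconds is 1000 / FPS. A swing between 24 and 48 FPS
    // is a swing between roughly 41.7 ms and 20.8 ms per frame, so consecutive
    // frames can differ by about 21 ms even though the average still reads as
    // "30-ish FPS". That jitter is what tends to be noticed, more than the
    // average itself.
    int main() {
        const double rates[] = { 24.0, 30.0, 40.0, 48.0, 60.0, 120.0 };
        for (double fps : rates) {
            std::printf("%6.1f FPS -> %5.1f ms per frame\n", fps, 1000.0 / fps);
        }
        return 0;
    }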

Your eye also adjusts and learns to accept what it sees. If you need glasses but don't wear them for years, the eye will consider what it sees as normal... until you see better with glasses; then, when you remove the glasses, your vision suddenly appears a lot more blurry. The same factor applies to monitors. People running at 60Hz will be happy until they see a 120Hz/144Hz monitor to compare against. The brain will then register the 60Hz as lower quality than what it first determined it to be.
Last edited by Azza ☠; Nov 27, 2016 @ 5:13pm
I can't read small letters.

My old, cheap 1600x900 LED monitor has 60Hz. I've got no problem when I'm getting between 50-60 FPS playing Left 4 Dead.
Dropping below 50 = headache for me.
Tux Nov 28, 2016 @ 2:06pm 
Originally posted by Toadflax:
When playing games, or watching movies, if my FPS drops to 15, it's obvious. If it drops to 30, it's just about (maybe) noticeable. Beyond 40 FPS, I can't see any difference. I've tried 144 Hz monitors, and they look no different from my 60 Hz.

For me, the same is true of 4K. Sitting 2 feet away from a 27 inch monitor, I can't see any difference between 4K and 1920 x 1080.

I really do think the whole increase in refresh rates and screen resolution is a con job by the industry to force us to buy new gear. There are real limits to human perception, and we've gone beyond them.

I can.

But it also depends on how close I am to the screen in question, as far as FPS goes. On resolution beyond 2K, I don't know, but I ABSOLUTELY can tell the difference between 1080p and 2K.
Last edited by Tux; Nov 28, 2016 @ 2:07pm
Zefar Nov 28, 2016 @ 10:51pm 
https://frames-per-second.appspot.com

You can easily tell the difference and when playing you can feel the difference. I don't buy the "I can't see a difference" part.
Just add more footballs to see the massive difference. If you have a higher refresh rate monitor you can see the 90 and 120 versions.
WarNerve Nov 21, 2021 @ 7:00pm 
Originally posted by The Rock God:
I have a 120Hz monitor, and can clearly notice when it drops below 90fps in first-person shooters.


No you can't. There is no way you can tell the difference between, say, 80 and 90 fps. Not gonna happen.
Dracoco OwO Nov 21, 2021 @ 7:01pm 
I can see frame differences and I know when I'm dropping frames immediately.
