It matters because of how games produce frames and how your monitor displays them. If your game renders 200 frames per second but your monitor only shows 100 of them, each refresh is twice as likely (compared to rendering at 100 FPS) to have the most up-to-date frame, so what you see stays closer to what's actually happening in the game. There's a rough sketch of the math below.
Google a video about it.
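To put rough numbers on that claim, here's a minimal sketch (my own toy model, not anything from a real engine) that measures how stale the newest finished frame is at each screen refresh, assuming frames finish at a perfectly even rate. Doubling the render rate over the refresh rate halves the average staleness, which is the "twice as likely to be up to date" effect:

```python
from fractions import Fraction

def average_frame_age_ms(render_fps: int, refresh_hz: int, refreshes: int = 1000) -> float:
    """Average age (ms) of the newest finished frame at each screen refresh."""
    frame_time = Fraction(1, render_fps)    # exact seconds between finished frames
    refresh_time = Fraction(1, refresh_hz)  # exact seconds between refreshes
    total_age = Fraction(0)
    for i in range(1, refreshes + 1):
        now = i * refresh_time
        k = now // frame_time               # index of the latest frame boundary
        if k * frame_time == now:           # a frame finishing exactly at the
            k -= 1                          # refresh just misses that refresh
        total_age += now - k * frame_time
    return float(total_age / refreshes) * 1000

print(average_frame_age_ms(100, 100))  # 10.0 ms: each frame is one refresh old
print(average_frame_age_ms(200, 100))  # 5.0 ms: double the FPS, half the staleness
```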
You have a valid point.
Sorry, I completely forgot to mention the tick-rate of a game. Modern games have a tick-rate: in other words, how many simulation updates they run per second. The average (including Unreal Engine, which DbD runs on) is, you guessed it, 60 ticks per second. This means that even if the graphics card renders at 120 FPS, only about half of those frames would actually show a change in what's on screen. I say "about" because there is a small latency between the CPU and GPU (unless you have a very low-end computer that uses the CPU to render graphics).
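A quick way to see the effect (again my own toy model; real engines often interpolate between ticks, which softens this): sample which simulation tick each rendered frame would show, and count how many frames actually carry a new tick.

```python
from fractions import Fraction

def fresh_frame_ratio(render_fps: int, tick_rate: int) -> float:
    """Fraction of rendered frames that show a new simulation tick."""
    frame_time = Fraction(1, render_fps)
    tick_time = Fraction(1, tick_rate)
    last_tick = -1
    fresh = 0
    for i in range(1, render_fps + 1):  # simulate one second of frames
        now = i * frame_time
        tick = now // tick_time         # latest completed simulation tick
        if tick != last_tick:
            fresh += 1                  # this frame carries new game state
            last_tick = tick
    return fresh / render_fps

print(fresh_frame_ratio(120, 60))  # ~0.51: about half the frames show anything new
print(fresh_frame_ratio(60, 60))   # 1.0: every frame lands on a fresh tick
```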
So yes, the recommended FPS for a game should be the average of the game's tick-rate (usually 60 TPS) and your monitor's refresh rate (anywhere from 24 Hz to 144 Hz, though most commonly 60 Hz), meaning the recommended FPS for just about any game comes out to around 60 FPS.
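Spelled out as arithmetic (this "average" is just the rule of thumb above, not a formula any engine defines):

```python
def recommended_fps(tick_rate: float = 60, refresh_hz: float = 60) -> float:
    """The rule of thumb above: average the tick-rate and the refresh rate."""
    return (tick_rate + refresh_hz) / 2

print(recommended_fps(60, 60))   # 60.0 for the common 60 TPS / 60 Hz case
print(recommended_fps(60, 144))  # 102.0 for a 144 Hz monitor
```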
Maybe in the future this will change and go higher, I don't know. But right now, 60 FPS is all you actually need; there's no reason to make a huge deal about pushing it higher.
While yes, sometimes an image will be shown on screen twice due to frame latency, the worst-case scenario is that it cuts the FPS you perceive in half; again, that's the worst case. So yes, you will be fine letting your graphics card render more frames than your monitor's refresh rate allows, and in a way I recommend it if you want to max out the efficiency of your monitor. However, the higher you go, the more frames you waste, because you'll eventually reach a point where your graphics card is rendering frames your monitor will never actually put out. In the end, once you pass your monitor's refresh rate, you're trading your graphics card's efficiency for your monitor's efficiency; the monitor's refresh rate is the natural middle ground, which is why it's the only number I referenced in my original post.
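To quantify the waste (my numbers, assuming an idealized fixed-refresh display with no variable-refresh tech like G-Sync/FreeSync):

```python
def frames_never_shown(render_fps: float, refresh_hz: float) -> float:
    """Frames per second the monitor physically cannot display."""
    return max(0.0, render_fps - refresh_hz)

for fps in (60, 144, 200, 300):
    wasted = frames_never_shown(fps, 144)
    print(f"{fps:>3} FPS on a 144 Hz monitor: {wasted:.0f} frames/s wasted")
```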