Therefore the in-game timer is the most accurate for comparing speedruns, since it counts only actual frames of input and can't be manipulated by making the game run faster or slower than real time. I never expected it to be perfectly accurate to actual real time, though, due to things like the slow-mo effect I described above. It just needs to be accurate when compared to other playthroughs in terms of the actions performed.
The game logic is supposed to run at 40 fps, or 25ms per frame, and it only stores the number of actual frames that have passed (again excluding things like loading screens), but multiplies it out to get the time it displays. That's why you'll notice certain times literally can't appear on the in-game timer, because it can only show multiples of 25ms. The game's render rate is based on your monitor's refresh rate (for vsync), so usually the game runs at 40 fps internally but renders at 60 fps.
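A frame-count timer like the one described can be sketched as follows. This is an illustrative reconstruction, not the game's actual code; the function name and the mm:ss.mmm format are assumptions.

```python
# Sketch: the game stores only the number of logic frames that have passed
# and multiplies by 25 ms (40 fps logic rate) to get the displayed time.
FRAME_MS = 25  # 1000 ms / 40 fps

def display_time(logic_frames: int) -> str:
    """Convert a logic-frame count to a mm:ss.mmm timer string."""
    total_ms = logic_frames * FRAME_MS
    minutes, rem_ms = divmod(total_ms, 60_000)
    seconds, millis = divmod(rem_ms, 1000)
    return f"{minutes:02d}:{seconds:02d}.{millis:03d}"
```

Because `total_ms` is always a multiple of 25, times like 00:00.010 can never appear on the timer, exactly as described above.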
It would appear in this case that the game is running slightly faster than 25ms per logic/physics frame. The way the game times itself is still based on the render rate, so it would make sense that changing the renderer could affect it. The game engine asks the computer what the monitor's refresh rate is, compares that to how fast it is supposed to run (40fps), and calculates how many logic frames it should run for every render frame. So, in the typical case of a 60Hz monitor, the game runs 1 logic update every 1.5 screen draws, assuming vsync holds the render rate at 60fps.
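The ratio calculation above can be sketched like this (illustrative names, not the engine's real API):

```python
LOGIC_FPS = 40  # the game's intended logic/physics rate

def logic_updates_per_render_frame(refresh_hz: float) -> float:
    """How many logic updates should run per screen draw, given the
    refresh rate the system reports."""
    return LOGIC_FPS / refresh_hz

# At 60 Hz this gives 40/60 = 2/3 of a logic update per draw,
# i.e. one logic update every 1.5 screen draws.
```

Note that if `refresh_hz` is reported wrong, every logic frame is scheduled wrong by the same factor, which is exactly the failure mode described next.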
If the game isn't getting accurate information back from OpenGL about the refresh rate, then it might run faster or slower than intended. Unfortunately, the function that requests the monitor refresh rate from OpenGL/DirectX isn't always accurate. The game has a failsafe: if it notices that it is running significantly faster or slower than expected, it tries to time the refresh rate itself by counting the milliseconds that pass between render frames. But that isn't guaranteed to be accurate either, and sometimes the game doesn't realize it needs to fall back to manual timing in the first place.
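The failsafe described — timing the refresh rate manually from frame-to-frame intervals — might look something like this sketch (hypothetical function, not the game's code):

```python
def estimate_refresh_hz(frame_timestamps: list[float]) -> float:
    """Estimate the real refresh rate (Hz) from render-frame timestamps
    (in seconds) by averaging the interval between consecutive frames."""
    intervals = [b - a for a, b in zip(frame_timestamps, frame_timestamps[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return 1.0 / avg_interval
```

In practice this estimate is noisy (dropped frames, OS scheduling jitter skew the average), which is why, as noted above, the failsafe isn't guaranteed to be accurate either.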
So, yeah, in your case the game appears to run faster than intended on OpenGL. I don't know if this is across the board or just your hardware. You may want to check your Windows display settings, see if your monitor is set to an unusual refresh rate, and try changing it to 60. This could affect your personal timing for executing actions, and it would make any speedruns based on "real time" invalid, since your run would appear faster simply because the game itself was running faster. So you should only use the in-game timer, which counts actual logic/physics frames regardless of how fast or slow the game is running in real time.
To be honest, I never noticed that the timer slows down with collisions and slow-mo kills. I've learned something.