What is the difference between refresh rate and frame rate? Shouldn't both be the same?
With vsync on, and not limited by CPU/GPU, each refresh will show a new frame.
If the FPS is lower than the refresh rate, some refreshes will show a duplicated frame, or G-Sync/FreeSync will lower the refresh rate to match the GPU's output.
If the FPS is higher than the refresh rate, fast sync/old vsync renders frames that are dropped and never shown, while modern vsync puts the GPU to idle and doesn't start creating the next frame until the newest completed frame is being sent to the display.
With vsync off, as soon as a frame is complete the display begins receiving data from that frame, and you get tear lines.
If the FPS is lower than the refresh rate, each refresh has 0 or 1 tear lines, depending on how low the FPS is and where the last tear line was.
If the FPS is higher, then there are 1 or more tear lines on every refresh.
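The vsync-on, FPS-below-refresh case above can be pictured with a toy model. This is my own sketch of an idealized double-buffered display with a perfectly steady render time, not how any real driver works:

```python
# Toy model (an assumption, not real driver behaviour): with vsync on, each
# refresh scans out the newest frame that finished rendering before the
# refresh started. Frame k is assumed to finish at time k / render_fps;
# refresh i starts at time i / refresh_hz.

def vsync_on_refreshes(refresh_hz, render_fps, num_refreshes):
    """Return the frame index shown on each refresh."""
    # floor(i / refresh_hz * render_fps), done with integer math to avoid
    # floating-point edge cases on exact frame boundaries.
    return [i * render_fps // refresh_hz for i in range(num_refreshes)]

# 60 Hz display, game rendering at a steady 40 FPS: frames 0 and 2 are each
# shown on two consecutive refreshes (the "duplicated frame" case).
print(vsync_on_refreshes(60, 40, 6))  # [0, 0, 1, 2, 2, 3]

# At 60 FPS on a 60 Hz display, every refresh is a new frame.
print(vsync_on_refreshes(60, 60, 4))  # [0, 1, 2, 3]
```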
Frame rate is an entirely different metric: it's the number of frames rendered by the system per second. You really only see that metric used in regard to video game performance.
I'm probably starting to get comfortable somewhere in the 50s for an RPG, and maybe even a locked 30 or 45 is acceptable, but I can tell the difference up to maybe around 100 when there is a lot of movement.
If a game is twitchy, though, so far the higher the better (up to 144, which is as fast as I've been hands-on with); not so much that I can consciously tell much over 100, but because I notice less eye strain and can play longer without getting a headache or feeling tired or even bored.
Some people may always be way more sensitive than me, while a few others couldn't tell the difference between a nice smooth 30 and a locked 144.
And it's kinda OK for a static camera.
Some 2D animated movies are done at 12 FPS.
In that context, for such games, 60 can be plenty.
Even going up to 75 FPS made a difference for me.
So yes, going higher will be better. But I'd probably cap it around 120 FPS. Past that, the bonus is far too small compared to how much harder you'd have to make your GPU work.
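A cap like that is just a frame limiter. As a rough sketch of the idea (sleep-based, with made-up numbers; real in-game or driver-level limiters use finer-grained timing):

```python
import time

def run_capped(render_frame, cap_fps, num_frames):
    """Render num_frames, never starting a frame before its time slot."""
    frame_budget = 1.0 / cap_fps
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                          # however long this takes...
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # ...sleep off the rest of the frame's time budget, so we
            # average at most cap_fps frames per second.
            time.sleep(frame_budget - elapsed)

# Capping a trivially cheap "frame" at 120 FPS: 12 frames take about 0.1 s.
run_capped(lambda: None, 120, 12)
```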
The Simpsons: Hit & Run's physics engine just breaks if you try to run it above 60 frames per second. In Touhou Project, gameplay speed is tied to the framerate. The intended play speed is 60 FPS, so if you up the frame rate to 120 FPS the game moves twice as fast, making it harder to play, and at 1000 FPS you'll have difficulty even getting past Rumia.
Yeah, I know those aren't RPGs, but the point is you have to be aware of how the games you want to play behave when the framerate is uncapped. If a game plays poorly, then going in excess is not only pointless but detrimental, and you'll find yourself applying a frame limiter anyway.
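That framerate tie-in is easy to see in a sketch. This shows the general pattern, not those games' actual code, and the step sizes are made up:

```python
def position_after(seconds, fps, per_frame_step):
    # Framerate-dependent movement: a fixed step every frame, so in-game
    # speed scales with FPS (the Touhou-style behaviour described above).
    frames = int(seconds * fps)
    return frames * per_frame_step

def position_after_dt(seconds, fps, speed):
    # Framerate-independent movement: each frame advances speed * dt,
    # so wall-clock speed stays the same at any FPS.
    frames = int(seconds * fps)
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(frames):
        pos += speed * dt
    return pos

print(position_after(1, 60, 2.0))        # 120.0 units in one second
print(position_after(1, 120, 2.0))       # 240.0: double the FPS, double the speed
print(position_after_dt(1, 120, 120.0))  # ~120 units regardless of FPS
```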
I'm not sure how your specific library of games will respond, but it's something to keep in mind, especially when playing older games. High refresh rate monitors weren't really standard issue until the early 2010s, when 120 Hz panels like the Asus VG236H hit the market, and even then it took some time for the market to adapt.
I mean, yeah, you could trade off resolution on a CRT to increase the refresh rate; the Iiyama Vision Master Pro 512 can pull off 500 Hz if you're willing to play at 320p. But that's beside the point, since I doubt it was especially common practice. I think most people were content with 60 Hz, with some targeting 75 Hz or 80 Hz if they were sensitive to flicker. Some hardcore competitive versus players might have targeted higher refresh rates to gain an edge over more casual players, but we're talking about single-player here.
But also, 10+ year old games have softball system requirements, so even a low-cost system should be able to hit 120+ FPS pretty easily.
Crank all settings to Ultra or Extra. If the game feels a bit laggy (don't look at the FPS meter), then turn the graphics down to High or Medium.
And the game will look amazing as well as smooth; it's only human psychology that makes something look smooth or great in quality.
No, it's literally the FPS counter that shows whether the game is performing badly.
If you have a steady, capped FPS you'll notice how smooth it is. But if the FPS jumps around anywhere from 20 to 60 FPS, you will be having a bad time.
By itself it's also of limited use, because it won't tell you why your performance is bad; you need to see utilisation stats, frametimes, 1% and 0.1% lows, temperatures, etc.
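For the 1% / 0.1% lows specifically, one common way they're derived (an approximation; capture tools differ in the exact method) is to average the slowest slice of a frametime capture:

```python
def percentile_low_fps(frametimes_ms, percent):
    """Average FPS over the slowest `percent` of captured frames."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # size of the worst slice
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Made-up capture: mostly ~16.7 ms frames (60 FPS) with three 50 ms stutters.
frametimes = [16.7] * 97 + [50.0] * 3
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(round(avg_fps, 1))                  # the average still looks fine
print(percentile_low_fps(frametimes, 1))  # 20.0: the 1% low exposes the stutter
```

This is why a plain FPS counter can look healthy while the game feels terrible: the average hides the occasional long frames that the lows capture.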