Minor correction - people playing at 144Hz without some form of adaptive sync should be pretty rare now that modern nVidia cards can use Freesync, and a huge share of the 144Hz panels out there are Freesync capable.
Tearing is far less noticeable at higher refresh rates, though, because of the faster scan. It's the same reason you can't see tearing when you run at very high frame rates (400+ fps) on 60hz with no sync.
I have a 240hz monitor, and there is no need for any kind of sync. I never use Freesync at all, even though I could. I don't even mind playing in borderless with the DWM plastered over the top of it, as long as I'm in 240hz mode, because there are far more matched multiples.
VRR brings issues of its own: inverse ghosting at lower frame rates due to high levels of overdrive; more noticeable perception of engine-related stuttering, where it occurs; adaptive sync cutting out when the frame rate drops under 48 fps with Freesync 1 and reverting to some other regular kind of sync (not with G-Sync); and the polling rate of the mouse going down with refresh (very annoying in games with a mouse cursor).
Borderless (i.e. forcing the DWM) always syncs frame presents to refresh, so yeah, you won't get much tearing, if any.
Inverse ghosting is really ONLY a problem on Samsung VA panels - which are not the most common. Personally I still prefer the g2g response on TN for gaming anyway.
I've found that even deep engine stuttering (the 64hz stutter in unmodified Fallout NV is a very good example) is less perceptible with variable refresh, but yeah, that one is purely subjective.
The Freesync 48Hz limit isn't really a hardware limit - it's just the floor the specification recommends, and any decent monitor will usually go down to 30Hz or so. And when paired with an nVidia card (I don't have a current AMD card to validate), the nVidia drivers will set the refresh to a multiple of the FPS at low framerates, i.e. if a game drops to 40FPS, the monitor will run at 80Hz, displaying every frame twice. Even on a 48Hz low-end monitor, you would need to drop down to 24FPS in order to run into those issues.
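As a rough illustration of that frame-doubling behaviour, here's a small Python sketch of my own - this is not the actual driver logic (which isn't public), and the 48-144Hz VRR window is just an assumed example:

```python
# Rough sketch of LFC-style frame multiplication. NOT the actual driver
# heuristics - just the arithmetic: below the VRR floor, repeat each frame
# enough times to land the refresh back inside the window.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Refresh rate the panel would run at, assuming a 48-144Hz VRR window."""
    if fps >= vrr_min:
        return min(fps, vrr_max)            # in range: refresh just tracks FPS
    repeats = 2
    while fps * repeats < vrr_min:          # double, triple, ... until back in range
        repeats += 1
    return min(fps * repeats, vrr_max)      # each frame is scanned out `repeats` times

for fps in (90, 40, 30, 20):
    print(f"{fps} fps -> panel refresh ~{lfc_refresh(fps):.0f} Hz")
# 90 fps -> 90 Hz, 40 fps -> 80 Hz, 30 fps -> 60 Hz, 20 fps -> 60 Hz
```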
Polling rate is never tied to monitor refresh - it's tied to the actual FPS in those cases, and unless you are also using some kind of vsync, those two are going to be decoupled.
If you run a game capped at 60 fps with VRR, the refresh rate is 60hz. If you run capped at 60 fps at a fixed 240hz and the game uses a hardware mouse cursor, you still see the cursor moving at 240fps.
For anyone who has gone from 60hz to 120/144/240hz, the mouse cursor is the first thing they notice when stepping up to the higher refresh rate.
The lower on-screen update rate of the mouse cursor is a very annoying side effect of VRR, especially in RTS games and the like.
Give me a real 240hz scanout (even if I capped every single game at 60 fps) over some 75hz monitor with gsync or freesync, every time.
Hmm, I've never seen that occur - granted, I don't play a ton of cursor-driven titles these days.
Any particular game you can think of to demonstrate that in? I'm a sucker for technical oddities :D
Edit: It strikes me that in that case, the input would still be registered at 16.7ms intervals, so you could be adding up to 4 frames of latency versus your refresh rate. I'd imagine some people would find that more annoying than the cursor simply matching the rate at which the game actually accepts inputs anyway.
People aren't really going to notice the input lag on clicking, but it's just annoying to effectively have a variable polling rate on the mouse cursor. Of course you are right that, technically, the polling rate of the mouse itself is always 125hz or 500hz or whatever.
I don't play RTS-type games, but it's pretty annoying in the Minecraft menu, for example. It is also true that some games don't have hardware mouse support, so even if you have no sync @ 240hz and your frame rate is only 40 fps, your mouse cursor runs at 40 fps in RAGE. In Crysis, your mouse cursor always moves at your refresh rate, regardless of your frame rate.
Will you not acknowledge, though, that tearing is not as bad the higher the refresh rate is? Anybody can verify this as self-evident, even if they don't have a high refresh monitor, by creating a custom resolution with a 30hz refresh rate. If you cap at 30 fps in 30hz mode with vsync disabled, tearing is so bad that it looks like someone cut a piece of paper in half and offset it by an inch. Then if you run capped at 30 fps in 60hz mode with no vsync, the tearing is still pretty bad, but not as bad as at 30hz. It gets better the higher the refresh rate is.
Now, when you step up to 120hz mode, you notice the tearing isn't so bad at all anymore - so much so that you don't feel the need to use vsync that much. You also notice at 120hz that 50 fps capped with vsync disabled looks more fluid than 50 fps capped on 60hz with vsync disabled. 59 fps on 60hz gives a stutter every second; that doesn't happen at all with 59 fps on 240hz.
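To put some numbers on that "stutter every second" point, here's a little Python sketch of my own. It just quantises frame-present times to the refresh grid and ignores tearing mid-scanout, so it's only a rough model of the cadence, not an exact one:

```python
# Approximate cadence when FPS doesn't divide evenly into the refresh rate.
# Frame-present times are quantised to the refresh grid; tearing mid-scanout
# is ignored, so this is a rough model of perceived frame pacing only.
from collections import Counter

def refreshes_per_frame(fps: float, hz: float, seconds: float = 2.0) -> Counter:
    presents = [i / fps for i in range(int(fps * seconds))]
    first_refresh = [int(t * hz) + 1 for t in presents]   # refresh a frame first appears on
    holds = [b - a for a, b in zip(first_refresh, first_refresh[1:])]
    return Counter(holds)

for hz in (60, 240):
    print(f"59 fps on {hz}hz -> refreshes per frame: {dict(refreshes_per_frame(59, hz))}")
# 59 fps on 60hz  -> mostly 1 refresh per frame, with the odd frame held for 2
#                    (a 100% jump in hold time, roughly once a second = visible hitch)
# 59 fps on 240hz -> a mix of 4 and 5 refreshes per frame (only a 25% jump, far less visible)
```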
Then, when you step up again to 240hz, I feel there is no need for sync - maybe only with side-scrolling 2D platformers and the like. Now that I think about it, I do actually use Freesync sometimes, to be honest, but only with RetroArch > frame throttle > sync exact content frame rate.
It's a subjective measurement - I'm sensitive to them, so any tearing is too much tearing. If someone isn't, then the less persistent tears at 240Hz are an excellent solution.
Yes, I suppose. Though I still disagree strongly that 100 fps tears badly on 240hz. btw, I am not saying I can't see tearing on 240hz, but I have to look pretty hard for it in most cases.
I wish there was an objective way of measuring it. And in a way, there is evidence:
So, for example, I know from the on-screen display that my monitor has a horizontal refresh rate of 275000hz (275khz). I'm also pretty certain there is no sync of any kind on that refresh rate, yet nobody ever complains about tearing in the form of vertical tear lines running left to right across the screen. That's because no sync is needed: the horizontal refresh is so high, and the scan so fast, that you never see any tearing (a 60hz vertical refresh works out to roughly 67khz of horizontal refresh). Of course 67khz is still nearly 300 times higher than my 240hz vertical refresh, but you get the idea.
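The horizontal rate is just the vertical refresh times the total line count per frame (active lines plus blanking). A quick sanity check of those figures in Python - note the line counts here are assumed typical 1080p timings, not read from any specific monitor's EDID:

```python
# Horizontal scan rate = vertical refresh x total lines per frame (active + blanking).
# The total line counts below are assumed typical 1080p timings, not a specific EDID.
def h_scan_khz(vertical_hz, total_lines):
    return vertical_hz * total_lines / 1000

print(f"60hz  x 1125 lines -> {h_scan_khz(60, 1125):.1f} khz")   # ~67.5 khz for standard 1080p60
print(f"240hz x 1146 lines -> {h_scan_khz(240, 1146):.1f} khz")  # ~275 khz, roughly the OSD figure
print(f"67.5khz / 240hz = {67500 / 240:.0f}x")                   # the 'nearly 300 times' comparison
```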
OK, so do you have a game which will run constantly at 400 fps? Half-Life: Source or Halo: Combat Evolved, for example. If you run one of those at a very high frame rate, you should not be able to see any tearing, even on only 60hz, because the scan is so fast on the other end.
But it is also like this on the refresh side. If you have, say, a 480hz or 1000hz refresh rate (I know those are at the prototype stage at the moment, because I read the forums over at Blur Busters), I am confident that not even you could see tearing at that refresh rate, no matter what the frame rate was.
400FPS versus 60Hz will tear like hell - frames will almost never line up with refresh. I've seen that visibly many, many times. Unless something (like the DWM) is syncing presents to refresh, that tearing is horrendous. You can use demos like the Windmill or Pendulum to demonstrate those kinds of effects as well.
The difference, again, is that tears on a high refresh monitor aren't as persistent. If a frame tears at 240Hz, it's only visible for about 4.2ms - a quarter of the time of a frame at 60Hz, and beyond what the average human can easily notice. Some simply won't see it at all, others won't find it bothersome, and others may not consciously "see" it but still get a vague motion-sickness feeling from it.
You're probably right, though: once you hit above 300-400Hz, you won't see the tears unless the frame consistently tears in the same place for multiple frames. Very few people can pick out single frames at that speed, and it would be effectively invisible.
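For what it's worth, the persistence arithmetic is trivial to lay out - this is just my own back-of-the-envelope: a torn frame stays on screen for one refresh interval.

```python
# A single torn frame stays on screen for one refresh interval, so higher
# refresh rates make each individual tear shorter-lived.
for hz in (60, 120, 144, 240, 360):
    print(f"{hz:>3}hz: a torn frame is visible for ~{1000 / hz:.2f} ms")
# 60hz -> ~16.67 ms, 240hz -> ~4.17 ms (the quarter figure above), 360hz -> ~2.78 ms
```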
It doesn't need to sync to the refresh, though. The scan-out of frames from the GPU is so fast that there's never a big enough gap between one frame and the next to create a visible offset at the tear line.
A very high frame rate on a very low refresh is like a very high refresh with a very low frame rate. They both achieve the same effect of eliminating tearing, just from opposite ends. So 30hz at 600fps would look visibly very similar to 600hz at 30 fps.
Let's say you cap a game at 400FPS and output it with no sync onto a 60Hz panel.
There is going to be a tear about 2/3 of the way down the screen. It's unavoidable, because the GPU is pushing 6 and 2/3 frames for each refresh of the monitor. That 2/3 of a frame is always going to be visible above the remaining 1/3 of the previous frame.
Now, if you uncap the FPS, and if the FPS is inconsistent enough, that tear is going to move around the screen rapidly. Since FPS is seldom THAT inconsistent, you will usually get persistent tears that tend to wander around within a small area.
The higher your refresh rate, though, the more likely it is that the FPS will be an even divisor of, or close to a divisor of, the monitor refresh. It's a sheer numbers game. Tears will move around more, stay for fewer refresh cycles, and, due to pixel persistence, often vanish into the random noise.
I am pretty sure you are wrong here.
One refresh cycle takes ~16.7ms in 60Hz mode. 400fps locked means a new frame is ready every 2.5ms.
It doesn't mean there will be a tear 2/3 of the way down the screen, though. It means there will be multiple tears across the screen at 2.5ms intervals (when no sync is being used, the monitor shows whatever frame is currently ready). The scan goes from the top-left corner to the bottom-right corner. Assume the first rendered frame lands exactly when a new refresh cycle begins: you will see ~1/6th of that frame in the upper part of the screen, then 2.5ms later (below it) ~1/6th of another frame, then 2.5ms after that a slice of yet another frame, and so on. So in the end there will be 1000/60/2.5 (~6.6) different frames shown on the screen at any given time.
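If it helps, here's a tiny simulation of that scanout - my own sketch, assuming perfectly even 2.5ms frame pacing and a linear top-to-bottom scan, and ignoring the blanking interval:

```python
# Where tear lines land during one 60hz scanout when new frames arrive every
# 2.5ms (400 fps, no sync). Assumes perfectly even frame pacing and a linear
# top-to-bottom scan; the blanking interval is ignored.
REFRESH_MS = 1000 / 60    # one scanout takes ~16.67 ms
FRAME_MS = 1000 / 400     # a new frame is ready every 2.5 ms

tear_positions = []
t = FRAME_MS
while t < REFRESH_MS:
    tear_positions.append(t / REFRESH_MS)   # tear position as a fraction of screen height
    t += FRAME_MS

print(len(tear_positions), "tear lines in one refresh:",
      [f"{p:.0%}" for p in tear_positions])
# -> 6 tear lines at ~15%, 30%, 45%, 60%, 75%, 90% of the screen height;
#    1000/60/2.5 = ~6.7 frame intervals fit into one scanout, as described above.
```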
While a higher framerate obviously doesn't eliminate or reduce screen tearing, for many people multiple tears look and feel less striking than a single tear line at low framerates (e.g. 30fps).