If you have an Nvidia card, enable Fast Sync in the Nvidia Control Panel and disable V-Sync in-game to get a higher framerate and eliminate screen tearing.
Note: some games, such as Skyrim, need V-Sync to be on; it can be enabled on a per-game basis.
Read:
https://www.gpumag.com/what-is-vsync/
E.g. your monitor is 60 Hz, so V-Sync locks your max FPS to 60 too.
It prevents your PC from overheating.
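To put numbers on that, the frame budget under V-Sync is just 1 divided by the refresh rate. A quick sketch in Python (plain arithmetic, not tied to any particular API):

    # With V-Sync on, the display's refresh interval becomes the frame budget.
    for refresh_hz in (60, 120, 144):
        frame_time_ms = 1000 / refresh_hz
        print(f"{refresh_hz} Hz -> max {refresh_hz} FPS, ~{frame_time_ms:.1f} ms per frame")

So a 60 Hz panel gives you at most 60 FPS, with roughly 16.7 ms available per frame.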
Also, with the frame-rate-limiting options you can now find in the GeForce control panel (if you have a GeForce GPU), you can set the FPS limit lower than your monitor's refresh rate if you know your game won't always run as fast as the display can refresh.
For co-op and singleplayer games, yeah sure, enable it if you want to.
It doesn't stop your PC overheating; even at 60 FPS you can still have 80%+ load.
If you want a real FPS limiter, use RTSS, NVCP, or AMD's driver frame limiter. These all work the same way, are brilliant limiters, and will greatly improve smoothness. They should be used alongside V-Sync.
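For illustration, here is a minimal sketch in Python of what a frame limiter does under the hood -- a toy loop, not how RTSS, NVCP, or AMD's limiter is actually implemented, and render_frame is just a placeholder for whatever draws a frame:

    import time

    TARGET_FPS = 141                 # e.g. a 144 Hz panel capped a few FPS below refresh
    FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

    def limited_loop(render_frame, frames=600):
        # After rendering each frame, wait out the rest of its time slot so
        # frames are delivered at an even pace; that even pacing is the
        # "smoothness" a good limiter buys you.
        deadline = time.perf_counter()
        for _ in range(frames):
            render_frame()
            deadline += FRAME_BUDGET
            remaining = deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)  # real limiters busy-wait for more precision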
No frame rate limiter will prevent tearing.
Tearing is a lack of sync between the GPU's output and the monitor's refresh.
There are options that improve the experience of running without V-Sync, like RTSS's Scanline Sync: it uses a proper FPS limiter and lets you control where the tear line sits on the monitor. It can be combined with the program's normal frame limiter to achieve better smoothness at lower framerates.
You will still experience issues due to low FPS, improper frame pacing (micro-stuttering), and a visible tear line, but the experience will be better than with no intervention at all.
Depending on your monitor's features, you can enable it and forget it. If you have a monitor with G-Sync or FreeSync (G-Sync Compatible, VESA Adaptive-Sync), then you can turn V-Sync on with no latency increase you could feel (no one would be able to tell the difference, not even a 'super epic CS pro with a million hours'). This gives you a tear-free experience across the board. (Contrary to popular belief, G-Sync doesn't remove tearing; it only prevents a lot of it. Its main purpose is to reduce latency and micro-stutter -- when used properly, with a proper FPS limiter.)
Completely depends on your current setup.
What monitor do you have? (Does it support G-/Freesync?)
What GPU do you have? (Same as above.)
Do you have it (G-/Freesync) enabled?
Are you using an FPS limiter (like the three I mentioned above)?
Do you hate tearing, or notice large amounts of it?
If the answer is yes to all of them, then you should enable VSync.
If you are an FPS junkie, or you need high framerates for certain games -- in id Tech engine games, for example, certain FPS ranges improve the experience (run faster, shoot faster, jump sooner and higher, reload sooner, heal sooner and faster, etc.) --
then in those cases I would opt away from V-Sync just because of the competitive gains.
But that said, it depends on how competitive you are as a person.
As a general rule of thumb, more out of habit, I disable V-Sync myself. I personally never found a use for it until I researched the topic more; sometimes, rarely, I will enable it, but with G-Sync I experience so little tearing that it's a non-issue for me. And I have a (proper) FPS limiter in place, so I don't rely on V-Sync to stay below the G-Sync threshold (1-2 FPS below your refresh rate).
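To make that threshold concrete, here's the arithmetic using the 1-2 FPS rule of thumb above (the exact margin varies by tool; 2 is my assumption here):

    # Cap slightly below refresh so the framerate never leaves the
    # G-Sync/FreeSync range and V-Sync never has to engage.
    def gsync_fps_cap(refresh_hz, margin=2):  # margin per the "1-2 FPS" rule above
        return refresh_hz - margin

    for hz in (60, 144, 165):
        print(f"{hz} Hz -> cap at {gsync_fps_cap(hz)} FPS")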
The only time I avoid this FPS limit is in certain shooter games -- CSGO, CoD BO2, and, if I'm feeling it, Quake -- since I can easily get above 165 FPS in the former two, and 1,000 FPS in the latter.
Though one could make the 'draws more power' argument, which is fair. Electricity costs money.
Here's a hypothesis.
Compare leaving your computer rendering something at a 'locked' frame rate versus rendering at full load.
I'd think that one of the computer's components would burn out earlier at full load than when the framerate is locked or V-Sync is enabled. That could even be your monitor.
On the other hand, I see what you mean, and I like that you added the argument about it costing more money. It's not changing the subject; it's a valid remark to add to the comparison.
From a technical standpoint, things that burn twice as bright burn twice as fast, no?
This was my point. I mean, you could argue against it, but what's the use? It's going to burn out eventually, right? Whatever it is?
You aren't going to see hardware break from being used at 75% load compared to 50%. Every piece of hardware inside a GPU (and most other parts) is overbuilt compared to what it needs to do; it can handle more than what you will use it for. There are some exceptions (low-end motherboards), but all GPUs need to be built to a manufacturer specification and meet certain requirements, and those will be more than any one person needs.
There are also lots of smart sensors in hardware these days for preventing damage.
The main cause of dying components, provided they aren't caked in dust their entire life, is thermal cycling.
The die of the CPU/GPU heats up and cools down, over and over. This causes cracks so tiny and minor that they take years, decades even, to manifest.
There are faster ways to kill a part, but if you keep it dust-free, don't throw it around, and keep it somewhat free of static, it will last 10+ years easily. By which point it will be obsolete.
Monitors don't really go bad either; a decade can pass before any issue is apparent, if you don't mistreat them.
It does somewhat depend on the monitor, but tearing affects all monitors; what matters is the FPS range you're at and how well 'adapted' you are to noticing it.
Anecdotal, but before I got my 144 Hz monitor I couldn't see any tearing; I thought it wasn't an issue. After I got the 144 Hz, I never saw it either. But a short while later I got that 60 Hz monitor out and used it for a while, and it had so much tearing it was crazy. I checked my settings -- everything was the same -- and it was just awful.
I can see tearing on my 144 Hz monitor in some cases too.
The idea that hardware used to die, but now just seems more durable for some reason -- if it's designed that way, I can see how and why that's the case. It just works.
Great. (Thanks for the comments, Autumn.)
I also really mean SHOULD, not that it WILL.