737382828299338 22 May 2018 at 21:09
How do I know if G-Sync is working?
Just got my new G-Sync monitor, launched Skyrim, and the first thing I noticed is my FPS wasn't capped. I checked my NVIDIA Control Panel and saw that G-Sync had a check mark and was set to work in fullscreen applications, or something like that. I didn't see anything in-game that looked any smoother than my old 144 Hz FreeSync monitor. Would I even notice if G-Sync was working? For the amount of money I spent on a G-Sync monitor, I'm not satisfied so far. I'm happy with the 1440p, 27-inch panel, but I haven't noticed much improvement in how smoothly games run with the G-Sync feature, not even in PUBG. Again, would I notice a difference, or am I doing something wrong? Is it most likely just not working, and am I missing something? (And yes, I am using a DisplayPort cable.)
Last edited by 737382828299338; 22 May 2018 at 21:17
Showing 1-15 of 19 comments
Revelene 22 May 2018 at 21:48
Frame rates don't get capped with G-Sync. The refresh rate is synced to the frame rate, allowing for a variable refresh rate.

Some monitors have a light indicator, some don't. Some have a refresh rate counter you can turn on in the monitor OSD. If it is working, your monitor's refresh rate will fluctuate along with the frame rate.

Do note that G-Sync only works within the range of your monitor's max refresh rate. Once the frame rate goes over your max refresh rate, you can either let it run unhindered or enable V-Sync (or an alternative sync).

Also, on some monitors you have to enable G-Sync in the monitor OSD as well.
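The in-range behavior described above can be sketched as a tiny model (the function name and the 30-144 Hz window are illustrative assumptions, not anything NVIDIA ships):

```python
def effective_refresh_hz(fps: float, vrr_min: float = 30.0, vrr_max: float = 144.0) -> float:
    """Toy model of variable refresh rate (VRR).

    Inside the VRR window the panel refreshes at exactly the frame rate;
    outside it, refresh is pinned to the nearest bound (what you actually
    see above the cap depends on your V-Sync / Fast Sync settings).
    """
    if fps > vrr_max:
        return vrr_max  # above range: G-Sync disengages, panel runs at max
    if fps < vrr_min:
        return vrr_min  # below range: frame doubling (LFC) or fixed minimum
    return fps          # in range: refresh tracks frame rate one-to-one

print(effective_refresh_hz(90.0))   # in range: panel refreshes at 90 Hz
print(effective_refresh_hz(200.0))  # out of range: pinned at 144 Hz
```

This is why the OSD refresh rate counter is a good check: while you're inside the range, that number should move together with your FPS counter.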
737382828299338 22 May 2018 at 21:55
Originally posted by Revelene:
Frame rates don't get capped with G-Sync. The refresh rate is synced to the frame rate, allowing for a variable refresh rate.

Some monitors have a light indicator, some don't. Some have a refresh rate counter you can turn on in the monitor OSD. If it is working, your monitor's refresh rate will fluctuate along with the frame rate.

Do note that G-Sync only works within the range of your monitor's max refresh rate. Once the frame rate goes over your max refresh rate, you can either let it run unhindered or enable V-Sync (or an alternative sync).

Also, on some monitors you have to enable G-Sync in the monitor OSD as well.
I see. I have a 165 Hz monitor but have it set to 144 Hz at the moment, and my frame rate was in the 200s at some points. I think I might have been in fullscreen (windowed) mode, because it looked fullscreen but my frame rate was over my monitor's maximum refresh rate. I'm just going to change the G-Sync setting from fullscreen only to fullscreen and windowed mode and see if that fixes it. This may explain why I didn't notice much improvement.
Andrius227 23 May 2018 at 1:58
Also, I don't think you would notice a difference between FreeSync and G-Sync. I've never had FreeSync, but I hear it's really similar.
Arya 23 May 2018 at 2:56
Most G-Sync monitors have an indicator light that shows whether G-Sync is active. The G-Sync module itself produces quite a bit of heat; if you rest your hand on the back of the monitor, you should be able to feel it.
Last edited by Arya; 23 May 2018 at 2:57
Revelene 23 May 2018 at 11:17
Originally posted by ETAWN:
Originally posted by Revelene:
Frame rates don't get capped with G-Sync. The refresh rate is synced to the frame rate, allowing for a variable refresh rate.

Some monitors have a light indicator, some don't. Some have a refresh rate counter you can turn on in the monitor OSD. If it is working, your monitor's refresh rate will fluctuate along with the frame rate.

Do note that G-Sync only works within the range of your monitor's max refresh rate. Once the frame rate goes over your max refresh rate, you can either let it run unhindered or enable V-Sync (or an alternative sync).

Also, on some monitors you have to enable G-Sync in the monitor OSD as well.
I see. I have a 165 Hz monitor but have it set to 144 Hz at the moment, and my frame rate was in the 200s at some points. I think I might have been in fullscreen (windowed) mode, because it looked fullscreen but my frame rate was over my monitor's maximum refresh rate. I'm just going to change the G-Sync setting from fullscreen only to fullscreen and windowed mode and see if that fixes it. This may explain why I didn't notice much improvement.

Why have you not set it to 165 Hz? Go ahead and set that. It has no impact on the panel, as it is designed to run at 165 Hz.

If your frame rate is in the 200s, then it is out of range and G-Sync will not work. For situations like this, you can use V-Sync or an alternative, like Fast Sync.
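Another way to keep G-Sync engaged is to hold the frame rate just below the monitor's maximum. A minimal sleep-based limiter sketch (the `render_frame` callback and the 141 Hz cap are hypothetical; in practice you would use the in-game or driver frame limiter):

```python
import time

def run_capped(render_frame, cap_hz: float = 141.0, frames: int = 100) -> None:
    """Crude frame limiter: after each frame, sleep out the rest of the
    frame budget so the frame rate never exceeds cap_hz (e.g. 141 on a
    144 Hz panel, staying inside the G-Sync range)."""
    budget = 1.0 / cap_hz  # seconds available per frame at the cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the actual game/render work goes here
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle away the leftover budget
```

`time.sleep` is coarse; real limiters spin-wait the last fraction of a millisecond, but the idea is the same.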

Originally posted by Andrius227:
Also, I don't think you would notice a difference between FreeSync and G-Sync. I've never had FreeSync, but I hear it's really similar.

The way they handle it is different, with G-Sync being best as it is handled by a dedicated chipset. FreeSync is also prone to many issues and can impact performance, since it is handled as an extra load on user-end hardware. However, when working properly, both are near identical in performance as far as eliminating screen tearing and the amount of added latency.
John Doe 23 May 2018 at 12:01
Originally posted by Revelene:
The way they handle it is different, with G-Sync being best as it is handled by a dedicated chipset. FreeSync is also prone to many issues and can impact performance, since it is handled as an extra load on user-end hardware. However, when working properly, both are near identical in performance as far as eliminating screen tearing and the amount of added latency.

What Andrius said is right, and I've never heard of a single issue with FreeSync in my life. You're spreading nonsense. HardOCP did a blind test with Asus FreeSync and Asus G-Sync monitors and 10 gamers, and it came out about 5 to 5. Unless there is a test showing the G-Sync module's superiority over FreeSync, what you said is simply not true.

There is also another limitation: G-Sync does not support HDR concurrently. FreeSync 2 does, and Kyle and the crew actually prefer it to G-Sync. I believe G-Sync limits 10-bit panels to 8-bit, too. Pretty stupid on Nvidia's end.

There is no point in moving from a FreeSync monitor to a G-Sync one unless the latter monitor is superior OUTSIDE of sync-technology-based factors.
Revelene 23 May 2018 at 12:15
Originally posted by John Doe:
Originally posted by Revelene:
The way they handle it is different, with G-Sync being best as it is handled by a dedicated chipset. FreeSync is also prone to many issues and can impact performance, since it is handled as an extra load on user-end hardware. However, when working properly, both are near identical in performance as far as eliminating screen tearing and the amount of added latency.

What Andrius said is right, and I've never heard of a single issue with FreeSync in my life. You're spreading nonsense. HardOCP did a blind test with Asus FreeSync and Asus G-Sync monitors and 10 gamers, and it came out about 5 to 5. Unless there is a test showing the G-Sync module's superiority over FreeSync, what you said is simply not true.

There is also another limitation: G-Sync does not support HDR concurrently. FreeSync 2 does, and Kyle and the crew actually prefer it to G-Sync. I believe G-Sync limits 10-bit panels to 8-bit, too. Pretty stupid on Nvidia's end.

There is no point in moving from a FreeSync monitor to a G-Sync one unless the latter monitor is superior OUTSIDE of sync-technology-based factors.

Your experience is anecdotal. However, I did not mean that FreeSync always has issues. I was merely stating that the tech behind it is inferior to a dedicated chipset. With G-Sync, the load is off your PC. That eliminates many problematic factors, especially for a PC that already has a lot of its resources used.

I also already stated that end-user performance is near identical; did you miss that part?

G-Sync "2.0" will support HDR, just like FreeSync 2.0 does. That is a non-issue at this point, considering the market doesn't exactly have HDR readily available for the masses just yet.

I also never stated that one should move from a FreeSync monitor to a G-Sync monitor. In fact, I believe one should only worry about which to use based on the GPU they have. I don't believe G-Sync vs FreeSync should even be a factor when choosing a GPU, either.

You are assuming, possibly using me as a surrogate for an argument you had with someone else, just because I stated the technology is better. It may be hard to swallow, but it is, technologically speaking. However, like I said before, the end-user experience is near identical, at least as far as eliminating tearing goes.

With G-Sync, added input latency is usually lower than with FreeSync. This is because G-Sync is a dedicated chipset and is tuned as such. FreeSync relies on the monitor's implementation, so your results may vary. Even some FreeSync features depend on the monitor, with some only available on higher-end FreeSync monitors.

Perhaps you should give this a read: https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

This video shows you the differences in features, too: https://youtu.be/oVheiHBWYrE
Last edited by Revelene; 23 May 2018 at 12:24
John Doe 23 May 2018 at 12:28
Originally posted by Revelene:
Originally posted by John Doe:

What Andrius said is right, and I've never heard of a single issue with FreeSync in my life. You're spreading nonsense. HardOCP did a blind test with Asus FreeSync and Asus G-Sync monitors and 10 gamers, and it came out about 5 to 5. Unless there is a test showing the G-Sync module's superiority over FreeSync, what you said is simply not true.

There is also another limitation: G-Sync does not support HDR concurrently. FreeSync 2 does, and Kyle and the crew actually prefer it to G-Sync. I believe G-Sync limits 10-bit panels to 8-bit, too. Pretty stupid on Nvidia's end.

There is no point in moving from a FreeSync monitor to a G-Sync one unless the latter monitor is superior OUTSIDE of sync-technology-based factors.

Your experience is anecdotal. However, I did not mean that FreeSync always has issues. I was merely stating that the tech behind it is inferior to a dedicated chipset. With G-Sync, the load is off your PC. That eliminates many problematic factors, especially for a PC that already has a lot of its resources used.

I also already stated that end-user performance is near identical; did you miss that part?

G-Sync "2.0" will support HDR, just like FreeSync 2.0 does. That is a non-issue at this point, considering the market doesn't exactly have HDR readily available for the masses just yet.

I also never stated that one should move from a FreeSync monitor to a G-Sync monitor. In fact, I believe one should only worry about which to use based on the GPU they have. I don't believe G-Sync vs FreeSync should even be a factor when choosing a GPU, either.

You are assuming all of this just because I stated the technology is better. It may be hard to swallow, but it is, technologically speaking. However, like I said before, the end-user experience is near identical.

With G-Sync, added input latency is usually lower than with FreeSync. This is because G-Sync is a dedicated chipset and is tuned as such. FreeSync relies on the monitor's implementation, so your results may vary.

Perhaps you should give this a read: https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

I think you should take your own advice and do some reading on these things. You tried to point out that FreeSync is "prone to many issues"; I have never read of a single issue with it. That is just something completely theoretical you made up because the monitors don't carry a little chip. Please tell me some things about these PCs that "already have a lot of their resources used", because FreeSync does NOT create significant overhead. Period.

Currently, FreeSync supports HDR and G-Sync does not. So it is what it is. As for HDR not being available to the masses, you're wrong: Samsung already sells good HDR panels around the $400 price level.

I didn't say you stated that one should move from FreeSync to G-Sync. That statement was directed at the OP. I do, however, believe it should be a factor when picking a GPU.

I'm not assuming anything; I also said G-Sync prevents 10-bit from working. That is nonsensical and, to my knowledge, does not happen with AMD FreeSync. You want 10-bit on a 10-bit monitor for a wider color range and less banding. If you're like me and idle more than you game, you want 10-bit working for higher image quality, not hardware-based frame sync, which is almost useless outside gaming. Heck, some games even render in 10-bit.

As for the input latency difference, it's minuscule, so it's not something to be bothered about.

Back on topic: the OP can see whether G-Sync is working in 3DMark. It tells you at the end of the test.
Revelene 23 May 2018 at 13:01
Originally posted by John Doe:
Originally posted by Revelene:

Your experience is anecdotal. However, I did not mean that FreeSync always has issues. I was merely stating that the tech behind it is inferior to a dedicated chipset. With G-Sync, the load is off your PC. That eliminates many problematic factors, especially for a PC that already has a lot of its resources used.

I also already stated that end-user performance is near identical; did you miss that part?

G-Sync "2.0" will support HDR, just like FreeSync 2.0 does. That is a non-issue at this point, considering the market doesn't exactly have HDR readily available for the masses just yet.

I also never stated that one should move from a FreeSync monitor to a G-Sync monitor. In fact, I believe one should only worry about which to use based on the GPU they have. I don't believe G-Sync vs FreeSync should even be a factor when choosing a GPU, either.

You are assuming all of this just because I stated the technology is better. It may be hard to swallow, but it is, technologically speaking. However, like I said before, the end-user experience is near identical.

With G-Sync, added input latency is usually lower than with FreeSync. This is because G-Sync is a dedicated chipset and is tuned as such. FreeSync relies on the monitor's implementation, so your results may vary.

Perhaps you should give this a read: https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

I think you should take your own advice and do some reading on these things. You tried to point out that FreeSync is "prone to many issues"; I have never read of a single issue with it. That is just something completely theoretical you made up because the monitors don't carry a little chip. Please tell me some things about these PCs that "already have a lot of their resources used", because FreeSync does NOT create significant overhead. Period.

Currently, FreeSync supports HDR and G-Sync does not. So it is what it is. As for HDR not being available to the masses, you're wrong: Samsung already sells good HDR panels around the $400 price level.

I didn't say you stated that one should move from FreeSync to G-Sync. That statement was directed at the OP. I do, however, believe it should be a factor when picking a GPU.

I'm not assuming anything; I also said G-Sync prevents 10-bit from working. That is nonsensical and, to my knowledge, does not happen with AMD FreeSync. You want 10-bit on a 10-bit monitor for a wider color range and less banding. If you're like me and idle more than you game, you want 10-bit working for higher image quality, not hardware-based frame sync, which is almost useless outside gaming. Heck, some games even render in 10-bit.

As for the input latency difference, it's minuscule, so it's not something to be bothered about.

Back on topic: the OP can see whether G-Sync is working in 3DMark. It tells you at the end of the test.

Oh, really? Never heard of issues with FreeSync? Perhaps you should Google "issues with FreeSync", and also read up on what anecdotal means.

The point about the load on the user end was not about overhead, but rather that it adds an extra layer where things can fail. If you don't believe there are issues with it, the most common being flicker and tearing, then go Google it yourself. Many of the issues come from the cheaper monitors. The higher-end ones work better and also have more of the FreeSync feature set.

G-Sync "2.0" is already complete. We are just waiting on monitors that actually support HDR. Not exactly for the masses right now, as the choice of HDR-capable monitors is very slim. That is going to change soon, though, along with G-Sync support for HDR.

FreeSync monitors around the $400 price point do not have all the features of FreeSync, either.

Input latency can be noticeable, depending on the user. Plenty of people are more sensitive to input latency than others. Again, your experience is anecdotal.

And yet again, your "idle experience" is merely anecdotal.

You are completely missing my point. I was merely being informational, as the G-Sync tech is more involved, and more consistent, than FreeSync. I never once said that FreeSync was bad, nor did I say anyone should jump ship to G-Sync. You added that to the discussion, even though that was never what it was about.
Last edited by Revelene; 23 May 2018 at 13:04
John Doe 23 May 2018 at 13:12
Originally posted by Revelene:
Oh, really? Never heard of issues with FreeSync? Perhaps you should Google "issues with FreeSync", and also read up on what anecdotal means.

Many of the issues come from the cheaper monitors. The higher-end ones work better and have more of the FreeSync feature set.

G-Sync "2.0" is already complete. We are just waiting on monitors that actually support HDR. Not exactly for the masses right now, as the choice of HDR-capable monitors is very slim. That is going to change soon, though, along with G-Sync support for HDR.

FreeSync monitors around the $400 price point do not have all the features of FreeSync, either.

Input latency can be noticeable, depending on the user. Plenty of people are more sensitive to input latency than others. Again, your experience is anecdotal.

And yet again, your "idle experience" is merely anecdotal.

You are completely missing my point. I was merely being informational, as the G-Sync tech is more involved, and more consistent, than FreeSync. I never once said that FreeSync was bad, nor did I say anyone should jump ship to G-Sync. You added that to the discussion, even though that was never what it was about.

Yes, really. Googling "issues with FreeSync" is about as relevant as Googling "issues with my back" or "issues with my teeth". That is the most nonsensical statement you've made so far, and it's hilarious. You can put "issues" next to anything and Google it to read about problems with stuff. That means nothing, null, zero.

If you buy "cheaper monitors" with G-Sync, you get TN panels and nothing else, and if you think a TN panel is better than a VA equivalent that supports FreeSync at around the same price tag, that would be just as much laughing-stock material as your Googling statement.

FALD is expensive and not getting cheap anytime soon. You can tell by looking at the prices of the 32-inch Asus panel and the 27-inch Predator X27. So your hoped-for "1000-nit HDR" is not coming "soon". You have to stick with the lower-brightness panels for now.

Input latency differs from one monitor to another. Your statements are irrelevant, as they're not based on anything but "this can be that, that can be this", while mine are factual, based on things like 10-bit monitors running at 8-bit for no good reason.

At the end of the day, FreeSync vs G-Sync comes down to needs and personal choice, so no, one is not better than the other by a significant margin. Yes, G-Sync is likely to perform A LITTLE better because of the module, but that is offset by the other things I mentioned, such as the ability to run 10-bit.

OP, Blur Busters is a good site for checking your monitor's sync performance.
Revelene 23 May 2018 at 13:36
Originally posted by John Doe:
Originally posted by Revelene:
Oh, really? Never heard of issues with FreeSync? Perhaps you should Google "issues with FreeSync", and also read up on what anecdotal means.

Many of the issues come from the cheaper monitors. The higher-end ones work better and have more of the FreeSync feature set.

G-Sync "2.0" is already complete. We are just waiting on monitors that actually support HDR. Not exactly for the masses right now, as the choice of HDR-capable monitors is very slim. That is going to change soon, though, along with G-Sync support for HDR.

FreeSync monitors around the $400 price point do not have all the features of FreeSync, either.

Input latency can be noticeable, depending on the user. Plenty of people are more sensitive to input latency than others. Again, your experience is anecdotal.

And yet again, your "idle experience" is merely anecdotal.

You are completely missing my point. I was merely being informational, as the G-Sync tech is more involved, and more consistent, than FreeSync. I never once said that FreeSync was bad, nor did I say anyone should jump ship to G-Sync. You added that to the discussion, even though that was never what it was about.

Yes, really. Googling "issues with FreeSync" is about as relevant as Googling "issues with my back" or "issues with my teeth". That is the most nonsensical statement you've made so far, and it's hilarious. You can put "issues" next to anything and Google it to read about problems with stuff. That means nothing, null, zero.

If you buy "cheaper monitors" with G-Sync, you get TN panels and nothing else, and if you think a TN panel is better than a VA equivalent that supports FreeSync at around the same price tag, that would be just as much laughing-stock material as your Googling statement.

FALD is expensive and not getting cheap anytime soon. You can tell by looking at the prices of the 32-inch Asus panel and the 27-inch Predator X27. So your hoped-for "1000-nit HDR" is not coming "soon". You have to stick with the lower-brightness panels for now.

Input latency differs from one monitor to another. Your statements are irrelevant, as they're not based on anything but "this can be that, that can be this", while mine are factual, based on things like 10-bit monitors running at 8-bit for no good reason.

At the end of the day, FreeSync vs G-Sync comes down to needs and personal choice, so no, one is not better than the other by a significant margin. Yes, G-Sync is likely to perform A LITTLE better because of the module, but that is offset by the other things I mentioned, such as the ability to run 10-bit.

OP, Blur Busters is a good site for checking your monitor's sync performance.

These issues are relevant, but you seem rather defensive for some reason. FreeSync, especially on cheaper monitors, is prone to issues where FreeSync is "working" but there is still tearing. Flickering is another issue. This is well known in the AMD community.

As far as price point goes, yeah, FreeSync is a better deal. A lot cheaper. And no, I don't believe that. You are fishing for something to belittle me with.

G-Sync is more expensive than FreeSync. I do not contest that. However, G-Sync "2.0" is indeed coming soon, once more HDR-capable monitors are readily available to the masses.

I was talking about the input latency added by G-Sync and FreeSync. G-Sync always adds the same latency, as it is the same chipset everywhere. With FreeSync, how much latency is added depends heavily on the panel. I was not talking about the latency of the monitor itself, but the latency of the G-Sync or FreeSync implementation.

And it comes back to what I said in my initial comment... at the end of the day, they are more or less the same. I never said one was significantly better; I talked about the technology. There is a difference. Clearly you are upset over something, and I do not know what, but I'm NOT saying that anyone should switch or anything of the sort.

I look at things for their technological value, not anecdotal evidence. G-Sync, being a dedicated chipset, has more potential and already leads in most categories. Sure, current G-Sync is limited to 8-bit, but that is a non-issue for most gamers. HDR is still in its infancy for desktop monitors and will be for a while yet.

But again, for clarification... I am NOT saying that anyone should switch for any reason. I'm merely here for informative reasons. It is always good to know your tech.

For the record, I have a couple of different G-Sync and FreeSync monitors. I like them all. Pointless to try to get me to like FreeSync... as I already like it. lol
Last edited by Revelene; 23 May 2018 at 13:40
John Doe 23 May 2018 at 13:51
Originally posted by Revelene:
These issues are relevant, but you seem rather defensive for some reason. FreeSync, especially on cheaper monitors, is prone to issues where FreeSync is "working" but there is still tearing. Flickering is another issue. This is well known in the AMD community.

As far as price point goes, yeah, FreeSync is a better deal. A lot cheaper. And no, I don't believe that. You are fishing for something to belittle me with.

G-Sync is more expensive than FreeSync. I do not contest that. However, G-Sync "2.0" is indeed coming soon, once more HDR-capable monitors are readily available to the masses.

I was talking about the input latency added by G-Sync and FreeSync. G-Sync always adds the same latency, as it is the same chipset everywhere. With FreeSync, how much latency is added depends heavily on the panel. I was not talking about the latency of the monitor itself, but the latency of the G-Sync or FreeSync implementation.

And it comes back to what I said in my initial comment... at the end of the day, they are more or less the same. I never said one was significantly better; I talked about the technology. There is a difference. Clearly you are upset over something, and I do not know what, but I'm NOT saying that anyone should switch or anything of the sort.

I look at things for their technological value, not anecdotal evidence. G-Sync, being a dedicated chipset, has more potential and already leads in most categories. Sure, current G-Sync is limited to 8-bit, but that is a non-issue for most gamers. HDR is still in its infancy for desktop monitors and will be for a while yet.

But again, for clarification... I am NOT saying that anyone should switch for any reason. I'm merely here for informative reasons. It is always good to know your tech.

No, they're not. If you buy cheap, expect cheap things to happen. You should buy a certain level of quality even on a limited budget, which would lead you to the smaller LG IPS or the Samsung monitors.

I'm not fishing for something to belittle you with; I'm simply stating the obvious: you need to spend at least $600 to get an IPS G-Sync panel. If you already know this, then why do you insist on pointing out other things?

Most of the good FreeSync monitors don't have A LOT of latency added by FreeSync. In fact, the Samsung monitors come with very little latency, as their input lag is low and GTG is 1 ms despite being VA.

Yes, there is a difference in the technology. You pay a large premium for Nvidia's module while not getting your entire money's worth back, at least in the $250-300 range of monitors.

"Gaming" is not everything, and HDR is already something if you watch movies on your monitor. 10-bit shines in everyday Windows desktop use; I can notice the difference on my monitor between 8-bit and 10-bit in Corsair Link and Steam windows. The colors are better.
Revelene 23 May 2018 at 14:25
Originally posted by John Doe:
Originally posted by Revelene:
These issues are relevant, but you seem rather defensive for some reason. FreeSync, especially on cheaper monitors, is prone to issues where FreeSync is "working" but there is still tearing. Flickering is another issue. This is well known in the AMD community.

As far as price point goes, yeah, FreeSync is a better deal. A lot cheaper. And no, I don't believe that. You are fishing for something to belittle me with.

G-Sync is more expensive than FreeSync. I do not contest that. However, G-Sync "2.0" is indeed coming soon, once more HDR-capable monitors are readily available to the masses.

I was talking about the input latency added by G-Sync and FreeSync. G-Sync always adds the same latency, as it is the same chipset everywhere. With FreeSync, how much latency is added depends heavily on the panel. I was not talking about the latency of the monitor itself, but the latency of the G-Sync or FreeSync implementation.

And it comes back to what I said in my initial comment... at the end of the day, they are more or less the same. I never said one was significantly better; I talked about the technology. There is a difference. Clearly you are upset over something, and I do not know what, but I'm NOT saying that anyone should switch or anything of the sort.

I look at things for their technological value, not anecdotal evidence. G-Sync, being a dedicated chipset, has more potential and already leads in most categories. Sure, current G-Sync is limited to 8-bit, but that is a non-issue for most gamers. HDR is still in its infancy for desktop monitors and will be for a while yet.

But again, for clarification... I am NOT saying that anyone should switch for any reason. I'm merely here for informative reasons. It is always good to know your tech.

No, they're not. If you buy cheap, expect cheap things to happen. You should buy a certain level of quality even on a limited budget, which would lead you to the smaller LG IPS or the Samsung monitors.

I'm not fishing for something to belittle you with; I'm simply stating the obvious: you need to spend at least $600 to get an IPS G-Sync panel. If you already know this, then why do you insist on pointing out other things?

Most of the good FreeSync monitors don't have A LOT of latency added by FreeSync. In fact, the Samsung monitors come with very little latency, as their input lag is low and GTG is 1 ms despite being VA.

Yes, there is a difference in the technology. You pay a large premium for Nvidia's module while not getting your entire money's worth back, at least in the $250-300 range of monitors.

"Gaming" is not everything, and HDR is already something if you watch movies on your monitor. 10-bit shines in everyday Windows desktop use; I can notice the difference on my monitor between 8-bit and 10-bit in Corsair Link and Steam windows. The colors are better.

What are you going on about? "No, they're not" about what? What I said about pricing was that G-Sync is more expensive, and that is a fact. I also stated that cheaper FreeSync panels don't have the full feature set FreeSync has to offer, which is also a fact.

Why do I insist on pointing out other things? Because, as I have said many times, I'm stating information. You take it as you will, and clearly you have taken offense.

Value is completely dependent on the individual.

Gaming may not be everything, but have you forgotten what forum you are on?

Clearly you are in favor of FreeSync, and that is fine. In fact, I like the open aspect of FreeSync and hate how closed-off G-Sync is. FreeSync would work on Nvidia GPUs, but Nvidia cares more about protecting its proprietary method than about giving people more reasons to pick G-Sync over FreeSync (AMD's mindset), so Nvidia has not added FreeSync support and probably never will.

Again, all of this is informational, and you can take it however you please. But do not try to make me out to be someone trying to get people to switch, or whatever is going on in your head. Informational, that is it. Did you look at the stuff I linked? Nice and informative.
Last edited by Revelene; 23 May 2018 at 14:26
TehSpoopyKitteh 23 May 2018 at 14:30
Originally posted by Revelene:
Frame rates don't get capped with G-Sync. The refresh rate is synced to the frame rate, allowing for a variable refresh rate.

Some monitors have a light indicator, some don't. Some have a refresh rate counter you can turn on in the monitor OSD. If it is working, your monitor's refresh rate will fluctuate along with the frame rate.

Do note that G-Sync only works within the range of your monitor's max refresh rate. Once the frame rate goes over your max refresh rate, you can either let it run unhindered or enable V-Sync (or an alternative sync).

Also, on some monitors you have to enable G-Sync in the monitor OSD as well.
In short, it's working if you have it enabled and there is no screen tearing regardless of frame rate. Granted, you would need a monitor that supports G-Sync in the first place.
Last edited by TehSpoopyKitteh; 23 May 2018 at 14:43
John Doe 23 May 2018 at 14:35
The problems wouldn't exist if you bought a decent monitor. That's what I said "no, they're not" about; it was an answer to your first sentence. Yes, you got the fact about pricing right. Yes, cheap FreeSync won't have the sync range an expensive one might have, but then again, you should not buy cheap. That's what it is.

I have not forgotten what forum I'm on, but HDR works in some games too.

I'm not in favor of either; I'd buy whatever works with my hardware.

No, I didn't check out the video. I don't really watch that guy's videos.
Last edited by John Doe; 23 May 2018 at 14:36