How do I know if G-Sync is working?
I just got my new G-Sync monitor, launched Skyrim, and the first thing I noticed is that my FPS wasn't capped. I checked the NVIDIA Control Panel and saw that G-Sync had a check mark and was set to work in full-screen applications (or something like that). Nothing in the game looked any smoother than on my old 144 Hz FreeSync monitor. I'm wondering whether I would even notice if G-Sync was working, because for the amount of money I spent on a G-Sync monitor I'm not satisfied so far. I'm happy with the 1440p, 27-inch panel itself, but I haven't noticed much improvement in how smoothly games run from the G-Sync feature, not even in PUBG. So, would I notice a difference, or am I doing something wrong? Is it most likely just not working, or am I missing something? (And yes, I am using a DisplayPort cable.)
Last edited by 737382828299338; 22 May 2018 @ 9:17pm
The Spoopy Kitteh wrote:
Revelene wrote:
Frame rates don't get capped with Gsync. The refresh rate will be synced with the frame rate, allowing for a variable refresh rate.

Some monitors have a light indicator, some don't. Some have a refresh rate counter you can turn on in the monitor OSD. If it is working, your monitor's refresh rate will fluctuate to match the frame rate.

Do note that Gsync only works within your monitor's refresh rate range. Once the frame rate goes over your max refresh rate, you can either let it run unhindered or enable vsync (or an alternative sync).

Also, on some monitors, you have to enable Gsync in the monitor OSD as well.
In short, if you have it enabled and there is no screen tearing regardless of frame rate, it's working. Granted, you have to have a monitor that supports GSYNC in the first place.

I don't believe the driver even lets you enable Gsync in the Nvidia control panel without a Gsync monitor, but that is a good point to add. Definitely need a Gsync panel to use Gsync, lol.
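To illustrate the quoted explanation above, here is a minimal sketch (in Python, assuming a 30-144 Hz VRR window; purely illustrative, not NVIDIA's actual driver logic) of how a G-Sync panel's refresh rate tracks the frame rate:

[code]
# Minimal sketch, assuming a 30-144 Hz VRR window. Illustrative only,
# not NVIDIA's actual driver logic.

VRR_MIN_HZ = 30.0    # assumed bottom of the monitor's VRR window
VRR_MAX_HZ = 144.0   # assumed maximum refresh rate

def panel_refresh_hz(frame_rate_hz):
    """Approximate refresh rate a VRR panel runs at for a given frame rate."""
    if frame_rate_hz >= VRR_MAX_HZ:
        # Above the window the panel is pinned at max refresh; frames either
        # tear or wait for vsync, depending on the driver setting.
        return VRR_MAX_HZ
    if frame_rate_hz < VRR_MIN_HZ:
        # Below the window, drivers typically repeat frames (low framerate
        # compensation) so the panel stays in range, e.g. 25 fps -> 50 Hz.
        repeats = 2
        while frame_rate_hz * repeats < VRR_MIN_HZ:
            repeats += 1
        return frame_rate_hz * repeats
    # Inside the window, refresh simply follows frame rate -- this is the
    # fluctuating value a monitor's OSD refresh-rate counter would show.
    return frame_rate_hz

for fps in (24, 60, 100, 143, 200):
    print(f"{fps:>3} fps -> panel refresh ~ {panel_refresh_hz(fps):.0f} Hz")
[/code]

If the monitor has an OSD refresh-rate counter, that fluctuating value is the quickest confirmation that G-Sync is actually engaged.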

John Doe wrote:
The problems wouldn't exist if you bought a decent monitor. That's what I said "no, they're not" about; it was an answer to the first sentence. Yes, you got the fact about pricing right. Yes, a cheap Freesync monitor won't have the sync range an expensive one might have, but then again you should not buy cheap. That's what it is.

I have not forgotten what forum I'm on, but HDR works on some games too.

I'm not in favor of either, I'd buy what my hardware would work with.

No, I didn't check out the video. I don't really watch that guy's videos.

That is the problem: a lot of people go for cheap Freesync monitors. For someone like you, who doesn't buy cheap, this may be a non-issue... but for the people who do, my information may help them not go for a cheap monitor.

Yes, it does work in some games, but that goes back to my point about HDR not quite being there for the masses. As far as gaming goes, there is little point to HDR right now.

Well, you say you are not in favor of them, but you are acting rather defensive.

I had two links: not just the video, but also the rtings article. You may not "watch that guy's videos," but he goes over the feature sets for both Gsync and Freesync, and also explains that not all of the Freesync features are available on all monitors. With Gsync, however, since it is a dedicated chipset, you get the full feature set.

At this point, I don't understand what you are trying to debate.
John Doe wrote:
The problems wouldn't exist if you bought a decent monitor. That's what I said "no, they're not" about; it was an answer to the first sentence. Yes, you got the fact about pricing right. Yes, a cheap Freesync monitor won't have the sync range an expensive one might have, but then again you should not buy cheap. That's what it is.

I have not forgotten what forum I'm on, but HDR works on some games too.

I'm not in favor of either, I'd buy what my hardware would work with.

No, I didn't check out the video. I don't really watch that guy's videos.
HDR simulates how eyes naturally adjust to lighting conditions. This can be done via hardware and software: hardware adjusts pixel-specific chroma and brightness, and software controls it. All monitors that use DP, DVI-I or DVI-D, HDMI, or other digital display formats are capable of HDR.
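As a rough illustration of the software side of that (the "eyes adjusting" part), here is a minimal sketch of auto-exposure plus Reinhard tone mapping; the luminance samples are assumed values, and real engines and HDR panels do considerably more:

[code]
# Rough sketch of software "HDR": pick an exposure from the scene's average
# brightness (like eyes adapting), then tone-map to the 0..1 range a standard
# panel can display. The sample luminances below are assumed values.

import math

def adapted_exposure(luminances, key=0.18):
    """Exposure derived from the log-average scene luminance (eye adaptation)."""
    log_avg = math.exp(sum(math.log(max(l, 1e-4)) for l in luminances) / len(luminances))
    return key / log_avg

def reinhard_tonemap(luminance, exposure):
    """Compress an exposed HDR luminance into the displayable 0..1 range."""
    l = luminance * exposure
    return l / (1.0 + l)

scene = [0.02, 0.5, 3.0, 40.0, 900.0]   # assumed samples: deep shadow up to bright sky
exposure = adapted_exposure(scene)
for l in scene:
    print(f"scene luminance {l:>7.2f} -> display value {reinhard_tonemap(l, exposure):.3f}")
[/code]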


Revelene wrote:
The Spoopy Kitteh wrote:
In short, if you have it enabled and there is no screen tearing regardless of frame rate, it's working. Granted, you have to have a monitor that supports GSYNC in the first place.

I don't believe the driver even lets you enable Gsync in the Nvidia control panel without a Gsync monitor, but that is a good point to add. Definitely need a Gsync panel to use Gsync, lol.

John Doe wrote:
The problems wouldn't exist if you bought a decent monitor. That's what I said "no, they're not" about; it was an answer to the first sentence. Yes, you got the fact about pricing right. Yes, a cheap Freesync monitor won't have the sync range an expensive one might have, but then again you should not buy cheap. That's what it is.

I have not forgotten what forum I'm on, but HDR works on some games too.

I'm not in favor of either, I'd buy what my hardware would work with.

No, I didn't check out the video. I don't really watch that guy's videos.

That is the problem: a lot of people go for cheap Freesync monitors. For someone like you, who doesn't buy cheap, this may be a non-issue... but for the people who do, my information may help them not go for a cheap monitor.

Yes, it does work in some games, but that goes back to my point about HDR not quite being there for the masses. As far as gaming goes, there is little point to HDR right now.

Well, you say you are not in favor of them, but you are acting rather defensive.

I had two links: not just the video, but also the rtings article. You may not "watch that guy's videos," but he goes over the feature sets for both Gsync and Freesync, and also explains that not all of the Freesync features are available on all monitors. With Gsync, however, since it is a dedicated chipset, you get the full feature set.

At this point, I don't understand what you are trying to debate.
NVCP will only let you enable it if the monitor and game you use it with are compatible with GSYNC.

However, if you have a GSYNC monitor and want to manually enter what frame rates go to what refresh rates, you can use NVIDIA Profile Inspector. The end result of that is increased input lag in some cases, so it is best to cap the frame rate and use the highest refresh rate equal to that sustainable capped frame rate.
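A minimal sketch of what such a frame cap does conceptually (not how RTSS or the driver actually implements it; the 144 Hz panel and 141 fps cap are assumed example numbers):

[code]
# Minimal sketch of a frame limiter: hold each frame so the frame rate stays
# just under the panel's max refresh, keeping G-Sync engaged instead of
# falling back to vsync. The 144 Hz / 141 fps numbers are assumed examples.

import time

CAP_FPS = 141.0                 # assumed cap a few fps under a 144 Hz max refresh
FRAME_BUDGET = 1.0 / CAP_FPS    # ~7.09 ms per frame

def render_frame():
    """Stand-in for the game's real rendering work."""
    time.sleep(0.004)           # pretend the GPU needs ~4 ms for this frame

last_present = time.perf_counter()
for frame in range(10):
    render_frame()
    # Wait out the remainder of the frame budget before presenting.
    remaining = FRAME_BUDGET - (time.perf_counter() - last_present)
    if remaining > 0:
        time.sleep(remaining)
    last_present = time.perf_counter()
    print(f"frame {frame}: presented, target interval {FRAME_BUDGET * 1000:.2f} ms")
[/code]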
Last edited by rotNdude; 24 May 2018 @ 7:09am
The Spoopy Kitteh wrote:
NVCP will only let you enable it if the monitor and game you use it with are compatible with GSYNC.

However, if you have a GSYNC monitor and want to manually enter what frame rates go to what refresh rates, you can use NVIDIA Profile Inspector. The end result of that is increased input lag in some cases, so it is best to cap the frame rate and use the highest refresh rate equal to that sustainable capped frame rate.

I've never tried to enable Gsync without a Gsync monitor, so that is good to know.

Games don't have to support Gsync for it to work. I use Gsync with older games, which definitely do not have Gsync support.

I don't personally use Inspector. When I have frames over my refresh rate, I typically let it go unhindered, but I occasionally use a frame limiter just under my max refresh rate with RTSS, or use Fast Sync. Works pretty well for me.
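Some quick numbers on why a cap a few fps under the maximum helps keep G-Sync engaged (assuming a 144 Hz panel; the exact margin people use varies):

[code]
# Back-of-the-envelope check, assuming a 144 Hz panel: a capped frame time that
# is longer than one refresh interval means every frame lands inside the VRR
# window, so G-Sync stays active instead of hitting the vsync/tearing path.

MAX_REFRESH_HZ = 144
refresh_ms = 1000 / MAX_REFRESH_HZ

for cap_fps in (144, 141, 138):
    frame_ms = 1000 / cap_fps
    in_vrr_range = frame_ms > refresh_ms
    verdict = "stays in VRR range" if in_vrr_range else "pinned at max refresh (vsync or tearing)"
    print(f"cap {cap_fps} fps: frame {frame_ms:.2f} ms vs refresh {refresh_ms:.2f} ms -> {verdict}")
[/code]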
The Spoopy Kitteh wrote:
John Doe wrote:
The problems wouldn't exist if you bought a decent monitor. That's what I said "no, they're not" about; it was an answer to the first sentence. Yes, you got the fact about pricing right. Yes, a cheap Freesync monitor won't have the sync range an expensive one might have, but then again you should not buy cheap. That's what it is.

I have not forgotten what forum I'm on, but HDR works on some games too.

I'm not in favor of either, I'd buy what my hardware would work with.

No, I didn't check out the video. I don't really watch that guy's videos.
HDR simulates how eyes naturally adjust to lighting conditions. This can be done via hardware and software: hardware adjusts pixel-specific chroma and brightness, and software controls it. All monitors that use DP, DVI-I or DVI-D, HDMI, or other digital display formats are capable of HDR.

Software-based HDR and hardware-based HDR are on different levels. Sure, Source supported HDR all the way back 15 years ago, but if you ever saw a high-end TV enabling and disabling hardware-level HDR in split screen at a store, you would know the difference.

This, however, also comes down to which dimming technology the panel uses, along with how many nits of brightness it has. The higher that is, the better the HDR experience tends to be.

High-end TVs that do this, like some Samsung models, are FALD (full-array local dimming) and do HDR better than most PC monitors. Speaking of which, Samsung just added Freesync to some of their new higher-end TVs with a patch.

Posted on 22 May 2018 @ 9:09pm
Posts: 19