Batman™: Arkham Knight

HomeChicken Sep 21, 2017 @ 7:21pm
V-SYNC is the problem - G-SYNC is the solution
I love this game, and I believe I have figured out the rendering problems. In short, V-SYNC is completely broken, so if you have a monitor with G-SYNC it will work awesome (90 fps), but otherwise you are kind of screwed (stuck at 23 fps, regardless of graphics settings).

I have played this on both (i) an i5 / GeForce 970 (4GB) desktop hooked up to an HDTV and (ii) an i7 / 970M *laptop* that has a monitor with G-SYNC.

Basically, when you have V-SYNC turned ON, you will be stuck at 23 fps: the desktop got 23 fps with ALL of the video settings at HIGHEST and all NVIDIA effects ON, and the same 23 fps with ALL of the video settings on LOW and all NVIDIA effects OFF.

When V-SYNC is turned OFF, the fps on the desktop is ~55 fps (HIGHEST graphics settings) or ~120 fps (LOW graphics settings)... BUT the SCREEN TEARING (with V-SYNC off) is utterly TERRIBLE (like make-you-puke terrible).

So in other words, V-SYNC is COMPLETELY BROKEN.

Now, if you have a G-SYNC monitor, then just turn V-SYNC off, and the G-SYNC hardware will fix all of your problems and the game is AWESOME. (The laptop above runs it at ~70 fps with relatively high settings, even though the 970M (2GB) laptop card is not even close to as fast as the 970 (4GB) desktop card.)

I hope this helps, since this one variable (broken V-SYNC) has been responsible for 99% of all of the performance problems I've seen.

lucasj1210 Sep 21, 2017 @ 7:40pm 
Maybe, but not everyone has a G-Sync monitor or HDTV.
For the most part, to get the highest FPS and the least tearing, it works fine if you:
- set max FPS to 00.0000 in BmSystemSettings.ini (make it 'read only' after changing)
- turn the in-game v-sync OFF
- run the game in borderless windowed mode
- turn NVIDIA v-sync to ADAPTIVE (in the NVIDIA Control Panel... make a separate profile for AK)
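For reference, the ini edit in the first step would look something like this. The [SystemSettings] section header and the MaxFPS key are my recollection of the Unreal Engine 3 Bm*.ini format, so treat them as assumptions and check your own file; it usually lives in the game's Config folder under Documents\WB Games:

    [SystemSettings]
    ; 00.0000 removes the cap entirely; a positive value acts as a hard fps ceiling
    MaxFPS=00.0000

Setting the file to read-only afterwards stops the game from writing the old value back on the next launch.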
Last edited by lucasj1210; Sep 22, 2017 @ 10:47am
Roast Goose Sep 21, 2017 @ 8:47pm 
I realise this might come across as shilling, but I've been using G-sync as well and it really does make a massive difference, especially in this game.
Buck Sep 22, 2017 @ 9:47am 
Originally posted by HomeChicken:
V-SYNC is the problem - G-SYNC is the solution

G-sync is just hardware-implemented adaptive V-sync, effectively.

/thread ?

I love this game, and I believe I have figured out the rendering problems.

Oh, this should be good. *Grabs popcorn.*

In short, V-SYNC is completely broken

orly?

so if you have a monitor with G-SYNC it will work awesome (90 fps), but otherwise you are kind of screwed (stuck at 23 fps, regardless of graphics settings)

Basically, when you have V-SYNC turned ON, you will be stuck at 23 fps: the desktop got 23 fps with ALL of the video settings at HIGHEST and all NVIDIA effects ON, and the same 23 fps with ALL of the video settings on LOW and all NVIDIA effects OFF.

lolno, just..no.

When V-SYNC is turned OFF, the fps on the desktop is ~55 fps (HIGHEST graphics settings) or ~120 fps (LOW graphics settings)... BUT the SCREEN TEARING (with V-SYNC off) is utterly TERRIBLE (like make-you-puke terrible).

So in other words, V-SYNC is COMPLETELY BROKEN.

I'd ask you what orifice you pulled this nonsense out of, but it's pretty obvious. You seriously just have NO idea what you are talking about, at all.

I hope this helps, since this one variable (broken V-SYNC) has been responsible for 99% of all of the performance problems I've seen.

If that's true, then this is just proof of how extremely limited your personal experience is. Everything you are saying makes it clearly evident that you have NO real idea wtf you're talking about.


Originally posted by Roast Goose:
I realise this might come across as shilling, but I've been using G-sync as well and it really does make a massive difference, especially in this game.

No one is saying G-sync doesn't work, because it does, but it's not a cure-all and it's not an option for everyone. Adaptive v-sync serves the same basic purpose for those of us that do not have G-sync capable monitors.
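For the curious, here is a toy sketch of the per-frame choice adaptive v-sync makes, assuming a 60 Hz display (the numbers are illustrative and the real driver logic is more involved):

    # Toy model of adaptive v-sync on an assumed 60 Hz display:
    # sync when the GPU keeps up, tear rather than stall when it doesn't.
    REFRESH_MS = 1000 / 60  # one refresh interval, ~16.7 ms

    def present(render_ms):
        if render_ms <= REFRESH_MS:
            # GPU is at or above the refresh rate: wait for vblank, no tearing
            return "v-sync on (capped at 60 fps, no tearing)"
        # GPU fell below the refresh rate: show the frame immediately instead of
        # stalling, avoiding the hard 60 -> 30 fps drop at the cost of tearing
        return "v-sync off (possible tearing, no fps quantization)"

    for render_ms in (12, 20):
        print(f"{render_ms} ms frame: {present(render_ms)}")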
Last edited by Buck; Sep 22, 2017 @ 9:47am
HomeChicken Sep 22, 2017 @ 2:04pm 
Hey fellas,

So to clarify, the above are the empirical *RESULTS* of somewhat extensive testing of the game on two different computers that I happen to own, so it's not really a question of whether or not I "know" what I'm talking about.

As noted above, the bad thing about G-SYNC is that it is hardware built into the monitor that makes the refresh rate follow the graphics card's frame output, so any monitor that has it built in will be more expensive (~$200 more) than a monitor that doesn't. It's really too bad, since most programs are able to use V-SYNC without any problems.
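To make the difference concrete, here is a minimal back-of-the-envelope sketch, with illustrative numbers of my own choosing, of why a fixed 60 Hz refresh plus double-buffered v-sync quantizes the frame rate while a variable-refresh display does not:

    import math

    REFRESH_MS = 1000 / 60  # fixed 60 Hz display: ~16.7 ms per scan-out

    def vsync_fps(render_ms):
        # Double-buffered v-sync: a finished frame waits for the next refresh
        # tick, so the frame time rounds UP to a whole number of refresh
        # intervals (60 -> 30 -> 20 fps steps).
        ticks = math.ceil(render_ms / REFRESH_MS)
        return 1000 / (ticks * REFRESH_MS)

    def vrr_fps(render_ms):
        # Variable refresh (G-SYNC/FreeSync): the display waits for the GPU,
        # so the effective rate is simply one over the render time.
        return 1000 / render_ms

    for render_ms in (14, 18, 25):
        print(f"{render_ms} ms/frame: v-sync {vsync_fps(render_ms):.1f} fps, "
              f"variable refresh {vrr_fps(render_ms):.1f} fps")

An 18 ms frame, for example, runs at ~55 fps on a variable-refresh display but gets rounded down to a flat 30 fps under fixed-refresh v-sync, which is in the same ballpark as the gap reported above.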

As for the Arkham Knight game, it's a fantastic *game* (art / story /gameplay, etc.), but the PC port was simply DISASTROUS (well documented article here: https://www.pcworld.com/article/2940412/batman-arkham-knight-how-bad-are-the-issues-pretty-bad.html ... but there are *many* others like it...).

As far as I can tell, the programmers pretty much threw their hands up and said "No soup for you! 30 fps if you use V-SYNC." So it's just lazy.

The G-SYNC hardware is either built into a monitor or it isn't, so you can't modify a monitor etc. to make it work. (As an aside, AMD also worked on a similar idea, but I doubt it would work for this program in the same way, e.g., see here: https://www.rockpapershotgun.com/2015/04/09/g-sync-or-freesync-amd-nvidia/ )

The Good News is that the game is still pretty playable at ~30 fps, but the car-chase scenes are *much* easier with higher framerates.

One thing to note is that the testing was done by altering the settings in the in-game options (e.g., framerate cap 60 or 90; V-SYNC on/off/variable). Also, the framerate test was done using the in-game "PC Performance Test" in the options, which shows the fps in the upper left corner. I'm not going to lie: when I took the desktop i5/970 down to the LOWEST settings with V-SYNC on and the result was still stuck at 23-24 fps, I was somewhat amazed. Again, turning the V-SYNC off completely fixed this issue (very high fps), but the screen tearing was just INSANE (like, unplayable insane).

Lucas -- thanks for the thoughts, I will try seeing if that makes any difference. Since turning the framerate cap to 60 or 90 in game did not make any difference, I am doubtful. But, hey, the in-game V-SYNC is *so* broken here, I may try it just for fun. (Note: since I am running the 970 to an HDTV via HDMI, the HDMI signal tops out at 60 fps, and then the TV hardware fills in every other frame to arrive at 120 Hz.) But, again, just for fun, I'll give it a try. In my tests there was no difference between Adaptive and On for the V-SYNC.

Buck -- So you are correct that G-SYNC is hardware that prevents tearing, but the way that it works is a bit different than you seem to describe (see here: https://www.geforce.com/hardware/technology/g-sync/faq ). Whether you are venting / cracking jokes / trying to troll for fun, it's all good to me. ;-) And it is true that not everyone has G-SYNC, which, honestly, shouldn't really be needed for any game to work reasonably well.

Anyhoo, if you can just accept the game for what it is you can still have a lot of fun with it (even if, in at least some ways, it was an epic FAIL by the coders who tried to port it to PC). And it's still a good deal. So there's that.

Have fun. Don't drink and drive. And peace out fellas.
Last edited by HomeChicken; Sep 22, 2017 @ 2:04pm
Buck Sep 22, 2017 @ 3:54pm 
Originally posted by HomeChicken:
As far as I can tell, the programmers pretty much threw their hands up and said "No soup for you! 30 fps if you use V-SYNC." So it's just lazy.

No, you are both wrong and that's not how v-sync works. Yes, there are many types of bugs/issues that can prevent v-sync from working as expected, but those are hardly unique to this game and they are absolutely NOT the result of a design decision. V-sync works fine with this game for me.

The G-SYNC hardware is either built into a monitor or it isn't, so you can't modify a monitor etc. to make it work.

That's not true at all. NVIDIA used to produce module upgrade kits which could be retrofitted to non-G-Sync-ready monitors, but they are no longer available and haven't been for years.

(As an aside, AMD also worked on a similar idea, but I doubt it would work for this program in the same way, e.g., see here: https://www.rockpapershotgun.com/2015/04/09/g-sync-or-freesync-amd-nvidia/

FreeSync is simply a different way to implement the same basic idea. The main difference is that FreeSync isn't proprietary.

Buck -- So you are correct that G-SYNC is hardware that prevents tearing

Really? I wasn't sure.

/eyeroll.

but the way that it works is a bit different than you seem to describe. https://www.geforce.com/hardware/technology/g-sync/faq

I didn't "describe" anything, but there is nothing there which contradicts what I did say. Boiled down it's exactly what I said. don't get bogged down in the marketing fluff.

Whether you are venting / cracking jokes / trying to troll for fun,

It's a complete mystery.

And it is true that not everyone has G-SYNC, which, honestly, shouldn't really be needed for any game to work reasonably well.

Obviously both AMD and NVIDIA think otherwise or the technologies wouldn't exist.
Jac Sep 22, 2017 @ 10:20pm 
I run on a 970 and don't have a particular issue with the game, in that 95% of the time I get V-synced 60 FPS. That is with maxed graphics but with enhanced smoke and debris both off - those two settings are too much for a 970 to maintain 60 FPS at 1080p. I only see very occasional dips and stutters, and it is almost always when I am tearing around the city in the Batmobile, so I would guess it is an asset streaming problem.

I do think G-sync is a good technology though, most especially when you are near but cannot maintain 60 FPS in a game, since the tech allows the game to drop to 40-50 FPS without introducing tearing. I always play with V-Sync because I cannot stand tearing.
HomeChicken Sep 24, 2017 @ 8:52am 
Jac -- Interesting... I get those kinds of results *only* with V-SYNC off. Totally agreed that not having a V-Sync of some variety is a big fat "no go" for playing a game...


Buck --
That's not true at all. NVIDIA used to produce module upgrade kits which could be retrofitted to non-G-Sync-ready monitors, but they are no longer available and haven't been for years.

Shockingly, this is apparently correct (in 2014). But these were only applicable to ONE *specific* monitor (the Asus VG248QE). So not really gonna work out most of the time.

Nonetheless, I am correct about how G-Sync works. "Since their earliest days, displays have had fixed refresh rates – typically 60 times a second (Hertz). But due to the dynamic nature of PC games, GPUs render frames at varying rates. As the GPU seeks to synchronize with the display, persistent tearing occurs... G-SYNC eliminates this tradeoff, perfectly syncing the display to the GPU, regardless of frame rate" (cite https://www.geforce.com/hardware/technology/g-sync/faq)

I fully recognize that having friends who actually work for Nvidia kind of puts me at an unfair advantage here, but it's all good.

AND... Don't forget to have fun on your upcoming 13th birthday!!!!!!!! HAHAHAHAHA!!! ;-)

Last edited by HomeChicken; Sep 24, 2017 @ 1:37pm
Jac Sep 24, 2017 @ 9:05am 
Nvidia definitely did produce an upgrade path for specific monitors to make them Gsync capable.

https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-DIY-Upgrade-Kit-Installation-and-Performance-Review
HomeChicken Sep 24, 2017 @ 1:23pm 
Ha!!! Sweet! I stand corrected!! Thanks Jac! (Not surprised that they quit making them though.)
Last edited by HomeChicken; Sep 24, 2017 @ 1:24pm
HomeChicken Sep 24, 2017 @ 1:32pm 
...BUT (and it's a big but) apparently it was ONLY made for one very specific monitor: the Asus VG248QE.

This kind of limits the utility here, but it was a good concept anyway!
Jac Sep 24, 2017 @ 11:46pm 
Yeah IIRC there were modules for a few types of monitor but they quickly realised it was a stupid idea and now if you want a Gsync monitor you have to buy a Gsync monitor.
HomeChicken Sep 25, 2017 @ 4:18pm 
Agreed -- Just looking at all of the parts and expecting someone to take apart their monitor (assuming they even have the right *one*) and insert a new board is kind of a hilariously bad idea. Also, the upgrade kit was like $300+ (or something like that), whereas a new monitor with G-Sync is only about an additional $200 (not cheap but way better than an additional $300-400). If someone is REALLY smart they will start installing G-SYNC in HDTVs, but *as far as I know* this has not happened YET....

Cheers Jac!!