God of War

kweelo 20 Dec 2021 at 19:54
Why does 30fps feel better on console than it does on PC?
For example, if you play a game that is optimized for 30fps on console, it will feel smooth, with no obvious input delay or stutter. However, if you play that same game on PC with a 30fps framerate lock, it will feel horribly jagged and stuttery, with screen tearing, etc.

Why is that? I noticed this when I played Days Gone on PS4 Pro; it looked great at 4K 30fps on my TV. When I got the game on my old, weak PC and locked it to 30fps, it never dropped below 30fps, but it felt like 15-20fps in terms of choppiness and jitter.
Showing 61-75 of 124 comments
M&M's (Banned) 8 Jan 2022 at 11:08 
Originally posted by 💕Mr. Prince™💕:
It's because your TV has built-in smoothing. It's not actually smoother; it's just fake because of your TV.

Well, I haven't owned a TV in over 15 years, so what is faking it for me?

It's ♥♥♥♥♥♥♥ stable frametime :)
Dopey Shepard 8 Jan 2022 at 11:26 
Originally posted by Poor:
Originally posted by Dopey Shepard:
I can't believe SO MANY people are missing the obvious answer:

How is this the answer? How would playing on a PS4 Pro at 30 fps on a 60 Hz TV with vsync enabled be any different from playing on a PC at the same settings?

You kinda missed the answer in the explanation. I said that Windows by default sets your refresh rate to 60. Consoles work differently. Just because your TV is 60 Hz doesn't mean it can't produce a stable 30. It just means that whatever device you attach to your TV will run at the FPS it was designed for - 30 for consoles, 60 for desktop. Does that answer your question?

Also, you can set your desktop refresh rate to whatever you want, as long as your monitor supports it. There are plenty of monitors that can be locked to 30 Hz, but don't expect an old one (like mine) to do that. I could only overclock the poor thing to 75 Hz, and I can't set it to 50 Hz.

If you play a game with Windows locked to 60 Hz and get 50 fps, it will look sluggish. Play at 50 fps with Windows locked to 50 Hz and you will see a massive difference.

You can avoid changing the refresh rate every single time with things like Variable Refresh Rate, FreeSync, Adaptive Sync, etc., but these aren't available on all monitors, especially the old ones.
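
To make the mismatch concrete, here is a minimal sketch (plain C++, numbers purely illustrative) that maps game frames onto display refreshes for a few fps/refresh-rate combinations:

[code]
// Which refresh does game frame i first appear on, and how far apart are
// consecutive flips? Assumes a perfectly paced frame cap and a fixed refresh rate.
#include <cmath>
#include <cstdio>

static void cadence(double fps, double hz, int frames) {
    std::printf("%2.0f fps on a %2.0f Hz display:", fps, hz);
    int prev = 0;
    for (int i = 1; i <= frames; ++i) {
        int flip = static_cast<int>(std::ceil(i * hz / fps)); // refresh of the i-th flip
        std::printf(" %d", flip - prev);                      // refreshes between flips
        prev = flip;
    }
    std::printf("\n");
}

int main() {
    cadence(30.0, 60.0, 8); // 2 2 2 2 2 2 2 2 -> every frame held two refreshes, even pacing
    cadence(50.0, 60.0, 8); // 2 1 1 1 1 2 1 1 -> some frames held twice as long, visible judder
    cadence(50.0, 50.0, 8); // 1 1 1 1 1 1 1 1 -> cap matches the refresh rate, even again
    return 0;
}
[/code]

At 30 fps on 60 Hz every frame is held for exactly two refreshes; at 50 fps on 60 Hz some frames are held twice as long as others, and that uneven pacing is the judder you notice.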
Poor Bastard 8 Jan 2022 at 12:45 
Originally posted by Dopey Shepard:
Originally posted by Poor:

How is this the answer? How would playing on a PS4 Pro at 30 fps on a 60 Hz TV with vsync enabled be any different from playing on a PC at the same settings?

You kinda missed the answer in the explanation. I said that Windows by default sets your refresh rate to 60. Consoles work differently.

They don't work differently. The TV refresh rate doesn't change; it stays at 60 Hz. For a 30 fps game, the TV will still refresh 60 times a second and each game frame will be shown twice.
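
The arithmetic behind that is just unit conversion: one refresh at 60 Hz lasts 1/60 s ≈ 16.7 ms, and one frame at 30 fps lasts 1/30 s ≈ 33.3 ms = 2 × 16.7 ms, so every game frame spans exactly two refresh intervals and the cadence stays perfectly even.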
KingKrouch 8 Jan 2022 at 13:49 
TLDR: It just comes down to mouse input feeling less responsive at lower framerates (a mouse isn't like a controller analog stick, where the camera accelerates and decelerates based on how far and how long the stick is held; mouse input is usually mapped 1:1), and to frame pacing. I've seen plenty of console games with really bad framerate limiters, but the best approach is to optimize your game to run somewhat above the cap you want, so that drops don't happen (i.e. if you are targeting 30 FPS, aim for settings that land around 45 FPS so dips are less likely). Also, avoid double-buffered VSync if you can't hold a consistent framerate cap at all times. It's usually smart to offer a framerate cap option that's separate from VSync (since you can't assume the user has a FreeSync/G-Sync display or a screen refresh rate that divides evenly by 60). A framerate cap on console or handheld games is usually put in place for frametime stability or for energy consumption reasons.
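
For a concrete picture of a cap that is separate from VSync, here is a minimal sketch of a sleep-based limiter (plain C++ with only <chrono> and <thread>; the FrameLimiter name and the 30 fps target are just for illustration, not any engine's or tool's actual API):

[code]
// Minimal sketch of a framerate cap that works independently of VSync.
#include <chrono>
#include <thread>

class FrameLimiter {
    using clock = std::chrono::steady_clock;
public:
    explicit FrameLimiter(double target_fps)
        : frame_time_(std::chrono::duration_cast<clock::duration>(
              std::chrono::duration<double>(1.0 / target_fps))),
          next_(clock::now() + frame_time_) {}

    // Call once per frame, after the game has rendered and presented.
    void wait() {
        std::this_thread::sleep_until(next_);
        // Advance relative to the previous deadline, not "now", so small
        // sleep overshoots don't accumulate into uneven frame pacing.
        next_ += frame_time_;
        // After a long hitch, resync instead of sprinting to catch up.
        if (clock::now() > next_ + frame_time_)
            next_ = clock::now() + frame_time_;
    }

private:
    clock::duration   frame_time_;
    clock::time_point next_;
};

int main() {
    FrameLimiter limiter(30.0);           // cap at 30 fps
    for (int frame = 0; frame < 90; ++frame) {
        // update(); render(); present(); // game work would go here
        limiter.wait();
    }
    return 0;
}
[/code]

Advancing the deadline relative to the previous one (rather than to "now") is what keeps the pacing even; a real limiter would typically also busy-wait the last fraction of a millisecond for tighter timing.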

I constantly see really bad takes from people who aren't game developers or haven't bothered using Google, and who get their information from places of pseudo-intellectual fanboy circlejerking like GameFAQs, r/Games (or any subreddit that attracts fanboys), or the comments section of any Digital Foundry video. It's similar to the myth that "borderless windowed" modes have more input lag than exclusive fullscreen, when, if the developer uses Flip Model presentation, borderless is actually much better than exclusive fullscreen (which blanks the screen for a while and makes alt-tabbing or screen recording through OBS a pain). The only way it would have extra latency is if it falls back to a composition mode like Copy with GPU GDI or something of that sort. You even see people like one of the Valorant engine developers doing absolutely bogus testing that doesn't take that into account, spreading even more misinformation. Yet another topic that really should stop being discussed in gaming discourse, because a lot of people clearly have no clue what they're talking about.
Screen tearing means you have to enable VSync, but the game will have FreeSync/G-Sync support and even NVIDIA's low-latency tech, which makes quick-time events more responsive.
Javi 9 Jan 2022 at 7:53 
Originally posted by sanchies:
Originally posted by Poor:
Just stop. 30 fps will never feel smooth even with the best VRR.

I know. I played 35 hours of Okami on a true g-sync monitor.

-A "true G-SYNC" monitor doesn't mean anything by itself, as high-quality "FreeSync" ones can beat them easily.

-It also depends on how you configure your game.

If you play at a low refresh rate (60 Hz), it runs like trash no matter whether it's "FreeSync" or "true G-SYNC".
FreeSync can never beat real G-SYNC with the module, since we are talking about software vs. hardware.

It's the same as FSR vs. DLSS; they are not in the same league.
suboptimal 9 Jan 2022 at 9:09 
Abusive conditioning.

Consoles are the abusive parent that has conditioned you to expect less and pretend it's more.
Viktor 9 Jan 2022 at 14:21 
Originally posted by sanchies:
There is only one answer: frame pacing.

Nothing more, nothing less.

Most new PS5 games don't even have frame pacing optimization; that's why they feel sluggish. But up through the PS4 mid-gen games, they were optimized. Play Uncharted 4 on PS4 and you will see, but they also pay a high input lag cost.
Whenever a game has high input lag, it means it is using something like NVIDIA's half refresh rate VSync. It's the same as setting the display to 30 Hz. You get smooth, consistent 30 FPS that way, but with very bad input lag. That's not the solution, because we want 30 FPS with consistent frame pacing and without the extra input lag.
sanchies 9 Jan 2022 at 16:16 
Originally posted by Javi:
Originally posted by sanchies:

-A "true G-SYNC" monitor doesn't mean anything by itself, as high-quality "FreeSync" ones can beat them easily.

-It also depends on how you configure your game.

If you play at a low refresh rate (60 Hz), it runs like trash no matter whether it's "FreeSync" or "true G-SYNC".
FreeSync can never beat real G-SYNC with the module, since we are talking about software vs. hardware.

It's the same as FSR vs. DLSS; they are not in the same league.

"True G-SYNC" advantages today:

-Variable overdrive

(Most relevant for people with VA panels or other panels with slow response times)

-It covers the 0~18 fps range thanks to its hardware chip, where the "FreeSync" version gets disabled.

(Where a G-SYNC monitor can cover the 1~144 Hz range, the FreeSync version of the same model covers 18~144 Hz)

-NVIDIA Reflex (for anyone who cares)

A "240 Hz FreeSync" monitor easily beats any "144 Hz/165 Hz true G-SYNC" one, even in picture quality, and it's way cheaper; we are not in 2015 anymore. Most new monitors, even at CES, are FreeSync ones (FreeSync, VRR, G-SYNC Compatible, VESA Adaptive-Sync = same thing); few are "true G-SYNC". We don't need that anymore.

Feel free to go on Blur Busters and tell them that "true G-SYNC" monitors are unbeatable; good luck with that.

But on FSR vs. DLSS you are right, DLSS offers the best quality.

Originally posted by Viktor:
Originally posted by sanchies:
There is only one answer: frame pacing.

Nothing more, nothing less.

Most new PS5 games don't even have frame pacing optimization; that's why they feel sluggish. But up through the PS4 mid-gen games, they were optimized. Play Uncharted 4 on PS4 and you will see, but they also pay a high input lag cost.

Whenever a game has high input lag, it means it is using something like NVIDIA's half refresh rate VSync. It's the same as setting the display to 30 Hz. You get smooth, consistent 30 FPS that way, but with very bad input lag. That's not the solution, because we want 30 FPS with consistent frame pacing and without the extra input lag.

You can have low input lag at 30 fps with consistent frame pacing only with a VRR monitor on PC.

You can use the Scanline Sync method, but your input lag will be higher. Not as high as with NVIDIA Half Refresh, but still higher than with VRR.

Input lag (lowest to highest):

1st. 30 fps cap with VRR on a 120 Hz+ monitor.
2nd. 30 fps RTSS Scanline Sync method (V-SYNC OFF).
3rd. 30 fps on consoles.
4th. 30 fps on PC with V-SYNC ON and an RTSS 30 fps cap or NVIDIA Half Refresh.
Last edited by sanchies; 9 Jan 2022 at 20:40
Kaldaien 9 Jan 2022 at 23:33 
Originally posted by sanchies:
Originally posted by Javi:
Input lag (lowest to highest):

1st. 30 fps cap with VRR on a 120 Hz+ monitor.
2nd. 30 fps RTSS Scanline Sync method (V-SYNC OFF).
3rd. 30 fps on consoles.
4th. 30 fps on PC with V-SYNC ON and an RTSS 30 fps cap or NVIDIA Half Refresh.

Your #4 is very questionable. Half Refresh VSYNC will have significantly higher latency than a framerate limiter using normal VSYNC.

https://cdn.discordapp.com/attachments/778539700981071875/879905171176562688/unknown.png
https://cdn.discordapp.com/attachments/778539700981071875/879905229590650910/unknown.png

My PG27UQ running at 82 Hz, using half-refresh VSYNC to limit down to 41 FPS spends 22 ms waiting on the render queue (a full additional frame of latency) versus 3.3 ms in the queue with a framerate limiter.

It's possible that RTSS's 30 FPS limiter has latency comparable to half-refresh, but I doubt it. It should also be able to shave off a full frame of latency.
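
For context, the arithmetic behind "a full additional frame" at those settings: one refresh at 82 Hz lasts 1/82 s ≈ 12.2 ms and one frame at the 41 FPS half-refresh cap lasts 1/41 s ≈ 24.4 ms, so 22 ms spent waiting in the render queue is roughly one whole extra frame, while 3.3 ms with the framerate limiter is only about an eighth of one.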
Last edited by Kaldaien; 9 Jan 2022 at 23:34
Viktor 10 Jan 2022 at 0:55 
Originally posted by sanchies:
You can use the Scanline Sync method, but your input lag will be higher. Not as high as with NVIDIA Half Refresh, but still higher than with VRR.
I tried Scanline Sync before, but I just can't make it work. The tear line is always all over the place on the screen. Or maybe I really don't understand the feature.
Javi 10 Jan 2022 at 0:58 
Originally posted by sanchies:
Originally posted by Javi:
FreeSync can never beat real G-SYNC with the module, since we are talking about software vs. hardware.

It's the same as FSR vs. DLSS; they are not in the same league.

"True G-SYNC" advantages today:

-Variable overdrive

(Most relevant for people with VA panels or other panels with slow response times)

-It covers the 0~18 fps range thanks to its hardware chip, where the "FreeSync" version gets disabled.

(Where a G-SYNC monitor can cover the 1~144 Hz range, the FreeSync version of the same model covers 18~144 Hz)

-NVIDIA Reflex (for anyone who cares)

A "240 Hz FreeSync" monitor easily beats any "144 Hz/165 Hz true G-SYNC" one, even in picture quality, and it's way cheaper; we are not in 2015 anymore. Most new monitors, even at CES, are FreeSync ones (FreeSync, VRR, G-SYNC Compatible, VESA Adaptive-Sync = same thing); few are "true G-SYNC". We don't need that anymore.

Feel free to go on Blur Busters and tell them that "true G-SYNC" monitors are unbeatable; good luck with that.

But on FSR vs. DLSS you are right, DLSS offers the best quality.

Originally posted by Viktor:

Whenever a game has high input lag, it means it is using something like NVIDIA's half refresh rate VSync. It's the same as setting the display to 30 Hz. You get smooth, consistent 30 FPS that way, but with very bad input lag. That's not the solution, because we want 30 FPS with consistent frame pacing and without the extra input lag.

You can have low input lag at 30 fps with consistent frame pacing only with a VRR monitor on PC.

You can use the Scanline Sync method, but your input lag will be higher. Not as high as with NVIDIA Half Refresh, but still higher than with VRR.

Input lag (lowest to highest):

1st. 30 fps cap with VRR on a 120 Hz+ monitor.
2nd. 30 fps RTSS Scanline Sync method (V-SYNC OFF).
3rd. 30 fps on consoles.
4th. 30 fps on PC with V-SYNC ON and an RTSS 30 fps cap or NVIDIA Half Refresh.
All this is very good, but I was able to buy an LG 32GK850G-B for only 300, and it is a 32" VA panel at 165 Hz with G-SYNC.

This monitor cost 850 in 2018; for that money I think its quality is impressive, and it is worth having G-SYNC with the module.
LordOfTheBread 10 Jan 2022 at 1:16 
I don't know why people keep linking input lag to FPS; these are two very different things bound to different hardware.

High input lag feels crap no matter what platform you play on

Low FPS feels crap no matter what platform you play on

Keyboard and mouse are usually regarded as having lower input lag than controllers; that said, on both PC and consoles, provided you have a decent enough monitor or TV, the difference is barely noticeable, if noticeable at all.

Some games that are horribly coded also have input lag or command delay. That said, God of War never had any input lag issues even on the base PS4; it had a locked 30 FPS that felt very bad for people used to high FPS, and that's it.

This whole conversation has devolved into people throwing tantrums at each other and tossing various terms around. God of War is not a competitive game, and 99.99% (and I'm being generous here) of the people in this thread are not pro gamers who would notice a 0.01 ms delay on a command input.

Stop looking at your FPS counter and your frame delay graphs and play the damn game already.
Kaldaien 10 Jan 2022 at 3:45 
Originally posted by Viktor:
Originally posted by sanchies:
You can use the Scanline Sync method, but your input lag will be higher. Not as high as with NVIDIA Half Refresh, but still higher than with VRR.
I tried Scanline Sync before, but I just can't make it work. The tear line is always all over the place on the screen. Or maybe I really don't understand the feature.
Special K's equivalent is way easier to use :)

https://steamcommunity.com/sharedfiles/filedetails/?id=2714850439

Just move the Scanline slider until the colored bars are solid, and you're done.
topgun43 10 Jan 2022 at 13:05 
Once you run and play games at 100+ fps you'll never go back to 30 fps... even 60 fps is just OK imo... 4K gaming will not be ready for "prime time" until it can consistently hit 4K at 120 Hz or better...