God of War
Why does 30fps feel better on console than it does on PC?
For example, if you play a game that is optimized for 30fps on console, it will feel smooth, with no obvious input delay or stutter. However, if you play that same game on PC with a 30fps framerate lock, it will feel horribly jagged and stuttery, with screen tearing, etc.

Why is that? I noticed this when I played Days Gone on PS4 Pro: it looked great at 4K 30fps on my TV. When I got the game on my old, weak PC and locked it to 30fps, it never dropped below 30fps, but it felt like 15-20fps in terms of choppiness and jitter in gameplay.
Showing 76-90 of 124 comments
Originally posted by Kaldaieℵ₀:
Originally posted by sanchies:

Your #4 is very questionable. Half-refresh VSYNC will have significantly higher latency than a framerate limiter using normal VSYNC.

https://cdn.discordapp.com/attachments/778539700981071875/879905171176562688/unknown.png
https://cdn.discordapp.com/attachments/778539700981071875/879905229590650910/unknown.png

My PG27UQ running at 82 Hz spends 22 ms waiting in the render queue (a full additional frame of latency) when using half-refresh VSYNC to limit down to 41 FPS, versus 3.3 ms in the queue with a framerate limiter.

It's possible the RTSS 30 FPS limiter has latency comparable to half-refresh, but I doubt it; it should also be able to shave off a full frame of latency.

Maybe the "Half Refresh" method should be put in 5th place then, but yes, it's the worst.

Your monitor is 144Hz; you don't need to lower it to 82Hz when running games at a lower framerate, unless you are having image quality issues like smearing, ghosting, crosstalk or other "strobing problems". Otherwise, lowering your refresh rate will increase your input lag.

High refresh rate + VRR (G-SYNC) = Quick Frame Transport.
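The "Quick Frame Transport" point boils down to scan-out time: at a higher refresh rate each frame is transmitted to the panel faster, and with VRR a 30fps game still gets every frame delivered at the panel's maximum scan-out speed. A rough Python sketch of the arithmetic (illustrative only):

```python
def scanout_ms(refresh_hz: float) -> float:
    """Approximate time to scan one full frame out to the panel.

    With VRR, frames are always transmitted at the panel's maximum
    scan-out speed, so a 30 fps game on a 120 Hz display gets each
    frame delivered in ~8.3 ms instead of the ~33.3 ms a fixed
    30 Hz signal would take.
    """
    return 1000.0 / refresh_hz

for hz in (30, 60, 120, 144):
    print(f"{hz:>3} Hz -> {scanout_ms(hz):5.1f} ms to deliver one frame")
```

This is why lowering the refresh rate to match a low framerate works against you: the frame itself takes longer to reach the screen.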



Originally posted by Javi:
Originally posted by sanchies:

"True G-SYNC" advantages today:

-Variable overdrive

(Most relevant for people with VA panels or other panels with slow response times)

-It covers the 0~18fps range thanks to the hardware chip, where the "FreeSync" version gets disabled.

(While a G-SYNC monitor can cover the 1~144Hz range, the FreeSync version of the same model can only cover 18~144Hz)

-NVIDIA Reflex (for anyone that cares)

A "240Hz FreeSync" monitor easily beats any "144Hz/165Hz True G-SYNC" one, even in picture quality, and is way cheaper; we are not in 2015 anymore. Most new monitors, even at CES, are FreeSync ones (FreeSync, VRR, G-SYNC Compatible, VESA Adaptive-Sync = the same thing); few are "True G-SYNC". We don't need that anymore.

Feel free to go on Blur Busters and tell them that "True G-SYNC" monitors are unbeatable; they will love that. Good luck.

But on FSR vs DLSS you are right: DLSS offers the best quality.



You can have low input lag at 30fps with consistent frame pacing only with a VRR monitor on PC.

You can use the Scanline Sync method, but your input lag will be higher. Not higher than using NVIDIA Half Refresh, but still higher than using the VRR method.

Input lag (lowest to highest):

1st. 30fps cap with VRR on a 120Hz+ monitor.
2nd. 30fps RTSS Scanline Sync method (V-SYNC OFF).
3rd. 30fps on consoles.
4th. 30fps on PC with V-SYNC ON and an RTSS 30fps cap, or NVIDIA Half Refresh.
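The half-refresh penalty Kaldaie measured earlier in the thread lines up with simple frame-time arithmetic. A quick Python check (the 22 ms and 3.3 ms figures are taken from the screenshots he posted; everything else is derived):

```python
refresh_hz = 82.0
capped_fps = refresh_hz / 2          # half-refresh V-SYNC -> 41 fps
frame_ms = 1000.0 / capped_fps       # ~24.4 ms per displayed frame

queue_half_refresh_ms = 22.0         # measured with half-refresh V-SYNC
queue_limiter_ms = 3.3               # measured with a framerate limiter

# The difference is close to one whole extra frame sitting in the
# render queue -- the latency a framerate limiter shaves off:
extra_ms = queue_half_refresh_ms - queue_limiter_ms
print(f"frame time: {frame_ms:.1f} ms, extra queue delay: {extra_ms:.1f} ms")
```

That ~18.7 ms of extra queueing, just under one 24.4 ms frame, is why half-refresh V-SYNC sits at the bottom of the ranking.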
All this is very good, but I was able to buy an LG 32GK850G-B for only 300, and it is a 32" VA 165Hz panel with G-SYNC.

This monitor cost 850 in 2018, and for that money I think its quality is impressive; it is worth having G-SYNC with the module.

The G-SYNC module helps a lot on monitors that use a VA panel, fighting the "smearing", but as technologies improve it is not always necessary on newer screens, like the Samsung Odyssey G7/G9 or newer IPS/TN panels.



Originally posted by LordOfTheBread:
I don't know what people are on about linking input lag to FPS; these are two very different things, bound to different hardware.

High input lag feels crap no matter what platform you play on

Low FPS feels crap no matter what platform you play on

Keyboard and mouse are usually regarded as having lower input lag than controllers; that said, on both PC and consoles, granted you have a decent enough monitor or TV, it is barely noticeable, if noticeable at all.

Some horribly coded games also have input lag or command delay. That said, God of War never had any input lag issues, even on the base PS4; it had a locked 30FPS that was very bad for people used to high FPS, though, and that's it.

This whole conversation has deviated into people throwing tantrums at each other and tossing various terms around. God of War is not a competitive game, and 99.99% (and I'm being generous here) of the people in this thread are not pro gamers who would notice a 0.01ms delay in a command input.

Stop looking at your FPS counter and your frame delay graphs and play the damn game already.

I don't know why people insist on thinking that input lag only makes a difference in "e-sports" games.

Apparently you must never have played an emulator in your life; those are the games that suffer the most input lag, since old games were programmed to run on CRT screens, which take an analog signal with no processing. And even on a modern PS4, what I see most is people complaining about input lag in Red Dead Redemption 2, or even in Bloodborne and other From Software games.

Even Uncharted 4 has input lag when your TV is set to Game Mode. You will see the PC port perform better, due to people running faster monitors (60Hz on console vs 120Hz+ on PC, not even counting the VRR smoothness).

TL;DR: The same framerate can have different input lag, depending on how you configure your game and the capabilities of your monitor/TV.



Originally posted by Kaldaieℵ₀:
Originally posted by Viktor:
I tried Scanline Sync before, but I just can't make it work. Those lines are always all over the place on the screen. Or maybe I really don't understand that feature.
Special K's equivalent is way easier to use :)

https://steamcommunity.com/sharedfiles/filedetails/?id=2714850439

Just move the Scanline slider until the colored bars are solid, and you're done.

Very good to see the evolution of the Scanline Sync method; the older method was pure trial and error, took a lot of time adjusting game by game, and still didn't achieve 100% consistent frame pacing. Thanks.

Originally posted by silkyslugs:
For example, if you play a game that is optimized for 30fps on console, it will feel smooth, with no obvious input delay or stutter. However, if you play that same game on PC with a 30fps framerate lock, it will feel horribly jagged and stuttery, with screen tearing, etc.

Why is that? I noticed this when I played Days Gone on PS4 Pro: it looked great at 4K 30fps on my TV. When I got the game on my old, weak PC and locked it to 30fps, it never dropped below 30fps, but it felt like 15-20fps in terms of choppiness and jitter in gameplay.


https://www.youtube.com/watch?v=Zc8NWxeYcKM

Well, this is how Days Gone runs here on my potato machine. According to the people in this topic, "it's running like garbage". There is still stuttering here and there, due to the game being installed on an HDD.
Last edited by sanchies; 10 Jan 2022 at 16:32
Because smart TVs add motion smoothing; if you try playing a console on a monitor, it feels the same as playing on PC.
Consoles are recommended for play on a 4K TV because motion smoothing helps a lot in 30fps games; 30fps with TV motion smoothing feels more like 45fps.

But God of War on PS5 runs at 4K/60fps with a visual upgrade, so I don't want to buy the GOW PC version and just waste my money :)
Originally posted by sanchies:

Originally posted by LordOfTheBread:
I don't know what people are on about linking input lag to FPS; these are two very different things, bound to different hardware.

High input lag feels crap no matter what platform you play on

Low FPS feels crap no matter what platform you play on

Keyboard and mouse are usually regarded as having lower input lag than controllers; that said, on both PC and consoles, granted you have a decent enough monitor or TV, it is barely noticeable, if noticeable at all.

Some horribly coded games also have input lag or command delay. That said, God of War never had any input lag issues, even on the base PS4; it had a locked 30FPS that was very bad for people used to high FPS, though, and that's it.

This whole conversation has deviated into people throwing tantrums at each other and tossing various terms around. God of War is not a competitive game, and 99.99% (and I'm being generous here) of the people in this thread are not pro gamers who would notice a 0.01ms delay in a command input.

Stop looking at your FPS counter and your frame delay graphs and play the damn game already.

I don't know why people insist on thinking that input lag only makes a difference in "e-sports" games.

Apparently you must never have played an emulator in your life; those are the games that suffer the most input lag, since old games were programmed to run on CRT screens, which take an analog signal with no processing. And even on a modern PS4, what I see most is people complaining about input lag in Red Dead Redemption 2, or even in Bloodborne and other From Software games.

Even Uncharted 4 has input lag when your TV is set to Game Mode. You will see the PC port perform better, due to people running faster monitors (60Hz on console vs 120Hz+ on PC, not even counting the VRR smoothness).

TL;DR: The same framerate can have different input lag, depending on how you configure your game and the capabilities of your monitor/TV.

So God of War is a simulator, or needs to be as tight as one all of a sudden? And Uncharted might have had input lag, but that never prevented anyone from playing the damn game, did it?

Any modern TV that's not from an unknown brand, and any gaming monitor, will have unnoticeable input lag for the common Joe.
Originally posted by LordOfTheBread:
Originally posted by sanchies:



I don't know why people insist on thinking that input lag only makes a difference in "e-sports" games.

Apparently you must never have played an emulator in your life; those are the games that suffer the most input lag, since old games were programmed to run on CRT screens, which take an analog signal with no processing. And even on a modern PS4, what I see most is people complaining about input lag in Red Dead Redemption 2, or even in Bloodborne and other From Software games.

Even Uncharted 4 has input lag when your TV is set to Game Mode. You will see the PC port perform better, due to people running faster monitors (60Hz on console vs 120Hz+ on PC, not even counting the VRR smoothness).

TL;DR: The same framerate can have different input lag, depending on how you configure your game and the capabilities of your monitor/TV.

So God of War is a simulator, or needs to be as tight as one all of a sudden? And Uncharted might have had input lag, but that never prevented anyone from playing the damn game, did it?

Any modern TV that's not from an unknown brand, and any gaming monitor, will have unnoticeable input lag for the common Joe.

I see only excuses. PS5 players are now arrogant to the point that they only talk about 60fps, now that they have a powerful console running cross-gen games, when in the not-too-distant past they saw "no difference between 30 and 60fps", because the PS4's processor limited the potential of most games.

But still, my point is not that the "common Joe" will not have a satisfactory experience; that is not the purpose of the topic. The OP's question was about why 30fps is smooth on consoles while the same 30fps is not fluid on PC.

Anyway, PC players will play God of War in the best way, thanks to the freedom to configure their games, regardless of whether you are a "common Joe", an enthusiast with the best equipment, or an advanced user who develops wrappers.

Anyway, we'll see this discussion heat up when Elden Ring comes out next month; I'm sure it will come with frame-pacing issues, knowing how From Software games work, starting with the 60Hz lock in fullscreen.
That's what sucks.
Originally posted by sanchies:
I see only excuses. PS5 players are now arrogant to the point that they only talk about 60fps, now that they have a powerful console running cross-gen games, when in the not-too-distant past they saw "no difference between 30 and 60fps", because the PS4's processor limited the potential of most games.

But still, my point is not that the "common Joe" will not have a satisfactory experience; that is not the purpose of the topic. The OP's question was about why 30fps is smooth on consoles while the same 30fps is not fluid on PC.

Anyway, PC players will play God of War in the best way, thanks to the freedom to configure their games, regardless of whether you are a "common Joe", an enthusiast with the best equipment, or an advanced user who develops wrappers.

Anyway, we'll see this discussion heat up when Elden Ring comes out next month; I'm sure it will come with frame-pacing issues, knowing how From Software games work, starting with the 60Hz lock in fullscreen.

I don't own a PS5, nor do I have any intention to buy one. The OP's question is utterly stupid, because 30FPS feels like crap on any platform compared to 60. I personally play on PC because I have access to way more games, I am not tied to a single storefront, I have infinite backward compatibility, I can use emulators, and I'm not limited in my input methods.

I used to spend a ton of time tweaking options in my games to squeeze out the most FPS I could; nowadays I just launch games and play them as long as they run smooth, and guess what? I enjoy games more doing just that, because the difference between 117fps and 125 eludes me anyway, like just about everyone.

From Soft games suffered from input lag because of their programming (and because the ports were piss-poor), and that is another story entirely. I had no issues playing HZD on PC myself, using either keyboard and mouse or a controller, for example.

Once again, this conversation has strayed off the nonsensical OP claim that 30FPS feels better on consoles than on PC. It does not. Moving on.
Though I am primarily a PC player, I do have a Switch, and I understand that on console it's not really about graphics, which is okay if the gameplay is nice. But I play Val at 144fps, etc. I have no interest in owning a PlayStation, especially with Horizon Zero Dawn, Days Gone, and this game being on PC.
Originally posted by Hipocondria:
Originally posted by LordOfTheBread:

Why would anyone want to do that?

because there are poor people in the world too, rich boy.

Not reading the whole thread, but another reason to lock games at 30fps is to avoid breaking physics; FFXV's physics don't work properly above 30fps, sadly. 60 is nicer, even though the clothing and hair physics get messed up, I guess, but eh...
Well, the truth is much less technical than what these people are trying to explain to you. Seven years back, the medical department at our university was trying to implement VR in its education. The prototype device had a 30fps refresh rate, and nobody in the department could use it for more than 15 minutes at a time without severe motion sickness. So, while trying to figure out a solution, they ran a research project in parallel to document the findings.

In the end, they found that the type of controller device used for viewing determines whether the user gets motion sickness. If an Xbox 360 controller was used to control the view, ~30fps minimum was enough to eliminate motion sickness; if a mouse was used, ~60fps minimum was enough; if a VR headset was used, ~90fps minimum was enough.

The hypothesis is that this issue is caused by two different but linked factors. Factor 1 is the dead zone on the device: the mini joysticks on the controller seem to have some kind of dead zone implemented, and this delay in control response can significantly reduce motion sickness at low fps when using a controller. However, when they implemented a similar dead zone for the mouse and VR headset, it induced motion sickness no matter what the FPS was. This is where Factor 2 comes into the equation: the muscle group used to control the device. If muscle groups involved in bodily balance (arm/shoulder, back/neck) were used to control the device, it induced motion sickness immediately at low FPS across all devices tested, dead zone or no dead zone. To further prove this, they strapped some users' arms to the mini joystick so their arm muscles were used for view control; it indeed induced motion sickness at low fps, and it could not be compensated for by implementing a dead zone.

I know it's a lot, but the two conclusions are simple. Conclusion 1: to eliminate motion sickness at low fps (~30), combine a device dead zone with muscle groups NOT involved in bodily balance (hand/thumb muscles) to control the device. Conclusion 2: if muscle groups involved in bodily balance (arm/shoulder, back/neck) are used for device control, a minimum of 60fps (arms) or 90fps (VR) is required, and a dead zone does nothing but exacerbate the motion sickness. I believe the paper went deeper into the coordination of the eye and cerebellum and the use of balance muscles; they even scored muscles based on their degree of involvement in maintaining balance, and discussed how sporadic balance muscles (arms/shoulders) and constitutive balance muscles (neck/back) have different levels of sensitivity to "disturbances" like low fps.
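The dead-zone factor from that study is easy to picture in code. Here is a minimal radial dead-zone sketch in Python (the 0.15 threshold is an arbitrary assumption for illustration, not a value from the paper):

```python
import math

def apply_radial_deadzone(x: float, y: float, deadzone: float = 0.15):
    """Zero out stick input below the deadzone radius, then rescale the
    remaining range so output still spans 0.0-1.0 smoothly.

    Inputs inside the deadzone produce no camera motion at all -- the
    built-in response delay the study credits with reducing motion
    sickness at low frame rates on a controller.
    """
    magnitude = math.hypot(x, y)
    if magnitude < deadzone:
        return 0.0, 0.0
    scaled = (magnitude - deadzone) / (1.0 - deadzone)
    return x / magnitude * scaled, y / magnitude * scaled

print(apply_radial_deadzone(0.1, 0.05))   # inside the deadzone -> (0.0, 0.0)
print(apply_radial_deadzone(1.0, 0.0))    # full deflection -> (1.0, 0.0)
```

Rescaling after the cutoff (rather than just clamping) avoids a sudden jump in camera speed at the deadzone boundary.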
TVs upscale 30 fps to higher refresh rates by adding smoothing. For a movie this does not matter, since who cares about input delay when watching a movie; with games, however, it's very noticeable. Still, some may prefer that smoothing at the cost of a better FPS feel.
I've seen God of War on PS4 on my TV screen, and I sit relatively close to it: I have a comfy chair in front of the TV, not even 2 meters from the screen, and I could clearly tell that God of War only ran at 30 FPS. There is no difference from a PC game running at 30 FPS. When sitting close to the screen, 30FPS always sucks, no matter the platform.
Originally posted by sanchies:
Originally posted by Kaldaieℵ₀:

Your #4 is very questionable. Half-refresh VSYNC will have significantly higher latency than a framerate limiter using normal VSYNC.

https://cdn.discordapp.com/attachments/778539700981071875/879905171176562688/unknown.png
https://cdn.discordapp.com/attachments/778539700981071875/879905229590650910/unknown.png

My PG27UQ running at 82 Hz spends 22 ms waiting in the render queue (a full additional frame of latency) when using half-refresh VSYNC to limit down to 41 FPS, versus 3.3 ms in the queue with a framerate limiter.

It's possible the RTSS 30 FPS limiter has latency comparable to half-refresh, but I doubt it; it should also be able to shave off a full frame of latency.

Maybe the "Half Refresh" method should be put in 5th place then, but yes, it's the worst.

Your monitor is 144Hz; you don't need to lower it to 82Hz when running games at a lower framerate, unless you are having image quality issues like smearing, ghosting, crosstalk or other "strobing problems". Otherwise, lowering your refresh rate will increase your input lag.

High refresh rate + VRR (G-SYNC) = Quick Frame Transport.
I would not be caught dead running that monitor at 144 Hz; it would limit me to chroma subsampling :) 82 Hz is the highest you can go with DisplayPort 1.4 bandwidth at 4K with HDR on.
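The DisplayPort 1.4 ceiling can be sanity-checked with back-of-the-envelope math. This sketch ignores blanking overhead and DSC, so the real limit lands lower still (and the monitor's firmware caps it further); the point is only that 144 Hz at 4K 10-bit full-chroma doesn't fit in the link:

```python
# DisplayPort 1.4 (HBR3): 32.4 Gbit/s raw; 8b/10b encoding leaves
# 25.92 Gbit/s of effective video bandwidth.
link_gbps = 32.4 * 8 / 10

width, height = 3840, 2160
bits_per_pixel = 30              # 10-bit-per-channel RGB for HDR

bits_per_frame = width * height * bits_per_pixel
max_hz = link_gbps * 1e9 / bits_per_frame
print(f"~{max_hz:.0f} Hz max at 4K 10-bit RGB, before blanking overhead")
```

Even this optimistic upper bound comes out near 104 Hz, well short of 144 Hz, which is why the panel has to drop to chroma subsampling at its top refresh rate.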
Originally posted by 76561198026681638:
Originally posted by sanchies:

Maybe the "Half Refresh" method should be put in 5th place then, but yes, it's the worst.

Your monitor is 144Hz; you don't need to lower it to 82Hz when running games at a lower framerate, unless you are having image quality issues like smearing, ghosting, crosstalk or other "strobing problems". Otherwise, lowering your refresh rate will increase your input lag.

High refresh rate + VRR (G-SYNC) = Quick Frame Transport.
I would not be caught dead running that monitor at 144 Hz; it would limit me to chroma subsampling :) 82 Hz is the highest you can go with DisplayPort 1.4 bandwidth at 4K with HDR on.

I confidently doubt you could tell the difference between 4:4:4 and 4:2:2 in a blind study. Yes, 4:2:0 is noticeably ♥♥♥♥, but 4:2:2 is virtually lossless. Unless you are specifically looking at text with a magnifying glass, you aren't going to tell it's subsampled.
Originally posted by Kaldaieℵ₀:
Originally posted by sanchies:

Maybe the "Half Refresh" method should be put in 5th place then, but yes, it's the worst.

Your monitor is 144Hz; you don't need to lower it to 82Hz when running games at a lower framerate, unless you are having image quality issues like smearing, ghosting, crosstalk or other "strobing problems". Otherwise, lowering your refresh rate will increase your input lag.

High refresh rate + VRR (G-SYNC) = Quick Frame Transport.
I would not be caught dead running that monitor at 144 Hz; it would limit me to chroma subsampling :) 82 Hz is the highest you can go with DisplayPort 1.4 bandwidth at 4K with HDR on.

That's sad to hear, mate. I don't like having to lower chroma subsampling either.

In this case it is also understandable.
One word: frame pacing. Console games are coded to a very specific framerate output that is non-variable; they deliver frames at a fixed, even cadence, which is what creates the smoothness. This is also why you get jerkiness in some console games like Bloodborne: they deliver their frames at an uneven cadence.

Put simply, console games are designed to put out frames at exactly the even intervals needed for a perfectly smooth image, which is what the console's fixed output demands. A PC, though, delivers frames at variable intervals, which adds up to a visibly uneven image.
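The frame-pacing point can be illustrated with two runs that both average 30 fps. The frame times below are made-up numbers; the jitter (standard deviation) is what you feel as stutter, even when the FPS counter reads the same:

```python
from statistics import mean, pstdev

# Frame times in milliseconds; both sequences average ~33.3 ms (30 fps).
console_paced = [33.3] * 8                                     # even cadence
pc_unpaced = [16.7, 50.0, 33.3, 16.7, 50.0, 33.3, 25.0, 41.6]  # uneven cadence

for name, times in (("console", console_paced), ("pc", pc_unpaced)):
    print(f"{name}: avg {mean(times):.1f} ms/frame, "
          f"jitter {pstdev(times):.1f} ms")
```

An FPS counter only reports the average, so both runs read "30 fps"; only the jitter column reveals why one feels smooth and the other choppy.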
Last edited by Unmotivated; 11 Jan 2022 at 16:05