Cities: Skylines II

Littlephrenic Oct 30, 2023 @ 5:52pm
Please explain like I'm 5 why 100 fps is desirable.
I do not understand the appeal.
Why is approximately 30 (NTSC) or 25 (PAL/SECAM) fps not sufficient? Are you turning every second of gameplay into slow-mo footage?
It isn't even like audio, where increasing the sampling rate from 44.1 kHz to 96 kHz does increase the amount of information being captured that can actually be heard. Even then, 96 is overkill in a lot of cases, and the jump from 96 to 192 is probably overkill in almost every situation.
Increasing the frame rate past about 60 fps isn't perceptible to the eye. It just seems that it would be better to concentrate on resolution, bit depth, color values, contrast, or other metrics that actually can be seen by the human eye.
How is this being rendered out? Does it help smooth over lag times? Why does it matter for a city sim game?
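For reference, the rough numbers I'm comparing (a quick sketch; ~20 kHz is the usual ballpark for the limit of human hearing):

# Nyquist limit for each audio sample rate, and the time budget for each frame rate.
for rate_khz in (44.1, 96.0, 192.0):
    print(f"{rate_khz} kHz sampling captures frequencies up to {rate_khz / 2:.2f} kHz (hearing tops out around 20 kHz)")

for fps in (25, 30, 60, 100):
    print(f"{fps} fps means a new image every {1000 / fps:.1f} ms")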
K.I.L.E.R Oct 30, 2023 @ 7:17pm 
Originally posted by /AUSTRIA\Bomberman:
Originally posted by Kruno Saho:
the human eye can only see 1 frame per second
u are a smoothbrain then
my brain is smoother than urs
TheAceOfSpodes Oct 30, 2023 @ 7:23pm 
Occasionally I'll see the difference between 30-60, like with a fast zoom or pan of the camera. But the people ♥♥♥♥♥♥♥♥ about not having 100+, don't really care. I can see virtually no difference between 60 and 100.
Ricebug Oct 30, 2023 @ 7:25pm 
If I paid attention to the number crunchers, I'd be waiting a year to buy CSII, until the devs "fixed" the game. I listened to my gut instead, however, and went with CO's recommended settings. I am not disappointed. The game is awesome. Once public beta-testing is in the bag, the devs will roll up their sleeves and CSII will become even greater.

The OP is correct: You're gonna notice little difference between 30 FPS and 60 FPS. I'm running my stuff on a beast, which outputs to a 4K 120 Hz widescreen monitor. Yet I run most games at 30 FPS.

What? I'll miss the motion blur on a passing train? SOMEONE SAVE ME!

That I won't get my "money's worth" because a couple of settings need to be ratcheted down? Puh-LEEZE.

Do y'all think CO built this game on Cray III supercomputers? And then foisted it on an unsuspecting community relying on off-the-shelf hardware?

Of course, AMD, Intel, and the other hardware giants LOVE to see these arguments since it causes folks to buy faster (and more expensive) hardware that will be obsolete in a few months. As the old saying goes, it's the software that drives the hardware.
Last edited by Ricebug; Oct 30, 2023 @ 7:27pm
/AUSTRIA\Bomberman Oct 30, 2023
Originally posted by TheAceOfSpodes:
Occasionally I'll see the difference between 30-60, like with a fast zoom or pan of the camera. But the people ♥♥♥♥♥♥♥♥ about not having 100+, don't really care. I can see virtually no difference between 60 and 100.
its like old ppl saying: "why flatscreen if i have my big old working junk?" lol

get a 144hz screen than u learn
SumGumption Oct 30, 2023 @ 7:34pm 
Originally posted by Ricebug:
I'm running my stuff on a beast, which outputs to a 4K 120 Hz widescreen monitor. Yet I run most games at 30 FPS.

What? I'll miss the motion blur on a passing train? SOMEONE SAVE ME!

That I won't get my "money's worth" because a couple of settings need to be ratcheted down? Puh-LEEZE.

Well... I smell what I'm stepping in here.

It seems this thread is the advertisement train for the CSII "Deal With It" train. I'm about to deal with it all right... I'm downrating my review. Thanks for showing me the truth.
Lukario Oct 30, 2023 @ 7:36pm 
Nobody wants 100 fps, they just want to play without it randomly dropping from like 30 to 10 and without stutters. Take Paradox's balls out of your mouth, it's cringe.
Cuco Oct 30, 2023 @ 7:47pm 
Because they don't know why that magic box that cost them like 2k runs Fortnite at 200 fps and other games don't. Heck, they don't even know that games are designed differently from one another. They think a PC is a PlayStation on steroids.
GnuHorn Oct 30, 2023 @ 7:55pm 
Originally posted by Littlephrenic:
I do not understand the appeal.
Why is approximately 30 (NTSC) or 25 (PAL/SECAM) fps not sufficient?

ELY5: You're talking about movie standards with tightly controlled, constant frame rates. This is not a movie; it's a video game with an uneven frame rate and uneven processing requirements, where averages poorly represent actual performance, and where even a steady 30 fps can still generate significant motion and screen discordance, which people are variably sensitive to.
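
To put a toy number on that (made-up frame times, not a real capture from this game):

# Toy frame-time trace: one second that is mostly smooth but hitches five times.
frame_times_ms = [16.7] * 55 + [100.0] * 5

average_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_case_fps = 1000 / max(frame_times_ms)

print(f"average: {average_fps:.0f} fps")          # looks fine on a benchmark chart
print(f"worst frames: {worst_case_fps:.0f} fps")  # what the hitching actually feels like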

ELY25: The research you're citing relates to movie playback on a large screen from a fixed position. The standard comes from a period when film stock was extremely expensive, and the fixed position of the screen relative to the viewer (a movie theater or a large TV setting) meant that a frame rate that was too low was commonly jarring, while a frame rate that was too high was slightly nauseating for some people.

The 30/24 frame standard we commonly see in movie formats is a compromise there.

Forget that anyone ever said people can't see more than 30/60 fps, or that it's an ideal for everyone. It's absolutely not the case. Occasionally someone puts out a 48 fps (or higher) cut of a movie, like The Hobbit, and you can definitely see the extra frames, and some people definitely feel sick because of it.

But that's because it's viewed from a fixed location where the body of the watching consumer doesn't have other input. The nausea comes from conflicting inputs from the body (also a cause of nausea in VR applications) that create sensory discordance. In a video game, you are controlling the motion, and that completely changes the sensory experience by giving the body compensating information. That makes a higher frame rate more comfortable for most people, since the added smoothness better matches how the eye sees real motion (the eye does not have a frame rate; it takes in continuous information from whatever it's looking at, effectively infinite frames, so more smoothness in a video game's output results in a better experience for the player).

Basically, because you are intentionally moving the view, the brain has a different expectation about how much image information should be available.

Now, each frame above the ideal space for any given person has diminishing returns. The difference between 30 and 60 is a lot more than between 100 and 200, but what people can see depends on the individual (as we all have different eyes and different neurologies that function at different levels of ability). Your ideal may be between 30 and 60. Probably most people's... but not everyone's and especially not in a situation where you're controlling the motion.
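
To put rough numbers on the diminishing returns (just frame-time arithmetic):

# Frame time saved by each jump in frame rate, in milliseconds per frame.
for low_fps, high_fps in ((30, 60), (60, 120), (100, 200)):
    saved_ms = 1000 / low_fps - 1000 / high_fps
    print(f"{low_fps} -> {high_fps} fps shortens each frame by {saved_ms:.1f} ms")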

There's also the little problem of computer monitors.

Unlike projectors and CRTs, computer monitors have a refresh rate that the GPU's output has to sync with. Various technologies exist to smooth and align the image coming from the GPU with the monitor's refresh cycle, but they only work reliably above the 40-60 fps range.

FreeSync won't sync well below that. G-Sync will sync tightly fairly low, but it still has playback problems below 40-60, as does v-sync, because fundamentally the smoothing that matches the monitor's refresh cycle needs an excess of frames to hit its target, and falling short results in tearing, ghosting, and so on. Even when synced, the misalignment at a low frame rate will still be jarring below 40 fps.

Basically, a 30 fps target does not conform to how flat-panel monitors ideally sync frames.
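
As a rough illustration of why a low frame rate and a fixed refresh rate fight each other (a simplified model of plain double-buffered v-sync on a 60 Hz panel; real adaptive sync and triple buffering behave better than this):

import math

# A finished frame has to wait for the next refresh tick, so the displayed
# rate snaps down to 60/1, 60/2, 60/3... whenever rendering falls behind.
REFRESH_HZ = 60

for render_fps in (75, 60, 50, 40, 25):
    refreshes_per_frame = max(1, math.ceil(REFRESH_HZ / render_fps))
    displayed_fps = REFRESH_HZ / refreshes_per_frame
    print(f"renders at {render_fps} fps -> displays at {displayed_fps:.0f} fps")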

Now, if they reliably hit a 30 fps baseline and we didn't have hitching and other visual oddities and clear signs of processing strain, that would probably be fine for most people... and there are frame chasers out there who just care about the frame count.

I'm not one of them. I would be fine with a consistent 30 fps baseline. Having said that, even with settings dialed in, I can definitely tell when I'm panning or moving the camera in-game that there are variable loading and processing issues in the engine. There's hitching and a general lack of smoothness to the image in motion, and while my eyes basically stop picking up significant gains in smoothness above around 100 fps, in this game even pushing 50 fps gives a sub-standard result on top-of-the-line last-gen hardware, which should perform better than that given the relatively poor quality of the visuals in this game.

And I think that last part is a big part of the problem and why people are up in arms. I look at the game and I don't know where those frames are going. It could just be that the scope of the game currently demands heavy GPU performance to render, but the game looks kind of bland for what it costs to run. People will put up with a lot of performance issues if the game looks good... and sometimes it does, but usually it's pretty bland. So in the end it's really about people not seeing where the tradeoff shows up in the final result, and then being told "30 fps is the target, live with it, we'll tell you what you can and can't see" kind of set people off. It's one thing to say they're working on making things better; it's another to talk down to a customer. That may not be what they were trying to do, but it's kind of how it came out.
Last edited by GnuHorn; Oct 30, 2023 @ 8:03pm
Littlephrenic Oct 30, 2023 @ 7:57pm 
Originally posted by Dezzy:
Simple, it looks better in motion.

Also, people need to stop with "human eye can't see more than 60 fps." when it's not true. Playing at 120+ fps is much smoother than 60.
Is it smoother because of how the computer is rendering and blending frames? Because high fps in film or video isn't visible, and most consumer and mid-level professional digital projectors are unable to project at varied fps anyway.
Wouldn't it be more efficient to focus computing power on blending, or to have a high fps but interlace the image so only half of the pixels are rendering at any given time?
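To be clear about what I mean by interlacing, something like this toy even/odd field scheme, the way 480i broadcast works (not something modern GPUs actually do):

# Each pass refreshes only the even or the odd scanlines, so half the rows
# update per pass, like the two fields of an interlaced broadcast frame.
HEIGHT = 8  # scanlines in this toy screen

def field_rows(pass_index: int) -> list[int]:
    first_row = pass_index % 2  # even passes start at row 0, odd passes at row 1
    return list(range(first_row, HEIGHT, 2))

for pass_index in range(4):
    print(f"pass {pass_index}: refresh rows {field_rows(pass_index)}")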
Littlephrenic Oct 30, 2023 @ 8:05pm 
Originally posted by /AUSTRIA\Bomberman:
Originally posted by TheAceOfSpodes:
Occasionally I'll see the difference between 30-60, like with a fast zoom or pan of the camera. But the people ♥♥♥♥♥♥♥♥ about not having 100+, don't really care. I can see virtually no difference between 60 and 100.
its like old ppl saying: "why flatscreen if i have my big old working junk?" lol

get a 144hz screen than u learn
Well no, a pixel-based flat screen has measurably different specifications than a CRT and can have a much higher resolution. It also has a different aspect ratio, different color values, and weird math about square versus rectangular pixels. Watching something in 4K is going to be sharper than watching something in standard definition.
However, if you are watching standard-definition video on tape, or you are playing older video games, they look better on a CRT, and this is a hill I will die on.
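For anyone curious, the "weird math" on rectangular pixels looks roughly like this (approximate standard-definition NTSC numbers):

# Standard-definition NTSC digital video stores 720x480 samples, but the samples
# are narrower than square (roughly 10:11), so the picture displays near 4:3
# instead of the 3:2 you would get from square pixels.
stored_width, stored_height = 720, 480
pixel_aspect = 10 / 11  # approximate NTSC 4:3 pixel aspect ratio

print(f"stored aspect:    {stored_width / stored_height:.2f}:1")
print(f"displayed aspect: {stored_width * pixel_aspect / stored_height:.2f}:1")
print(f"target 4:3:       {4 / 3:.2f}:1")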
Cuco Oct 30, 2023 @ 8:09pm 
Originally posted by Littlephrenic:
Originally posted by /AUSTRIA\Bomberman:
its like old ppl saying: "why flatscreen if i have my big old working junk?" lol

get a 144hz screen than u learn
Well no, a pixel-based flat screen has measurably different specifications than a CRT and can have a much higher resolution. It also has a different aspect ratio, different color values, and weird math about square versus rectangular pixels. Watching something in 4K is going to be sharper than watching something in standard definition.
However, if you are watching standard-definition video on tape, or you are playing older video games, they look better on a CRT, and this is a hill I will die on.
I work in a private cinema with a 4K projector. I play 1080p movies and tell people it's 4K. They don't notice the difference. The only real use of 4K or 8K or whatever-K is for VR headsets, where physical pixel density is visible (aka the screen-door effect).
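If you want to put numbers on it, the usual yardstick is pixels per degree (idealized geometry; the 5 m screen and 10 m seat are made-up examples, and ~60 px/degree is a common ballpark for 20/20 acuity):

import math

# Pixels per degree of the viewer's field of view.
def pixels_per_degree(horizontal_px: int, screen_width_m: float, distance_m: float) -> float:
    field_of_view_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return horizontal_px / field_of_view_deg

for label, px in (("1080p", 1920), ("4K", 3840)):
    ppd = pixels_per_degree(px, screen_width_m=5.0, distance_m=10.0)
    print(f"{label}: {ppd:.0f} px/degree")  # both land above the ~60 px/degree threshold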
Cuco Oct 30, 2023 @ 8:12pm 
Oh, and the human eye doesn't see in frames per second. The human eye is not a camera.
playboi14 Oct 30, 2023 @ 8:13pm 
If someone has to explain the difference between 30 FPS and 60 FPS when it comes to fluidity and smoothness and why it's preferable then I don't think you're discussing the issue in good faith.

There's a distinct difference between 30 and 60 FPS, which is why games offer both a performance mode that generally targets 60 FPS and a fidelity mode at 30.

The overall annoyance from people is that with this level of hardware it should be hitting at least 60 FPS, and the reason it isn't is poor optimisation (it was rushed out the door to meet Paradox's delivery date). Even the highest-end hardware, a 13900K and an RTX 4090, becomes GPU-bottlenecked at large city sizes.
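
Roughly speaking, each frame the CPU runs the simulation while the GPU renders, and whichever side is slower sets the frame rate, so the GPU can be the limit even though the simulation is CPU work. A toy model with made-up numbers, not profiler output:

# Whichever side takes longer per frame sets the frame rate.
scenarios = {
    "small city": {"cpu_ms": 8.0, "gpu_ms": 12.0},
    "large city": {"cpu_ms": 14.0, "gpu_ms": 28.0},
}

for name, cost in scenarios.items():
    frame_ms = max(cost["cpu_ms"], cost["gpu_ms"])
    limiter = "GPU" if cost["gpu_ms"] >= cost["cpu_ms"] else "CPU"
    print(f"{name}: ~{1000 / frame_ms:.0f} fps, limited by the {limiter}")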
Cuco Oct 30, 2023 @ 8:20pm 
Originally posted by playboi14:
If someone has to explain the difference between 30 FPS and 60 FPS when it comes to fluidity and smoothness and why it's preferable then I don't think you're discussing the issue in good faith.

There's a distinct difference between 30 and 60 FPS, which is why games offer both a performance mode that generally targets 60 FPS and a fidelity mode at 30.

The overall annoyance from people is that with this level of hardware it should be hitting at least 60 FPS, and the reason it isn't is poor optimisation (it was rushed out the door to meet Paradox's delivery date). Even the highest-end hardware, a 13900K and an RTX 4090, becomes GPU-bottlenecked at large city sizes.
Well, the simulation runs on the CPU, not on the GPU. So how is it GPU-bottlenecked?
Littlephrenic Oct 30, 2023 @ 8:30pm 
Originally posted by GnuHorn:
Originally posted by Littlephrenic:
I do not understand the appeal.
Why is approximately 30 (NTSC) or 25 (PAL/SECAM) fps not sufficient?

ELY5: You're talking about movie standards with tightly controlled, constant frame rates. This is not a movie; it's a video game with an uneven frame rate and uneven processing requirements, where averages poorly represent actual performance, and where even a steady 30 fps can still generate significant motion and screen discordance, which people are variably sensitive to.

ELY25: The research you're citing relates to movie playback on a large screen from a fixed position. The standard comes from a period when film stock was extremely expensive, and the fixed position of the screen relative to the viewer (a movie theater or a large TV setting) meant that a frame rate that was too low was commonly jarring, while a frame rate that was too high was slightly nauseating for some people.

The 30/24 frame standard we commonly see in movie formats is a compromise there.
I really appreciate your explanation and I'm not going to quote all of it.
The research I am referring to is the theory of persistence of vision, which argues that there is an upper limit to how many images we can see, and that it is likely below 60, though researchers frequently fight about it. This research hasn't always been from a film perspective or from a fixed location.
But the reason we have the frame rates we do now is something I'm really familiar with, and I went with those standards to avoid the film issue. Most early video games ran at video frame rates, and I'm not sure when that fully stopped.
I do video and film preservation, and this is a debate we have periodically about whether more is better. Standardization also causes issues with playback. Early cinema was mainly restricted by how quickly people could wind film manually, so it's often around 12 to 18 fps; this is why early movies look like everyone is moving oddly and too fast. Animation is fun because they used fewer frames and doubled many of them. With 24 fps, I agree that it was to cut costs, but with video the frame rate was chosen because of electrical standards and the mechanical limitations of video heads. Technically, NTSC black and white is 30 fps and color is 29.97 fps, but no one should have to remember that, and with PAL or SECAM both black and white and color are 25 fps because the color values are encoded differently.
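If anyone wants the exact NTSC color number, it's just this bit of standard arithmetic:

from fractions import Fraction

# NTSC color runs at 30 * 1000/1001 frames per second; the slight slowdown was
# chosen, roughly speaking, to keep the color subcarrier from interfering
# visibly with the sound carrier.
ntsc_color = Fraction(30_000, 1_001)
print(f"NTSC color: {float(ntsc_color):.5f} fps")  # 29.97003
print("NTSC black & white: 30 fps, PAL/SECAM: 25 fps")
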
Most born-digital material sticks with these standards or bumps them a little, but there are logistical challenges if it doesn't, such as digital projectors or players (in a broad sense of the word) that are expecting broadcast standards.
I just feel like the issues being fixed by bumping the frame rate way up aren't truly related to fps, and might be better addressed by interlacing or by optimizing the compression algorithms so that information could be transmitted more efficiently.