The OP is correct: you're gonna notice little difference between 30 FPS and 60 FPS. I'm running my stuff on a beast, which outputs to a 4K 120 Hz widescreen monitor. Yet I run most games at 30 FPS.
What? I'll miss the motion blur on a passing train? SOMEONE SAVE ME!
That I won't get my "money's worth" because a couple of settings need to be ratcheted down? Puh-LEEZE.
Do y'all think CO built this game on Cray III supercomputers? And then foisted it on an unsuspecting community relying on off-the-shelf hardware?
Of course, AMD, Intel, and the other hardware giants LOVE to see these arguments since it causes folks to buy faster (and more expensive) hardware that will be obsolete in a few months. As the old saying goes, it's the software that drives the hardware.
Get a 144 Hz screen, then you'll learn.
Well... I smell what I'm stepping in here.
Well, it seems this thread is the advertisement train for the CS2 "Deal With It" train. I'm about to deal with it, all right... I'm downrating my review. Thanks for showing me the truth.
ELI5: You're talking about movie standards with tightly controlled, constant frame rates. This is not a movie; it's a video game with an uneven frame rate and uneven processing requirements, where averages poorly represent actual performance, and where even a steady 30 fps can still produce significant motion and screen discordance, which people are variably sensitive to.
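To make the "averages poorly represent actual performance" point concrete, here's a rough Python sketch (the frame times are made up for illustration): two runs with the same average fps can feel completely different to play.

```python
# Hypothetical frame times in milliseconds: both runs average ~30 fps,
# but one is perfectly steady and the other hitches badly.
smooth = [33.3] * 100                    # steady ~30 fps
spiky  = [20.0] * 90 + [153.0] * 10      # mostly fast, with heavy hitches

def average_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    # average fps over the slowest 1% of frames
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    return 1000 / (sum(worst) / len(worst))

for name, run in [("steady", smooth), ("spiky", spiky)]:
    print(f"{name}: {average_fps(run):.1f} avg fps, "
          f"{one_percent_low_fps(run):.1f} fps 1% low")
```

Both runs print the same ~30 fps average, but the second spends real time sitting on 150 ms frames, which is exactly the kind of stutter an average hides.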
ELI25: The research you're citing relates to movie playback on a large screen from a fixed position. The standard comes from a period when film stock was extremely expensive, and the fixed location of the screen relative to the viewer (a movie theater or a large-TV setting) meant that a frame rate that is too low is commonly jarring, while one that is too high is slightly nauseating for some people.
The 24/30 fps standard we commonly see in movie formats is a compromise between those two problems.
Forget that anyone ever said people can't see more than 30/60 fps, or that it's an ideal for everyone. That's absolutely not the case; occasionally someone puts out a 48 fps (or higher) cut of a movie, like The Hobbit, and you can definitely see the extra frames, and some people definitely feel sick because of it.
But that's because it's viewed from a fixed location, where the viewer's body has no other input. The nausea comes from conflicting signals from the body (also a cause of nausea in VR applications) that create sensory discordance. In a video game, you are controlling the motion, and that completely changes the sensory experience by giving the body compensating information. The result is a more comfortable view at a higher frame rate for most people, since the added smoothness better matches how the eye sees real motion (the eye does not have a frame rate; it takes in continuous information from whatever it's looking at, which in frame terms is effectively infinite frames, so more smoothness in a video game's output means a better experience for the player).
Basically, when you are the one intentionally moving the view, the brain has different expectations about how much image information should be available.
Now, each frame above any given person's ideal range has diminishing returns. The difference between 30 and 60 is a lot bigger than the difference between 100 and 200, but what people can see depends on the individual (we all have different eyes and different neurologies that function at different levels of ability). Your ideal may be between 30 and 60. Probably most people's is... but not everyone's, and especially not in a situation where you're controlling the motion.
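The diminishing returns are easier to see in frame times than in frame rates; a quick back-of-the-envelope sketch:

```python
# Going from 30 to 60 fps shaves far more off each frame's delivery time
# than going from 100 to 200 fps does, even though both are "doublings".
def frame_time_ms(fps):
    return 1000 / fps

for low, high in [(30, 60), (60, 120), (100, 200)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: each frame arrives {saved:.1f} ms sooner")
# 30 -> 60: 16.7 ms sooner; 60 -> 120: 8.3 ms; 100 -> 200: 5.0 ms
```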
There's also the little problem of computer monitors.
Unlike projectors and CRTs, computer monitors have a refresh rate that the frame output has to sync with. Various technologies exist to smooth and align the GPU's output with the monitor's refresh rate, but they only work reliably above the 40-60 fps range.
FreeSync won't sync well below that. G-Sync will sync tightly fairly low, but still has problems with playback below 40-60 fps, as does v-sync, because matching the monitor's refresh cycle fundamentally requires a surplus of frames to hit the ideal target; falling short results in tearing, ghosting, and so on. Even when synced, the misalignment at a low frame rate will still be jarring below about 40 fps.
Basically, their target does not conform to how flat panel monitors ideally sync frames.
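A simplified model of what that looks like on an ordinary fixed-refresh 60 Hz panel with v-sync-style presentation (this is a toy illustration, not how any particular driver implements it): each finished frame is held until the next refresh tick, so frame rates that don't divide evenly into the refresh rate get an uneven hold pattern, which reads as judder.

```python
from fractions import Fraction
from math import ceil

REFRESH_HZ = 60  # fixed-refresh panel

def hold_pattern(fps, frames=9):
    """How many refresh cycles each frame stays on screen, assuming frames
    finish at a steady rate and are shown at the next refresh tick."""
    refresh, frame = Fraction(1, REFRESH_HZ), Fraction(1, fps)
    holds, last_shown = [], Fraction(0)
    for i in range(1, frames + 1):
        ready = i * frame
        shown = ceil(ready / refresh) * refresh   # wait for the next tick
        holds.append(int((shown - last_shown) / refresh))
        last_shown = shown
    return holds

for fps in (30, 45, 60):
    print(f"{fps} fps on a {REFRESH_HZ} Hz panel:", hold_pattern(fps))
# 30 fps -> [2, 2, 2, ...] (even), 60 fps -> [1, 1, 1, ...] (even),
# 45 fps -> [2, 1, 1, 2, 1, 1, ...] (uneven cadence, i.e. judder)
```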
Now, if the game reliably hit a 30 fps baseline and we didn't have hitching, other visual oddities, and clear signs of processing strain, that would probably be fine for most people... though there are frame chasers out there who just care about the frame count.
I'm not one of them. I would be fine with a consistent 30 fps baseline. Having said that, even with settings dialed in, I can definitely tell when I'm panning or moving the camera in-game that there are variable loading and processing issues in the engine. There's hitching and a general lack of smoothness to the image when moving, and while my eyes basically stop picking up significant gains in smoothness above around 100 fps, in this game even pushing 50 fps gives a substandard result, even with top-of-the-line last-gen hardware that should do better than that given the relatively plain quality of the visuals.
And I think that last part is a big part of the problem and why people are up in arms. I look at the game and I don't know where those frames are going. It could just be that the current scope of the game demands heavy GPU performance to render, but the game is kind of bland for what it costs to run. People will put up with a lot of performance issues if a game looks good... and sometimes this one does, but usually it's pretty bland. So in the end it comes down to people not seeing where the tradeoff shows up in the final result; being told "30 fps is the target, live with it, we'll tell you what you can and can't see" on top of that is what set people off. It's one thing to say you're working on making things better; it's another to talk down to a customer. That may not be what they were trying to do, but it's kind of how it came out.
Wouldn't it be more efficient to focus computing power on blending, or to run at a high fps but interlace the image so that only half of the pixels are rendered at any given time?
However, if you are watching standard definition video on tape, or you are playing older video games, they look better on a CRT and this is a hill I will die on.
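For what it's worth, here's a minimal sketch of the "only half the pixels per frame" idea asked about above (classic interlaced fields; purely an illustration, not something the game actually does):

```python
import numpy as np

# Interlacing in miniature: each tick only fills half the rows (even rows
# on even ticks, odd rows on odd ticks), so the displayed image is always
# stitched together from the two most recent half-frames ("fields").
HEIGHT, WIDTH = 8, 8
display = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

def render(tick):
    """Stand-in renderer: a full frame whose pixels are all 'tick'."""
    return np.full((HEIGHT, WIDTH), tick, dtype=np.uint8)

for tick in range(4):
    rows = slice(tick % 2, HEIGHT, 2)          # even or odd rows this tick
    display[rows, :] = render(tick)[rows, :]   # update only half the pixels

print(display)  # even rows hold tick 2, odd rows hold tick 3
```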
There's a distinct difference between 30 and 60 FPS, which is why games offer both a performance mode, generally targeting 60 FPS, and a fidelity mode at 30.
The overall annoyance is that, given the level of hardware people have, it should be hitting at least 60 FPS, and the reason it isn't is poor optimisation (the game was rushed out the door to meet Paradox's delivery date). Even the highest-end hardware, with a 13900K and an RTX 4090, becomes GPU-bottlenecked at large city sizes.
The research I am referring to is the theory of persistence of vision, which argues that there is an upper limit to how many images per second we can see, likely below 60, though researchers frequently argue about it. This research hasn't always been from a film perspective or from a fixed viewing position.
But the reason we have the frame rates we do now is something I'm very familiar with, and I went with those standards to avoid the film issue. Most early video games ran at video frame rates, and I'm not sure when that fully stopped.
I do video and film preservation, and whether more is better is a debate we have periodically. Standardization does cause issues with playback. Early cinema was mainly limited by how quickly people could crank film by hand, so it's often around 12 to 18 fps; this is why early movies look like everyone is moving oddly and too fast. Animation is fun because animators used fewer frames and doubled many of them. With 24 frames, I agree it was to cut costs, but with video the frame rate was chosen because of electrical standards, and also because of the mechanical limitations of video heads. Technically, NTSC black and white is 30 fps and color is 29.97 fps, but no one should have to remember that; with PAL or SECAM, both black and white and color are 25 fps, since the color values are encoded differently.
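For reference, the odd-looking 29.97 figure above is exact: NTSC color is defined as 30000/1001 fps, about 0.1% slower than the 30 fps black-and-white rate (this snippet just reproduces the standard numbers):

```python
from fractions import Fraction

ntsc_bw    = Fraction(30)            # NTSC black and white
ntsc_color = Fraction(30000, 1001)   # NTSC color ("29.97")
pal_secam  = Fraction(25)            # PAL/SECAM, black and white or color

print(f"NTSC color: {float(ntsc_color):.5f} fps")               # 29.97003
print(f"Slowdown vs 30 fps: {float(1 - ntsc_color / 30):.3%}")  # ~0.100%
```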
Most born-digital material sticks with these standards, or bumps them a little, but there are logistical challenges if it doesn't, such as digital projectors or players (in the broad sense of the word) that expect broadcast standards.
I just feel like the issues being fixed by bumping the frame rate way up aren't truly about fps, and might be better addressed by interlacing or by optimizing the compression algorithms so that the information could be transmitted more efficiently.