RE Village on PS4 at 45 FPS, as you say, is probably just a hardware limitation.
If you want to lock your framerate just go into your graphics card control panel and set it to whatever you want.
Simple as that.
Limiting below 60 has no use.
There was a time when developers said "We have it at 30 FPS for a more cinematic feel," which is just horrible.
So match the FPS to your monitor refresh rate and call it a day.
Hell, I feel the difference between 60 FPS/Hz and 75 (with G-Sync for extra smoothness) in Killing Floor 2 (which I admittedly play a lot meaning I may be overly sensitive to that tiny difference).
Look, I'm fine with The Witcher 3 running at 30 FPS on the Switch, because that's still The Witcher 3 with most of its polygons intact, on-the-go. As much as I love blowing that PC-master-race horn, it's got nothing on the Switch when it comes to mobile gaming.
From my personal experience, 60 FPS is fluid; anything below that is playable down to 40 FPS. Below that I feel input delay and games become hard to play, sometimes unplayable. If input delay isn't happening, I can stand 35-40 FPS, but I see choppiness.
My monitors max out at 60 Hz (at 4K and below), so I set max FPS to 60 and turn on vsync in many games to avoid tearing. Of course my computer can't run every game at 60 FPS, but that's my personal ideal framerate. On the other hand, I'm not one to push games to 120 FPS (even if I could), because I'd like to save power, and my eyes can't see more than 60 FPS anyway. I haven't experienced anything weird with physics when vsync is turned on. Some games don't do well with it, meaning they stutter, so I turn off vsync and allow tearing. It's very rare, but those games exist.
...... other than raising a low value.
But nothing about deliberately lowering it.
I'd like to set my FPS to a fixed value in order to experiment with the cinematic look, or with older PC frame rates. I read that id and other first-person-shooter developers once locked their games at 35 FPS, namely because of the old 70 Hz monitors (35 being exactly half the refresh rate). As a player who grew up with two generations of games, I'm interested in understanding the different frame rates, to better reason about how they evolved from the '90s onward.
To avoid redoing all that logic, they lock the frame rate to 30.
With PC games, unless the game needs 30 FPS for the above reasons, you should play at 60 at least. While console games are built to look good at 30, PC games are built to look good at 60 minimum, and they don't look right at 30, 36, or 45.
As many others have suggested, lock your frame rate to your monitor refresh rate or below. Common monitor refresh rates are 60, 75, and 144.
Try not to go under 60; as I've said, modern games are designed to run and look good at 60 or above, and can feel sluggish or choppy at lower frame rates.
I don't understand why, but it's enough to know that you yourself have an interest in this and are not misunderstanding something other people said.
BTW, I have played many id games and honestly I don't remember one being capped at less than 60. With 3dfx and Quake 2 deathmatch we got very fluid animation. Some games were capped at 60, yes, but those caps were settable. I still have Quake 3 Arena on my SSD, whose config says 'seta com_maxfps "85"'.
Anyway, as for how to set a max framerate when the game itself doesn't offer such an option: the graphics card control panel mentioned above is the usual way.
Frame locks in older games were arbitrary. At the time there was no hardware-accelerated video and no multithreading; the game logic and the video were processed in one go by the CPU.
This is why older games are unplayable on modern hardware, even when emulated, if the processing speed is not limited.
When CPUs started to become more powerful, developers had to choose how many frames they wanted to output every second and then "pause" the processing between frames, to avoid having the game run at excessive speed on more performant hardware.
When external accelerated hardware and multithreading became a reality everything changed, but that's another story :)
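To make that concrete, here is a minimal sketch of the "pause between frames" idea described above (Python just for readability; games of that era were C and assembly, and update_logic/render_frame are placeholder names, not from any actual engine):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame at 30 FPS

def update_logic():
    pass  # placeholder: physics, AI, input handling

def render_frame():
    pass  # placeholder: drawing the scene

while True:
    frame_start = time.perf_counter()

    # Logic and rendering run in one go, so game speed is tied
    # directly to the frame rate.
    update_logic()
    render_frame()

    # "Pause" for whatever is left of the frame budget, so a faster
    # CPU doesn't make the game itself run faster.
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

Remove the sleep at the end and the game runs as fast as the CPU allows, which is exactly why unthrottled old games are unplayable on modern hardware.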
I see. Thank you for sharing this information about classic game logic.
I am developing a better understanding of modern frame rates and I understand that higher frame rates are better. I would like to now understand which numbers are considered realistic.
One user says there is complete smoothness at 85 FPS, another says the stuttering completely drops off at 90, others say 120 FPS is the way to go for a truly enjoyable experience, and another says 200 is the most realistic, in the sense that you feel you are moving.
I have tried them all, barring 200, and am unsure which is the most lifelike FPS. Can you please tell me which one it is?
I mean let's start with cinematic 24FPS? What's so special about it? Is 24FPS some optimal viewing value they researched and arrived at? So why 24FPS?
And then in the case of console, they tend to run at sub 60FPS or sub 30FPS not because of some inspired decisions, not because cinematic FPS is a great target. It's still just a compromise between what the user can tolerate, what the budget hardware can manage, and what the developers think is important. The compromise between graphic quality, screen resolution and FPS. They can obfuscate things as much as they want. If a PS4 could run Resident Evil Village at 60FPS at max settings, they'd do that. 45FPS gives them pretty good performance and pretty good visuals given the PS4's limitations.
There isn't one. Your eye isn't a camera; different parts of your eye see at different "frame rates". There are a lot of variables involved. And then it's all wired up to a brain that's doing a thousand hacky things to efficiently process and approximate the data. It's why optical illusions are a thing: evidence of your brain trying its best but falling over a little bit.
I will say that more FPS is generally better. However, the concept of diminishing returns applies. The higher you go, sure, there's still some benefit, and you may be able to see it, but at some point it's just good enough. Doubling something that's already excellent just doesn't have the same impact anymore. Going from 30 to 60 FPS is huge. Going from 90 to 120 FPS is a lot smaller, because 30 FPS is just barely tolerable while 90 FPS is a pretty stellar experience for most games; more is nice, but you'll still be OK with only 90 FPS.
And at 90 FPS, if you dip into the 80s or the 70s you'll still have a good experience. But dipping from 30 FPS into the 20s or teens is a pretty mediocre experience, because you're already near the floor.
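To put rough numbers on that (my own arithmetic, not from the posts above): the diminishing returns are easier to see in frame time than in FPS, since each frame's budget is 1000/FPS milliseconds.

```python
# Frame time in ms is 1000 / FPS. The same 30 FPS step buys far
# less time per frame the higher you already are.
for low, high in [(30, 60), (60, 90), (90, 120)]:
    saved_ms = 1000 / low - 1000 / high
    print(f"{low} -> {high} FPS saves {saved_ms:.1f} ms per frame")

# Output:
# 30 -> 60 FPS saves 16.7 ms per frame
# 60 -> 90 FPS saves 5.6 ms per frame
# 90 -> 120 FPS saves 2.8 ms per frame
```

The same logic covers the dips: falling from 90 to 70 FPS costs about 3 ms per frame, while falling from 30 to 20 FPS costs 16.7 ms per frame, which is why the latter feels so much worse.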
In general I'd argue a consistent experience is best. A solid 80FPS is probably better than bouncing around between 70-99FPS.
At the end of the day it's all subjective and whatever feels best to you is the right answer.
Under 24 frames per second you can clearly distinguish each frame as it is drawn on the screen.
That means 30 FPS should be enough to play, and many older PC games and even newer console games are rendered at 30 FPS.
But the human eye can move, and when the eyes focus on a moving object they try to follow it across the screen, making the low frame rate more evident. This is where stuttering and jerkiness start to appear.
How many frames per second you need is mostly tied to how fast the objects are moving on the screen. Slow-moving objects are easily tracked at low FPS, but fast-moving objects aren't, because they literally seem to "jump" across the screen.
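To put a number on that "jump" (my own example, with made-up but plausible figures): take an object that crosses a 1920 px wide screen in half a second.

```python
# How far an object moves between consecutive frames.
# Assumed example: crossing a 1920 px screen in 0.5 s.
screen_px = 1920
cross_time_s = 0.5
speed = screen_px / cross_time_s  # 3840 px per second

for fps in (30, 60, 144):
    print(f"at {fps} FPS it moves {speed / fps:.0f} px between frames")

# Output:
# at 30 FPS it moves 128 px between frames
# at 60 FPS it moves 64 px between frames
# at 144 FPS it moves 27 px between frames
```

A 128 px gap between two positions of the same object is large enough for the eye to read it as a jump rather than as motion.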
So the answer to your question is "it depends on the game you want to play". Games with slow-moving objects can be played at 30 FPS without issues, but faster-paced games require 60 or more.
Usually 60 FPS is enough for every game, unless you are playing a competitive first-person shooter at a professional level with a top-tier monitor; then 144 frames are preferable.
Theoretically the human eye can track a moving object at up to 200-230 frames per second, but in my opinion going over 144 is pointless because there is very little gain.
A very insightful reply. Thank you for taking the time to write it. I can see that FPS in gaming is a more complex thing.
Taking into account the need for games to be rendered at a suitable frame rate, a number of games made a common value a de facto setting. Mafia (2002) was capped at 63 FPS, Doom 3 was capped at 62.5, the Batman Arkham series until Knight was capped at 62, and so was the first Chivalry. All these games used engines that determined that two frames over sixty offered the best balance of smoothness and graphical quality, that much is easy to see; but what was the need for those two extra frames over an already decent amount? No game that I know of in recent years has done it. Did they do something to enhance the frame rate?
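One possible explanation, and this is pure speculation on my part rather than anything confirmed for these specific engines: a frame limiter that sleeps a whole number of milliseconds per frame cannot hit exactly 60 FPS, because 1000/60 ≈ 16.67 ms is not an integer. Rounding the frame budget down to 16 ms lands the cap at exactly 62.5 FPS:

```python
# FPS caps reachable with an integer-millisecond frame budget.
# 60 FPS needs 16.67 ms per frame, which a 1 ms timer can't hit;
# the neighbouring integer budgets give these caps instead:
for budget_ms in (15, 16, 17):
    print(f"{budget_ms} ms per frame -> {1000 / budget_ms:.1f} FPS cap")

# Output:
# 15 ms per frame -> 66.7 FPS cap
# 16 ms per frame -> 62.5 FPS cap
# 17 ms per frame -> 58.8 FPS cap
```

That would make the "two extra frames" a side effect of timer granularity rather than a deliberate tuning choice, but I can't say for certain that's what those engines were doing.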