One, framerate is entirely subjective as to what's smoothest, or acceptable, or pleasing, or whatever. One person is fine with 30 FPS, another isn't. So you CANNOT put a finite figure on it except by experimenting yourself. There is NO answer there.
Two, as others have said, it doesn't matter what the maths are behind this; the simple rule is that unless you choose a framerate that divides evenly into what your GPU and monitor like, you can get tearing and other artifacts.
So find out what that framerate is and pick a multiple that pleases you without issues.
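A minimal sketch of that "even divisor" idea, assuming a plain fixed-refresh monitor; the 144 Hz figure is only an example, not anyone's actual display:

```python
# Rough illustration of the "even divisor" rule: list frame caps that fit a
# whole number of times into the monitor's refresh rate, so every game frame
# is held on screen for the same number of refresh cycles. 144 Hz is just an
# example value here.

def even_divisor_caps(refresh_hz: int) -> list[int]:
    """Frame caps that divide evenly into refresh_hz."""
    return [refresh_hz // n for n in range(1, refresh_hz + 1) if refresh_hz % n == 0]

if __name__ == "__main__":
    for cap in even_divisor_caps(144):
        print(f"{cap:>3} fps -> each frame held for {144 // cap} refresh cycle(s)")
```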
The point is well taken. I have done some experiments with Afterburner and have determined that Halo Reach at 1920x1080, for example, runs at a pretty stable 70 FPS, with small dips to 69 during intense action. I tried 72, which is half the refresh rate of my monitor, and while it was still good, the dips went a little lower, to 67. At 1368x768, I achieved a stable 83 FPS. I am trying to use frame rates whose frame time in milliseconds comes as close to a whole number as possible.
Sniper Elite V2 Remastered runs great at 1080p, 91 FPS locked. I don't know what was different about that game, but it was one of the few that remained constant.
I have tested many games and my results vary wildly. For an I-4470k, 1060 3 GB, 16 GB RAM rig at 1080p, what would you suggest is generally the sweet spot for a high, dynamic frame rate?
Simple - what works for you.
There is NO finite number; that's kind of the point. Or at least we don't know your system and the software on it, so it's only a number that is pertinent to YOUR situation.
The refresh rate of your monitor is what you should aim for, pure and simple. Worrying about one or two FPS here and there, as you're doing, is utterly pointless.
Just set it to the same as your monitor refresh rate and leave it at that.
The few FPS you claim are "varying wildly" simply aren't.
Are you seeing massive screen tearing at all? If not, then leave it alone. You're getting too hung up on numbers and not focusing on what matters - how it looks to YOU.
To me, 72fps looks pretty solid. It may be the framerate I like the most and will stick with.
On a separate note, I would like to understand the process by which the various developers round the millisecond figure. I have seen two methods: rounding down to the nearest figure, e.g. 62 FPS shown as 16.1 ms and 90 FPS as 11.1 ms, and rounding up, like my 144 Hz monitor's 6.9 ms being treated as 7 milliseconds.
I am not too familiar with frame rate calculations. Between rounding the figure up to the next number and rounding down to the nearest, which process is more efficient in the modern day?
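For what it's worth, the millisecond figures being discussed all come from the same arithmetic, frame time = 1000 / fps; a quick sketch (assuming nothing beyond that formula) shows the rounding is purely a display choice, not something the game processes differently:

```python
# Frame time in milliseconds is 1000 / fps; whether a tool shows 16.1
# or rounds 6.94 up to 7 is only how the value is displayed.
for fps in (60, 62, 62.5, 72, 90, 91, 144):
    ms = 1000 / fps
    print(f"{fps:>5} fps = {ms:.3f} ms per frame (commonly shown as ~{round(ms, 1)} ms)")
```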
It's a myth, it's been tested about 45 times, and it's well known it's a myth.
UGH.
Anything above 60 FPS your eyes might not see perfectly, but that also ties into how good your eyesight is. :P
Yeah, that's what I do too. I don't even know the hotkeys/functions/whatever to show fps...
For myself, I also have a rather old monitor -- no variable framerates or anything; I have no idea what the rules are for those.
For a standard 60 Hz monitor, 60 fps will do fine. Anything above that is a complete waste of time; as far as I'm concerned, I don't even care whether it's actually 60 or lower. I ran one benchmark a while back, for Tomb Raider 2013, to figure out settings for my Radeon 5770 card; nowadays I'm sporting a GTX 1060, so I don't need benchmarks like that anymore.
Technically, and assuming you can actually spot such differences, if you can't hit 60 fps on such a monitor you'd want to run at 30 fps. Why? Because your monitor displays 60 images per second no matter what; if your game delivers 30 images per second, the monitor shows every game image twice, giving a smoother experience than randomly showing some images once and others twice.
Or even flipping images once they are ready (i.e. running without VSync -- a big no-no if you want a decent experience), as that will display the old image in the upper part of the monitor, and a different image in the lower part, causing horizontal tearing of the displayed image. It might increase the fps number shown, but reduce overall output quality very noticeably... so you'd trade animation quality for a useless benchmark number.
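To put a number on that pacing argument, here's a minimal sketch assuming an idealised 60 Hz display with VSync on, where each refresh shows the most recently completed game frame (the 45 fps case is just an example of an uneven rate):

```python
# Idealised model: the game produces frames at a perfectly steady rate and
# each monitor refresh displays the latest completed frame. At 30 fps on
# 60 Hz every frame is held for exactly 2 refreshes; at an uneven rate like
# 45 fps, frames alternate between 1 and 2 refreshes, which reads as judder.

def refreshes_per_frame(refresh_hz: int, game_fps: float, seconds: float = 1.0):
    counts: dict[int, int] = {}
    for r in range(int(refresh_hz * seconds)):
        frame_index = int((r / refresh_hz) * game_fps)  # latest finished frame
        counts[frame_index] = counts.get(frame_index, 0) + 1
    return list(counts.values())

for fps in (30, 45):
    print(f"{fps} fps on 60 Hz -> refreshes each frame stays on screen: "
          f"{refreshes_per_frame(60, fps)[:12]}")
```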
It's not a universal number.
One game I played fine at 45 fps. Another game plays fine at 75 (and I guess 60), but is unplayable at 45.
What I've never found is a game that "needs" more than that.
You'd have to ask each dev for their own personal way of approaching this and what their standards are.
But you will NOT get a finite answer, as it differs, because as I said before, it's not quite what you think it is. As long as it looks reasonable and plays well, that's it.
But if you want those details, hunt some people down and ask them.
Muppet's point about some games being good at a certain framerate while the same rate is yuck on another I've experienced too. It obviously boils down to how they code and what they tie to that framerate. I've never liked the idea of tying framerate with physics as that can make a bloody mess (or hilarious messes if you watch speedruns).
For myself I've also had issues with the old screen tearing that plagued certain games around the Unreal 3 (?) era. The PS3 especially.
As I have about 7 TVs in my house (laughingly, only 1 for watching TV), I've played around with this a fair bit, and I can play a game on the PS3 on a certain TV and get awful screen tearing, then none at all on another. It's obviously down to framerates and, more importantly, circuits and trickery, but it's still worth mentioning, as even though things may stay the same, the results can still vary just by being on a different device.
One last thing I would like to know is how to choose between frame approximation and frame rounding in milliseconds. At 62 FPS the millisecond counter reads 16.1, and at 62.5 it reads 16.0. The same happens at 90 and 91, and at 100 and 101. All these rates land either just under or just over a round millisecond figure, or exactly on one, as 100 FPS does at 10.0.
Which of these two millisecond counter models offers better performance in modern games?
Well, Activision believed 91 FPS at 10.9 ms was best for older games at the peak of the classic games era, to correct any stutters or spikes during heavy image processing. Infinity Ward and Treyarch were of the same opinion for some of their games too.
id Software thought Doom 3 nailed it at 62.5 FPS, i.e. 16.0 ms, and other companies then contested it with 62 or even 63.
The 144 Hz refresh rate, at 6.9 ms, follows the same idea. All this is contrasted with the 240 Hz model at 4.1 ms, or 120 Hz at 8.3 ms.
The new Chinese monitor at 500 Hz, at 2 ms, brings us to a round figure.
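Worth noting how small the gaps between those quoted pairs actually are; a quick sketch (using only frame time = 1000 / fps, no claims about any particular game or developer) puts numbers on it:

```python
# Difference in frame time between the "rounded" pairs quoted in this thread.
for a, b in ((62, 62.5), (90, 91), (100, 101)):
    delta = 1000 / a - 1000 / b
    print(f"{a} fps vs {b} fps: frame times differ by {delta:.3f} ms")
```

That comes out to roughly a tenth of a millisecond either way, which is the other posters' point about not sweating one or two FPS.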
I understand the nature of the beast, no problem there. What I would like to know, though, is which millisecond rounding is more reliable for current, highly dynamic games.
They are not fractions of something real. Just lengths of time to the next image. Shorter or "longer".
Look, if it really mattered, you would know by now.