However, I capped the FPS to 40 and I find it very playable with framegen on.
With vsync disabled and Reflex on, the latency is not that big of a deal, unless you're playing with mouse and keyboard.
Again, not justifying Capcom here, but in my experience (at least in this game) framegen is not that bad at low fps.
It's a pretty straightforward metaphor. Framegen is a nice-to-have, but it isn't the actual base of the experience. If the game isn't running at a playable framerate without framegen, then it won't become playable *with* framegen, because the extra frames are purely cosmetic. So having framegen turned on when you don't have the "cake" to put it on is going to be an unpleasant experience.
I can break this down a bit further, since just saying "it adds latency" without explaining is gonna be confusing, so bear with me explaining some obvious stuff up front. Let's imagine a simple reflex game: the screen turns completely red on a randomly chosen frame, and you want to click to turn the screen blue as fast as possible. We're going to further simplify this by pretending we're on a monitor with a perfect 0 ms response time, and we're not factoring in the input latency of any devices; we just want to focus on framerate.
60 FPS is 16.66 ms per frame, 30 FPS is 33.33 ms per frame, and so on. So if you see a red flash on the screen for one frame at 30 FPS, it can take up to 33.33 ms from when you hit the button to when you actually see your input turn the screen blue. If you're playing at 60 FPS, however, the maximum delay is only 16.66 ms, so *just* by increasing the FPS we have halved our worst-case scenario for input latency.
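If it helps to see that arithmetic laid out, here's a tiny sketch of the frame-time math (plain numbers, nothing game-specific, still ignoring device and display latency like above):

```python
# Worst-case "button press to visible change" delay coming from framerate alone,
# under the same simplification as above (perfect display, zero device latency).
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    # Worst case: your click lands right after a frame was presented, so the
    # result can't show up until the next frame is displayed.
    print(f"{fps:>3} FPS -> up to {frame_time_ms(fps):.2f} ms before your input is visible")
```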
Now, framegen obviously puts out fake frames; it can't *really* react to your inputs. So if we had a theoretically perfect framegen implementation, we could have our test logically running at 30 FPS but appearing to run at 60 FPS. But we're still constrained by the real frames to actually respond to our inputs, so even though the neat little animation playing on the screen might look as though it's running at 60 FPS, our worst-case input delay is still 33.33 ms instead of 16.66 ms, right? Simple enough.
"But wait, I can't run the test at 60 FPS, so what harm is there in having the nicer animation?" Good question. In reality ,we don't have a perfect framegen implementation that *actually* just doubles the FPS. Instead, framegen, in order to create in-between frames for our game, has to have access to *both* the current frame *and* the previous frame, and then spend some amount of time processing both to get the resulting in-between frame. So we're always going to be at a minimum 1 frame behind what the game has *actually* put out, because in order for that fake frame to be displayed the game would have to have already made the next frame and has to wait for the fake frame's turn to end.
Now, in addition to this, the image generation itself takes up resources and isn't instant, so the AI needs to sit there "thinking" after it's gotten the next frame. This causes your *actual* framerate to drop, so instead of doubling from 30 FPS to 60 FPS, you might actually be going down to 25 FPS and then doubling that to 50 FPS: the displayed FPS is higher, but the logical framerate is lower.
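Putting made-up numbers on that (the per-frame overhead here is purely illustrative):

```python
# Framegen's own processing cost eats into the real framerate before doubling it.
base_fps = 30
overhead_ms = 6.7                                 # hypothetical framegen cost per real frame

real_frame_ms = 1000.0 / base_fps + overhead_ms   # ~40 ms per real frame
new_real_fps = 1000.0 / real_frame_ms             # ~25 FPS actually simulated
displayed_fps = new_real_fps * 2                  # ~50 FPS shown on the counter

print(f"Logical: ~{new_real_fps:.0f} FPS, displayed: ~{displayed_fps:.0f} FPS")
```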
So when you go to play a game, especially something sensitive like a shooter (i.e., using a ranged weapon in Monster Hunter), it can feel *weird* seeing a smooth display on your screen and then noticing it takes longer for your cursor to move when you aim - weirdly sluggish, as though you were playing at 30 FPS. That in itself can be disorienting.
And then, finally and perhaps most dramatically, AI is prone to hallucinations. The more you ask an AI to fill in the gaps, the more room you're giving it to hallucinate. At a higher framerate, the difference between two frames is quite small, so the AI isn't really given much room to mess things up, and the final result tends to be fine. But at a low framerate, the AI is being asked to make up a lot more, and this results in what's called "ghosting" - monsters spawn two jaws, the sword on your back suddenly has a forked blade, the gears in the furnace warp into a weird oblong shape, characters grow an extra ghost finger for a split second. These artifacts pop in and out every other frame in a flickering effect as the AI-generated frames alternate with the real ones, and it's just weird stuff that shouldn't be there, which can be viscerally upsetting.
And that's worse visually than just playing at a lower resolution without framegen. Framegen isn't helpful if it isn't *accurate*, and if you're using framegen to go from 20 FPS to 30 FPS, you're likely to run into these kinds of issues on top of the game feeling sluggish. It would genuinely be better to just suffer at 20 FPS, where what you see on screen is actually there and you're not adding extra input latency on top of what's already there from the low framerate.
To give a real-world number, 15 ms is a typical expected added latency from framegen, which can spike up to 100 ms if you hit your display's refresh rate limit. A bad TV has around 80-ish ms of lag, so framegen can at points make even a nice gaming monitor feel like a bad TV, and it can make a bad TV utterly insufferable. Whether that 15 ms is worth it is going to vary by game. Monster Hunter is in an awkward spot for framegen even if the game didn't perform so poorly: it's not an esports title where any amount of extra delay is completely unacceptable and will make you lose against people who have better equipment, but it's also not a chill exploration game with only mild action elements. There are parries, dodges with i-frames, aiming slingers and guns and bows - a lot of stuff that relies on skilled input - and the series as a whole can get quite difficult and demanding (though Wilds so far seems quite a bit easier, oh well), but it's not demanding *enough* for people to just say "this is never OK." It's enough to tempt people to try it and then get frustrated when it messes something up, and of course it's in this sweet spot where people on the internet will argue about it.
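For scale, here are those ballpark figures side by side (the values are the ones quoted above, not measurements of mine):

```python
# Comparing the quoted latency figures against a 60 FPS frame time.
frame_time_60fps_ms = 1000.0 / 60   # ~16.7 ms
framegen_typical_ms = 15            # typical added latency quoted above
framegen_capped_ms = 100            # quoted spike when you hit the refresh-rate limit
bad_tv_lag_ms = 80                  # rough display lag of a bad TV

print(f"One 60 FPS frame:            {frame_time_60fps_ms:.1f} ms")
print(f"Typical framegen penalty:    {framegen_typical_ms} ms (about one extra frame)")
print(f"Bad TV display lag:          {bad_tv_lag_ms} ms")
print(f"Framegen at the refresh cap: up to ~{framegen_capped_ms} ms")
```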
In framegen's defense, though, it's really nice for games that have an arbitrary FPS lock, like emulated games or really bad PC ports. If Elden Ring is already locked at 60 FPS no matter what, then it's not very hard to get it to run at 120 FPS with framegen for a much more fluid look without seriously impacting the feel of the game. Old-school MH games were locked at 30 FPS, and they look dramatically better running at 60 FPS - or even 120 FPS with multi-framegen - because you're not even close to using your system's full resources to run the game.
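Here's the same kind of back-of-the-envelope math for the locked-framerate case (all the timing numbers are hypothetical, just to show why the headroom matters):

```python
# When the game is capped well below what the GPU can do, framegen's cost
# comes out of idle time instead of out of the base framerate.
lock_fps = 30
frame_budget_ms = 1000.0 / lock_fps   # ~33.3 ms available per locked frame
render_ms = 8.0                       # hypothetical: GPU only needs 8 ms per real frame
framegen_ms = 3.0                     # hypothetical cost per generated frame

spare_ms = frame_budget_ms - render_ms
cost_4x_ms = 3 * framegen_ms          # three generated frames for 4x multi-framegen

print(f"Idle time per frame: {spare_ms:.1f} ms, 4x framegen needs only {cost_4x_ms:.1f} ms of it")
```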
Some say you have to re-install the Nvidia app.
I had to manually add MHW and then restart the app.
Then go to MHW in the app, scroll down to driver settings, and set it there. Just choose 'latest'.
Generated frames don't do anything to input latency; all framegen does is insert generated frames in between two native frames in the render pipeline for your display. If you're getting a native 60 fps and 90 fps with framegen, then the game will continue to run at 60 fps natively, and you will have the exact same input latency as you would with the game running at 60 fps natively.
Having no input on fake frames is only an issue in games where individual frames and fast reaction times matter, such as twitch shooters and MOBAs. In a game like Monster Hunter it doesn't matter that much.
And the lower you go, the worse the input delay is.
You can literally google this stuff and find an instant answer to see you're wrong.
And you literally contradict yourself in the same post.
Read the user Helmic's post above; they explain it perfectly.
No, I am playing on a Samsung 65-inch OLED S90C.
Alright I was giving the simplified non-nuanced version but if you want to get technical then I'll get technical with you.
Buckle the ♥♥♥♥ up.
There is a very slight end-to-end increase in input latency created by the fact that frame generation requires processing time to generate frames. This is a flat increase of about 10 ms added to the total processing time of the whole input-to-visible-effect chain, and it's roughly comparable to the gap between the native frames that the generated frames sit between. You click your mouse, the input gets sent, the input is processed, the processed input is sent to the game, the game interprets the input, the interpreted input gets sent to the renderer, the game renders frames, your GPU generates a frame based off the native frames sent to it, the game renders more frames, and you see these frames on your screen. That whole chain is what end-to-end input latency encompasses, but this tiny blip is virtually imperceptible to the overwhelming majority of people. Even with this slight increase, it's still less than half the average total latency that most consoles have (around 40-50 ms total end-to-end latency on PC with frame gen vs 90 ms on consoles based on DigitalFoundry testing, and that's on the high side as well - with a solid system and good input hardware you'll probably see significantly lower end-to-end latency).
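As a rough sanity check on those totals (the component values here are illustrative; only the ballpark totals echo the DigitalFoundry-style numbers above):

```python
# Back-of-the-envelope end-to-end latency comparison.
pc_base_latency_ms = 35      # hypothetical input-to-photon latency without framegen
framegen_cost_ms = 10        # the flat processing cost described above

pc_with_framegen_ms = pc_base_latency_ms + framegen_cost_ms   # ~45 ms
console_reference_ms = 90                                     # quoted console ballpark

print(f"PC with framegen: ~{pc_with_framegen_ms} ms end to end vs ~{console_reference_ms} ms on console")
```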
So in summary, no, turning on framegen is not going to increase your input latency in any meaningful way outside of two scenarios. One: your native framerate is so low that the fake frames are on screen long enough for inputs made on those frames to be visibly delayed until the next natively rendered frame - in which case your input hasn't had its delay increased; it's exactly as responsive as the natively rendered game at that framerate would be. Two: you exceed your monitor's refresh rate, at which point input latency skyrockets because the system is trying to buffer more fake frames than it can keep up with.
If you're already getting good performance and are not hitting your monitor's refresh rate, then turn on Framegen. There is literally no downside to it. You will simply get a smoother visual experience. I can guarantee you that 10 ms of increased end-to-end input latency is not something you will ever notice, and unless you play twitch shooters like CS or Valorant or MOBAs like DOTA 2 or League of Legends at a professional competitive level, that input delay won't mean anything for you.
If you're getting ♥♥♥♥♥♥ performance, then all Framegen will do for you is give you more ♥♥♥♥♥♥ frames, so don't turn it on.
If you're at your monitor's refresh rate, then you literally cannot display a higher framerate anyway, so don't turn it on or it WILL increase your input latency by a noticeable amount.