Monster Hunter Wilds

168cm 83kg 13cm Feb 11 @ 4:32am
30 FPS should be your target and you'll never need more
I keep seeing people unironically whining about getting 40-60 FPS and calling it "bad performance". Well, let me tell you something: it’s NOT. Honestly, it’s embarrassing how spoiled gamers have become these days. Your precious AAA titles are running just fine, but here you are crying because you’re not hitting some arbitrary 100+ FPS threshold. Imagine being so entitled that you can’t enjoy a game unless your frame counter is stroking your ego.

Let me hit you with the atomic redpill - 30 FPS is the perfect sweet spot for gaming. It’s buttery smooth, cinematic, and honestly, anything more is just overkill. And let’s not forget: the human eye can barely detect anything above 30 FPS anyway. Anything higher is placebo at best, and at worst, it’s just inflating the egos of gullible gamers who love throwing their money away for imaginary numbers in the corner of their screen.

I’ve got a high-end gaming rig that should be able to push 100+ fps in the majority of games I play. But instead, I use a frame limiter to cap everything at 30 FPS.
Why? Because I’m smart :yk2strength:
I don’t need my rig running at full power just to feed me unnecessary frame rates. 30 FPS gives me the optimal gaming experience and is exactly how games were meant to be played.

30 FPS master race will always reign supreme. If you disagree, you’re probably one of those delusional people spending more time in Nvidia Control Panel than actually playing the game.

Cheers~
Last edited by 168cm 83kg 13cm; Feb 11 @ 5:06am
Showing 16-30 of 101 comments
brothers and sisters, you are literally arguing over the MOST POINTLESS ♥♥♥♥♥♥♥ THING EVER, like let the man have his opinion dawg, i mean he is 168cm, 83kg, 13cm. You all are malding in your rooms, like let opinion be opinion. But for the sake of the argument i wanna chip in as well:

The human eye can see somewhere between 30 and 60 frames per second; that much is true, even though some people may argue one can see more, but further research needs to be done there.

I can see why some of you Soyjack malders need a high FPS count for games, but let’s be real—this is Monster Hunter we’re talking about. Not some game where you can bust out the Griddy as Goku, hit a 360 no-scope on Hatsune Miku, and then get absolutely clowned on by a 12-year-old squeaker yelling at the top of his lungs "omg i hit a 360 noscope im so sigma skibidi rizz". The only time you need peak reaction speed in Monster Hunter is when you're cooking a steak and praying to the Elder Gods that you don’t overshoot into well-done territory.

Also, most of your beloved modern cinematic masterpieces are stuck at 24 to 30 FPS, yet you don't see people malding over that. Meanwhile, my guy with the legendary height of 168cm, an awe-inspiring 83kg mass, and a truly benevolent 13cm has already reminded you that many prehistoric games were frame-capped—and guess what?
eb Feb 11 @ 6:39am 
reported for trolling, baiting, and clown farming
Are you kidding? This is a boss-fighting combat RPG, not a cinematic story action-adventure game like GOW.
Half-baked trolling...
There are two things to framerate: smoothness and input latency.

In my own opinion (and probably most others' too), smoothness reaches a milestone at 90 FPS, where the slideshow effect is hardly noticeable anymore, meaning you can no longer make out single frames.
But for input latency I would say 144 FPS is preferable.

Sure, 30 FPS is better than nothing, and if you play a game like Monster Hunter then input lag doesn't really matter, so you could use framegen to go 30 => 120.
Edit:
From a professional standpoint, input latency below 3 ms is recommended, which is why I take 144 FPS as the milestone.
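
For anyone who wants to sanity-check the milestones above, here are rough per-frame times (render interval only; the rest of the input chain adds more on top, so treat these as lower bounds, not total latency):
[code]
# Per-frame time at common frame rates (render interval only; the full
# input chain adds more on top, so these are lower bounds, not totals).
for fps in (30, 60, 90, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000.0 / fps:5.2f} ms per frame")
[/code]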
Last edited by Grumpy Giuseppe; Feb 11 @ 7:08am
BEEP! Feb 11 @ 7:07am 
Originally posted by Grumpy Giuseppe:
*snip*
No, because the fewer real frames you have, the more input delay you get with framegen. If you use framegen at a 30-40 FPS base, it feels like you're playing in the 20s. It's that bad, and that's the main reason NVIDIA and AMD both say to use a stable 60 FPS base at minimum, preferably a stable 70+.

So basically, playing at a locked 30 or 40 will look better and play better than an unstable 30-40 with framegen.

Plus there's how god-awful the game will look, with smearing and artifacts, when you start using framegen under 60 FPS, and the benchmark doesn't even begin to show how bad it can get, because the camera never moves around quickly during the benchmark.

I would honestly just recommend that anyone who can't get a stable 60 FPS and hangs around the 30-40s lock the game at 30 or 40 FPS. That way the game will look better, feel better, and be smoother and more stable than what framegen can offer them at those frame rates.

Framegen is cool tech, but it's not built for base frame rates under 60 FPS. It's not miracle tech, after all: it needs frames to generate frames, and the fewer frames it has to work with, the worse the input delay and the worse it will look.
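
A very crude sketch of why the base frame rate matters so much: interpolation-style framegen has to hold back at least one real frame before it can slot a generated one in between. Assuming exactly one buffered frame (my simplification, not NVIDIA's or AMD's documented pipeline), the added delay scales like this:
[code]
# Crude latency model for interpolation-style frame generation.
# Assumption (mine, not vendor docs): one real frame is buffered before
# generated frames can be shown, so delay scales with the BASE frame rate.
def native_ms(base_fps):
    return 1000.0 / base_fps

def framegen_ms(base_fps, buffered_frames=1):
    # render time for the current real frame + the frame(s) held back
    return native_ms(base_fps) * (1 + buffered_frames)

for fps in (30, 40, 60, 70):
    fg = framegen_ms(fps)
    print(f"base {fps:>2} fps: native ~{native_ms(fps):.0f} ms, "
          f"framegen ~{fg:.0f} ms (feels like ~{1000.0 / fg:.0f} fps)")
[/code]
Which lines up roughly with the "30-40 base feels like the 20s" point.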
Last edited by BEEP!; Feb 11 @ 7:10am
People don't buy a $1000+ GPU to play at 30 FPS, or even 60.
Fake frame generation is not helping at all; the game feels unresponsive and looks like crap with such a low amount of real frames.
The real truth is that Capcom wants to use this garbage-tier engine in open worlds to save money, and that was obviously a mistake.
Originally posted by 1080Puktra:
*snip*

Yeah, you are not wrong, but the slow combat in MH and the controller acceleration make the input delay very bearable, and I am OK with a few artifacts if I can get at least 60 FPS.
Lossless Scaling 3.0 did a good job there, but AMD's frame gen was awful.
Originally posted by Mechanique:
Originally posted by Necropants:
Dumbest post I have ever seen on this board, and that's quite the achievement.

No, 30 FPS is not "cinematic"; most games were 60 FPS back in the day, btw.
30 FPS is choppy, causes motion sickness for some of us in some engines, and worst of all adds input latency, which has a tangible negative effect on gameplay.

I'm personally happy with 90 FPS in games, but anything lower than 60 FPS is not a good experience in 2025.

My guy is so FPSpilled he gaslit himself into thinking he can see the difference between 60 and 90 FPS.

Also, all games were 30 FPS for decades and it did not impact the games' quality in the slightest. 60 FPS is cool and all, but ultimately unnecessary.

I have been trying to resist, but you have been suggesting in many discussions now that high FPS is not perceptible, or perhaps not useful, and this is simply wrong.

Hardware Unboxed: a very easy-to-consume video on the topic
https://www.youtube.com/watch?v=OV7EMnkTsYA&pp=ygUpaGFyZHdhcmUgdW5ib3hlZCBoaWdoZXIgZnJhbWVyYXRlIG1hdHRlcnM%3D

A "slightly" more science based video. The portions about rapid and inconsistent changes in what we see is important to understand.
https://www.youtube.com/watch?v=FhSHeYT2U70

The clarity of motion that a higher FPS can provide is easily discernible from 60 to 120 FPS. The ability to discern abrupt changes such as view changes, explosions, enemies peeking out, etc. is MUCH higher than how fast we can perceive EVERYTHING in front of our eyes. A higher FPS makes perceiving changes at many different levels and in many different forms not only easier but in many cases possible at all.

Our eyes and minds are not like cameras and do not rely on FPS. Higher FPS does help us to see more clearly though. Yes, there is a point where our perceptions and quite literally mental hacks simply cannot keep up. The limit is different depending on WHAT we are looking at and HOW we are looking at it but getting into that would require an even larger essay than this post already is.

Input latency is another factor, and this has nothing to do with visuals. Another way to look at FPS is how fast the game simulation is running. The faster the simulation runs, the faster your inputs into the game can be processed. Input lag affects how well you can play the game, from attacking and defending to coordinating with other players. A player who is lagging in a co-op game is definitely noticeable.
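
To make the simulation-rate point concrete, here is a bare-bones single-threaded game loop in Python (purely illustrative, not how any real engine is structured): input sampled at the top of a frame cannot reach the screen until that frame has been simulated and rendered, so the frame time is a hard floor on input-to-display delay.
[code]
import time

# Minimal, purely illustrative single-threaded game loop. Input sampled at
# the top of a frame can't be shown until that frame is rendered, so the
# frame time is a hard floor on input-to-display latency.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms at 60 fps

def poll_input():
    return {}              # placeholder: read controller / keyboard state

def simulate(state, inputs, dt):
    return state           # placeholder: advance the game simulation by dt

def render(state):
    pass                   # placeholder: draw and present the frame

state = {}
for _ in range(3):         # a few iterations instead of a real endless loop
    start = time.perf_counter()
    state = simulate(state, poll_input(), FRAME_BUDGET)
    render(state)
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:          # simple fixed frame cap
        time.sleep(FRAME_BUDGET - elapsed)
[/code]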

60 FPS or 120 FPS is nowhere close to the limit of human perception with regard to clarity and rapid changes to objects in our field of view. If humans only perceived an entire frame or nothing at all, then yes, 60 FPS would absolutely overwhelm both our senses and perception, but how we perceive the world around us simply does not work like this.

--------------------------

I am not saying we need 240 FPS in order to play MH:Wilds. I am definitely saying 60 FPS or 90 FPS is nowhere near the limit of what can be perceived, useful, or enjoyable.

Saying that 30 FPS did not impact the quality of games on many levels in the past is objectively false. By no measure or contortion does this even approach being true.

30 FPS was acceptable only because nothing was better or could be made available at the time.

This is like saying an abacus dumpsters your cell phone...
Last edited by Simulacrum111; Feb 11 @ 7:29am
if u blink more 24 fps is playable
PROX Feb 11 @ 7:25am 
Originally posted by Rayleeigh:

Also, most of your beloved modern cinematic masterpieces are stuck at 24 to 30 FPS
too bad we are playing video games and not watching movies.
Last edited by PROX; Feb 11 @ 7:26am
Originally posted by uwu:
if u blink more 24 fps is playable

I admit that made me laugh :)
Originally posted by Simulacrum111:
*snip*
Overall great post, but I'd like to add two things.

1) 30 FPS is absolutely playable for 99% of the population. The higher the FPS, the smoother and more "clean" (can't think of another word to describe it) it looks, but anyone claiming that 30 FPS is "stuttery" or "choppy" is on that good ♥♥♥♥ (I'd ask for some, but my company works with the USG and I don't want to lose my job). 30 FPS is not as smooth as 60 FPS, but it's only when you get down to 23 FPS and below (or really any FPS that starts tearing because of desync with your monitor's refresh rate) that stutter is an issue.

And 2) REFRESH RATE! If your monitor only has a 60 Hz refresh rate, then getting 240+ FPS won't look any different from 60 FPS and honestly may look worse. I have a 21:9 ultrawide 60 Hz monitor; there's literally no reason for me to ever want to push past 60, as my monitor would then be my limiting factor. What's great though is that 30 FPS divides evenly into 60, 90, and 120 Hz, so if you aim for 30 FPS as a baseline you can have a non-tearing frame rate on 60, 90, or 120 Hz monitors (see the quick check below). 144 Hz monitors are honestly a weird spot and I'm not sure how they fit into this.
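
Quick arithmetic check on that "divides evenly" point, and on why 144 Hz is the odd one out for a 30 FPS cap (a cap that divides the refresh rate evenly means every game frame is shown for the same number of refreshes, i.e. consistent pacing without VRR):
[code]
# Which frame-rate caps divide evenly into common refresh rates? Even
# division means every game frame stays on screen for the same number of
# refreshes, i.e. consistent frame pacing without needing VRR.
caps = (24, 30, 40, 48, 60)
for hz in (60, 90, 120, 144):
    even = [fps for fps in caps if hz % fps == 0]
    print(f"{hz:>3} Hz: even caps -> {even}")
[/code]
So on a 144 Hz panel you'd want a 24, 48, or 72 FPS cap for even pacing, or just let VRR (G-Sync/FreeSync) handle it.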
PROX Feb 11 @ 7:41am 
Originally posted by Scipo0419:
*snip*
People before discovering input latency be like:

On a serious note, from my experience with 60 Hz, getting 120 FPS and higher felt better because of reduced latency, especially in action-heavy games.
Originally posted by PROX:
*snip*
The real difference is the fact that you're just above 60 FPS. 60 to 120 is not a very noticeable difference when it comes to input. I don't know who taught you that...

Anything below 60 will become more and more noticeable.

After 60, there's not much of a difference when it comes to input lag. Where does the big difference lie? Competitive shooters, simply because, for example, you see people come around corners earlier.

That's why, imho, it's okay to have a 60 FPS average to account for drops into the 30-60 range.
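
Rough frame-time deltas behind the diminishing-returns argument (frame time only, not the whole input chain):
[code]
# Frame-time savings per doubling; the absolute gain shrinks each step,
# which is the diminishing-returns point being argued here.
for lo, hi in ((30, 60), (60, 120), (120, 240)):
    print(f"{lo:>3} -> {hi:<3} fps saves {1000.0 / lo - 1000.0 / hi:5.2f} ms per frame")
[/code]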
Last edited by GamingWithSilvertail; Feb 11 @ 7:47am
BEEP! Feb 11 @ 7:50am 
Originally posted by GamingWithSilvertail:
*snip*
30 to 60 is a big difference, 60 to 90 is the same kind of big difference, but once you start going past 120 the noticeability drops off in a big way.
Me, I can tell and feel what 30 FPS is, what 60 FPS is, and what 90 FPS is, all the way to 120 FPS; everything after that drops off sharply but still feels slightly better all the way into the 200s.
Last edited by BEEP!; Feb 11 @ 7:52am

Date Posted: Feb 11 @ 4:32am
Posts: 102