I challenge anyone to accurately determine what framerate they're watching/playing in a blind test, assuming they all stay above 30 FPS.
inb4 "how can you expect people to tell if they're blindfolded?"
True this. I only notice when it sometimes shoots up from 50-60 FPS to 120+ FPS in some games, but that's about it. Not to mention that the human eye (and by extension, the brain) is VERY ADAPTABLE, so once you play a game at a certain framerate, your visual perception "normalizes" (mental frame insertion, kinda, lol), regardless of whether the framerate is 30, 45, 75, etc.
A few years ago I still had my own machine, and I remember buying Assassin's Creed: Origins. I had an Nvidia 980M back then, and once I cranked almost all settings to Ultra, Origins would run at an average rate of 45 FPS @ 1080p. I remember feeling horrified, because I was not hitting 60, and spent like a whole week trying to improve my performance somehow without sacrificing any visual fidelity.
Then I finally gave up, went like OH WELL, and actually started paying attention to THE GAME. And you know what? The gameplay still felt buttery smooth to me at all times.
So...
Correction: I have reasonable standards - my goal is to play a game comfortably. Anything lower than 60 FPS is completely fine in that sense. Heck, both next-gen consoles have most of their games running at a locked 30 FPS, yet the vast, vast majority of people keep playing. So obviously, it's not a real problem.
You have irrational standards - your goal is to pump those FPS numbers up, not to actually play the game. I think you should take an honest look in the mirror and self-reflect for a bit. "Do I actually want to play games?" is a good question to start with.
I play on a 240hz monitor and a 120hz 4K OLED TV.
I guess if you spend your time in graphically simple games that run at 144 FPS or above constantly, you'd notice a dip to below 90 (it will feel slightly less fast). But since I play Forza 5 at 4K Ultra (RTX on, DLSS Quality) at, like, 60-90 FPS, for example, I only notice when it dips below 50. Even then, though, it looks and feels smooth; the noticeable choppiness for me begins when games drop below 25.
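To put some numbers on that (my own back-of-the-envelope math, not from anyone in this thread): the frame-time gap between high refresh rates is much smaller than the gap between low ones, which is probably why dips at the top end are harder to notice. A quick sketch:

```python
# Frame-time math: milliseconds per frame at a given FPS, and how much
# extra delay a dip adds. Purely illustrative numbers, not measurements.

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for high, low in [(144, 90), (90, 60), (60, 30)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{high} -> {low} FPS: each frame lingers {gap:.1f} ms longer")

# Output:
# 144 -> 90 FPS: each frame lingers 4.2 ms longer
# 90 -> 60 FPS: each frame lingers 5.6 ms longer
# 60 -> 30 FPS: each frame lingers 16.7 ms longer
```

So a drop from 144 to 90 costs about 4 ms per frame, while a drop from 60 to 30 costs about 17 ms per frame, which lines up with the dips near the bottom being the ones that read as choppy.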
All the particle effects and damage ticks made the game chug for a solid few seconds, but it was pretty smooth for the rest of the fight and throughout the game in general.
That being said, Boltgun runs butter smooth, so I don't know why this thread exists. I guess to farm Steam Points or something.
60 FPS has only become the "standard" in the last two years or so. While I WANT that to be the standard, it's not very realistic, especially at 4K and on the most recent generations of GPUs and consoles, which consistently underperform (not to mention the poor game optimization and inherent game-engine problems). The "60 FPS @ 4K, everything on Ultra" dream still isn't here, and I'm saying this as someone who bought a $2700 gaming laptop with an RTX 3080 last year.
You've been groomed by tech companies to believe this, but that does not make it a fact.
I remember when people were complaining about The Order: 1886. Companies don't want to spend the time or money to properly optimize their games, but they will try to dazzle gamers who have low standards with pretty graphics. Having low standards is why you get trash like Redfall. Stop accepting trash.
BTW, it's not contestable at all. Sub-60 is a slideshow.