Now, after upgrading to an i7-11800 and an RTX 3080, the game runs flawlessly for me at 60 FPS on the highest settings, even inside cities. Which is actually not a good sign. Games should be optimised to run on their stated official system requirements when they are released; otherwise, what is even the point of publishing those requirements? The fact that a graphics card from 2021 is needed to run a 2017 game without any hiccups or performance problems says a lot about how much the developers cared about optimising it.
I still love the game though, so Ubisoft really did a good job there.
Ubisoft used to downgrade PC versions of their games, but luckily they don't do that anymore.
Origins is a good benchmark because it stresses both the CPU and the GPU, and I'm still using it to test my new PCs. Those 'recommended settings' are usually meant for mainstream hardware and displays rather than 4K at max settings. I guess devs could limit the available graphics settings depending on the hardware, because apparently people have no idea what their hardware is capable of and push it too hard. That's what Carmack did with RAGE in 2011, and PC gamers hated it when the game optimized its performance by itself. That could solve the 'performance issues', but it wouldn't work, because gamers aren't just playing games; they also like to use games to benchmark their system's performance. And, weirdly enough, they hate it when they actually manage to push their PC to its limits.
But to tell you the truth, in my experience F.E.A.R., Condemned and even Doom 3 did not give me that hard of a time once I tweaked the settings a bit. I was able to run most of these games on a mix of High and Ultra High without trouble, so I think there was some amount of optimisation involved in making those games. It all started with the release of Crysis, when no hardware available at the time could actually run it properly. I think Crysis set the trend of games demanding much more from hardware than their official requirements suggested. After that, a lot of games started coming out that did not run well on current hardware, even hardware that matched the recommended specs.
But like you said, maybe all the RAGE hate from gamers for it optimizing performance on its own ticked developers off, and they decided it's much better to let PC gamers tune their settings themselves rather than do it for them. So it's a double-edged sword: you get your benchmarking, but the actual performance during gameplay suffers. Back when I used to play games, benchmarking was not even a thing. You just looked up the game, checked whether your PC met the requirements, bought it, popped it in, tweaked some settings and played the hell out of it. Gaming has changed a lot since then; it's about more than just playing a game now.