Assassin's Creed Origins

Kleerex Jan 30, 2022 @ 2:57am
I can finally play this game at 4K with rock-solid 60 FPS!
Yes, you read the title right! Remember all those people with super expensive rigs sporting two GTX 1080 Tis and heaps of RAM who still couldn't run this game in 4K back in 2017? Well, now they can!

My specs:

i5-12600K CPU
RTX 3070 GPU
16 GB DDR4 RAM at 3400 MHz

Nothing fancy, as you can see. And I didn't need that kidney anyway.
Showing 1-3 of 3 comments
sunny29990 Jan 30, 2022 @ 4:26am 
This game is horribly optimised for the official recommended system requirements it supposedly runs perfectly on. Even I had low-FPS issues running it on my i7-7700, GTX 1070 and 32 GB of RAM. The performance in cities like Alexandria and Memphis, to name a few, was awful.

Now, after upgrading to an i7-11800 and an RTX 3080, the game runs flawlessly for me at 60 FPS on the highest settings, especially when I am inside cities. Which is actually not a good sign. Games should be optimised to run on the stated official system requirements when they are released; otherwise, what is even the point of publishing those requirements? The fact that graphics cards from 2021 are needed to run a 2017 game without any hiccups or performance problems says a lot about how much the developers cared about optimising it.

I still love the game though so Ubisoft really did a good job there.
Last edited by sunny29990; Jan 30, 2022 @ 4:27am
Kleerex Jan 30, 2022 @ 6:56am 
I was half-joking in that original post. Developers rarely make games that can run at max settings on the hardware available at launch. For example, back in 2004, running Doom 3 at Ultra settings required a GPU with at least 512 MB of VRAM, and consumer-grade graphics cards with that much memory simply didn't exist yet. No one could run Crysis at max settings at launch either.

Ubisoft used to downgrade PC versions of their games, but luckily they don't do that anymore.

Origins is a good benchmark because it stresses both the CPU and the GPU, and I'm still using it to test my new PCs. Those 'recommended settings' are usually meant for mainstream hardware and displays rather than 4K at max settings. I guess devs could limit the available graphics settings depending on the hardware, because apparently people have no idea what their hardware is capable of and push it too hard. That's what Carmack did with RAGE in 2011, and PC gamers hated it when the game optimized its own performance. That could solve the 'performance issues', but it wouldn't work, because gamers aren't just playing games; they also like to use games to benchmark their system's performance. And yet they hate it when they actually can push their PC to its limits, weirdly enough.
sunny29990 Jan 30, 2022 @ 8:16pm 
Lol :) I guess you are right about that part. Doom 3, F.E.A.R., Condemned: Criminal Origins and the famous Crysis. I remember all of these games gave me a hard time trying to reach maximum settings on my specs back then.

But to tell you the truth, in my experience F.E.A.R., Condemned and even Doom 3 didn't give me that hard a time once I tweaked the settings a little. I was able to run most of them on a mix of High and Ultra High without much trouble, so I think some amount of optimisation did go into those games. It all started with the release of Crysis, when no hardware at that point in time could actually run it maxed out. I think Crysis set the trend of games demanding much more hardware than their official requirements suggested. After that, a lot of games came out that didn't run well on current hardware, even the recommended hardware.

But like you said, maybe all the hate gamers gave RAGE for optimizing its own performance ticked developers off, and they decided it's much better to let PC gamers tune their settings on their own rather than have the game do it for them. So it's a double-edged sword: you get your benchmarking, but the actual performance during gameplay suffers. Back when I used to play games, benchmarking wasn't even a thing. You just looked up the game, checked whether your PC met the requirements, bought it, popped it in, tweaked some settings and played the hell out of it. Gaming has changed a lot since then; it's more than just playing a game now.

Date Posted: Jan 30, 2022 @ 2:57am
Posts: 3