The only thing I've seen in Origins is that at times when you first start, it'll be very choppy. The cause is hard to pin down after trying many different Nvidia Control Panel settings and in-game settings. My last Origins session ran smoothly.
Odyssey will occasionally stutter as well, but it's very smooth overall. Just bring the clouds down to High or Medium, take shadows off max, and see how it runs. It should be decent.
With such a rig, it's clear you will be able to run any game!
If I did not know better, I would think you are a console person who just got into PC gaming and has no idea about it!
Does anyone here read game requirements?!
… I just can't play games at 30 FPS anymore. It gives me a headache, and the lag is too noticeable. It might be OK at 1080p, though, due to the smaller screen size.
FWIW, I think the OP's question was legit. His system's specs are upper-level, but so many titles these days are just buggy. It's always worth checking the forums, IMHO.
You really had to ask if an i7-8700 and an RTX 2070 can run the game... really...
But I feel someone needs to tell him, though.
And of course he plays at 1080p...
https://support.ubi.com/en-us/cases/new
With all due respect bro, I just think you're jumping the gun a bit here. It's extremely difficult to get Origins to run at a stable 60 FPS even on the lowest settings at 1080p. Check out these two Origins benchmark screenshots from my two gaming rigs.
https://steamcommunity.com/sharedfiles/filedetails/?id=1354454295
High Settings Preset
2560x1440 @ 60 Hz
i7 2600k 3.4 GHz @ 4.6 GHz
8 GB DDR3 1333 MHz @ 2133 MHz
11 GB GTX 1080 Ti Super Clocked Black Edition
https://steamcommunity.com/sharedfiles/filedetails/?id=1709564018
Ultra Settings Preset
3840x1600 @ 60 Hz target with 40-75 Hz Freesync enabled
i9 9900k 3.6 GHz @ 4.9 GHz
32 GB DDR4 3200 MHz @ 16-18-18-36 CR 1
11 GB GTX 1080 Ti Super Clocked Black Edition
Less tech-savvy folks would look at those and only notice a few differences:
A) The i7 2600k rig could handle High at 2560x1440 at 65 FPS
B) The i9 9900k rig could handle Ultra at 3840x1600 at 68 FPS
But that comparison is misleading. Despite the benchmark scores being insanely close, rig B could actually run perfectly stable while rig A couldn't. I can wander through Alexandria on foot or horseback on rig B on ultra settings without a single visible stutter (if there are any, they're masked completely by Freesync). But rig A… it was absolutely impossible to run 60 FPS through Alexandria at even the LOWEST settings at 1920x1080 (!!!).
So some may be wondering, "How the heck does the score on rig A almost match the score on rig B (with the only differences being resolution and graphics presets) if rig A ran like crap?" Well, the scores don't even come close to matching each other! Think about this carefully. Is 120 frames every 2 seconds equivalent to 60 frames every 1 second? Is 30 frames every 0.5 seconds equivalent to 60 frames every 1 second? Think hard… What happens at 120 frames every 2 seconds when the first second renders 65 frames and the next second renders 55 frames? You're still getting 120 frames every 2 seconds, right? I'm assuming anyone reading this has enough common sense to know that is NOT an ideal scenario and would make the game perform like crap.

It's all about frame TIMES, not frame RATES. It's not just about 60 frames per second; it's about how evenly those 60 frames are spread out over that 1-second timeline. To get a stable 60 FPS, you need to keep your frame times as close as possible to 16.667 ms per frame (1000 ms / 60 frames).

This was not possible under ANY settings or ANY resolution on my i7 2600k rig: although the rig had the CPU clock speed to deliver 16.667 ms frames and a GPU that could deliver 16.667 ms frames, it didn't have enough system memory (RAM) bandwidth to consistently deliver those frames between the hardware components. Prior to overclocking the RAM on that rig, the exact same settings you see in the first screenshot only reached somewhere in the neighborhood of a 5000 benchmark score (proving the memory was the bottleneck).
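To put numbers on that, here's a quick Python sketch (the frame-time arrays are made up, purely for illustration) showing how two captures with the exact same average FPS can pace their frames completely differently:

```python
# Why identical average FPS can hide wildly different frame-time consistency.
# Both hypothetical 1-second captures below deliver exactly 60 frames.

TARGET_MS = 1000.0 / 60.0  # ~16.667 ms per frame for a stable 60 FPS

smooth = [16.667] * 60              # every frame lands on target
choppy = [10.0] * 30 + [23.3] * 30  # same frame count, uneven pacing

for name, times in (("smooth", smooth), ("choppy", choppy)):
    avg_fps = 1000.0 / (sum(times) / len(times))
    print(f"{name}: avg {avg_fps:.1f} FPS, "
          f"worst frame {max(times):.1f} ms (target {TARGET_MS:.3f} ms)")

# Both report ~60 FPS on average, but the 'choppy' capture spends half its
# frames at 23.3 ms: visible stutter that an FPS counter never shows.
```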
So how did I play Origins on my old rig? I played it on the ultra preset (yes, ultra) at 2560x1440 with a resolution modifier of 160% (equivalent to 4K+) and capped it at 30 FPS (a 33.3 ms frame time target). You see, the frame times were so inconsistent on the old rig that not even 45 FPS could be perfectly stable in Alexandria.
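For reference, a frame cap works roughly like the sketch below: finish the frame, then wait out whatever is left of the budget so every frame lands the same distance apart. This is a minimal Python illustration, where render_frame() is a hypothetical stand-in for the game's actual per-frame work:

```python
import time

CAP_FPS = 30
FRAME_BUDGET = 1.0 / CAP_FPS  # ~0.0333 s, i.e. the 33.3 ms frame-time target

def render_frame():
    # Hypothetical stand-in: pretend the CPU/GPU spend 10 ms on this frame.
    time.sleep(0.010)

def run(frames=5):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep away the leftover budget so frames land ~33.3 ms apart.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
        print(f"frame time: {(time.perf_counter() - start) * 1000:.1f} ms")

run()
```

The trade-off is exactly the one described above: you give up peak frame rate in exchange for every frame arriving on the same schedule.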
https://steamcommunity.com/sharedfiles/filedetails/?id=1397488162
The screenshot above shows how I played Origins on my old rig. IGNORE the red text! All it means is that the game didn't know I had half-refresh-rate vsync forced through the Nvidia Control Panel. Notice how flat the first line is? Each "bump" is a frame time fluctuation that isn't visible to the human eye at that low a frame rate (30 FPS), so the entire benchmark test had a near-perfect 33.3 ms frame time delivery from start to finish. Once you have perfect frame times at 30 FPS, a PC game will actually look better than a console game (which almost always runs at 30 FPS), because not even consoles can deliver a 33.3 ms frame time that consistently.
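If you want to quantify that "flatness" yourself, a simple way is to measure how far each frame strays from the 33.3 ms target. A small sketch, with hypothetical numbers standing in for a logged frame-time trace:

```python
# Quantifying frame-time 'flatness': deviation from the 33.3 ms target.
TARGET_MS = 1000.0 / 30.0  # 33.333 ms for a 30 FPS cap

# Hypothetical logged frame times; the 34.1 entry is one of the 'bumps'.
frame_times_ms = [33.3, 33.4, 33.2, 33.3, 34.1, 33.3, 33.3]

deviations = [abs(t - TARGET_MS) for t in frame_times_ms]
print(f"max deviation:  {max(deviations):.2f} ms")
print(f"mean deviation: {sum(deviations) / len(deviations):.2f} ms")
# A flat line on the graph means near-zero deviation for the whole run.
```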
The reason rig B isn't flat-lining the frame time is that it doesn't need to with adaptive sync (i.e. Freesync/G-Sync) enabled.
So, back to the topic at hand: the OP had a perfectly legitimate question.