Far Cry 5

IcyRhythms Jun 26, 2018 @ 10:44am
This game needs to take advantage of more than 4 cores
The performance on a Ryzen 1700x and 980Ti just isn't acceptable. It's a beautiful game at times, but there are better looking games that are more demanding. The Division looks better. Witcher 3 looks better. Just naming a couple off the top of my head.
Last edited by IcyRhythms; Jun 26, 2018 @ 11:43am
Showing 16-30 of 43 comments
IcyRhythms Jun 27, 2018 @ 5:13pm 
Originally posted by .n0 m3rCy:
GTA V is a very demanding game, and they update it constantly. -.- For a six-year-old game it's constantly getting patched and improved. My 670s ran it on full graphics until about a year ago, when they patched in more content and graphics upgrades to support newer cards, etc.

Exactly. GTA is old, but the PC version is only three years old. It's still used in almost every benchmark, and it makes great use of many threads/cores as well. That's my point. I can stream that game and play it, both at 60 FPS, with no slowdown in sight.
Vacant Zaay Jun 28, 2018 @ 6:18am 
Ya, you guys need to check your drivers or refresh your system. I'm on a laptop with an i7-6700HQ and a 980M, and I can play on Ultra with everything enabled, plus extra AA via Nvidia Profile Inspector, and I never, ever see below 45 FPS; it's almost always 55-60. I also have ThrottleStop limiting turbo boost to only 2.8GHz instead of the full 3.3GHz.
Piano man Jun 28, 2018 @ 6:25am 
Originally posted by .n0 m3rCy:
My 670s ran it on full graphics until about a year ago, when they patched in more content and graphics upgrades to support newer cards, etc.

I'm a huge fan of GTA V... but in what ways did they upgrade the graphics in the game? I have seen NO evidence of that, and I've played it through about half a dozen times since it was released.

Yes, they add new models, places, guns, etc., but I haven't seen any 'update' to the graphics. Fixing performance issues is one thing; updating the graphics is another.

As I said, not trying to pick a fight - but I just want to clarify...
Last edited by Piano man; Jun 28, 2018 @ 6:25am
IcyRhythms Jun 28, 2018 @ 10:09am 
Originally posted by suitercrew:
Ya, you guys need to check your drivers or refresh your system. I'm on a laptop with an i7-6700HQ and a 980M, and I can play on Ultra with everything enabled, plus extra AA via Nvidia Profile Inspector, and I never, ever see below 45 FPS; it's almost always 55-60. I also have ThrottleStop limiting turbo boost to only 2.8GHz instead of the full 3.3GHz.

I know that many love to ask/tell those with issues to make sure they're on the latest drivers and that their apps are all up to date, but this has to stop.

You can't assume everyone with an issue hasn't already done this and taken other basic troubleshooting steps.

My motherboard and GPU have all the updates they can currently receive on my end.

Someone else mentioned that just because person A thinks game X looks better than other titles doesn't mean more is coming down the hardware pipeline than in game Y. That's true, but I think an exception can be made in this case. There is just far more happening in a game like GTA or The Division, and they look A LOT better than Far Cry 5. That isn't to say it looks bad; it looks beautiful at times, and it's my favorite-looking Far Cry game by far. I just think there is more going on here than "hey, check your drivers; there is more coming down the hardware pipeline in a game like Far Cry than in some of the other games you mentioned."

There has to be more to it than that, and, as I stated, I think there are probably just better-optimized games than Far Cry 5. It isn't the worst we've seen. It's not even the worst we've seen from Ubi, but sometimes games just aren't optimized all that well.
firestorm Jun 29, 2018 @ 9:13am 
The game does take advantage of more than 4 cores/8 threads, depending on settings.
It also uses at or near 100% utilization of two 1080 Tis in SLI at 1920x1080 @ 60 FPS!
Again, depending on settings.
(10 cores/20 threads, with game settings at Ultra and resolution scale at 2, for the above results.)

This may or may not help you.
First off,
attempting anything more than 1920x1080 with max settings is no more than a pipe dream with this game (two 1080 Tis running near full tilt at 1920x1080 is proof enough of that for me!).

Reduce the resolution scale under advanced settings.
The default is 1 and the max is 2; not sure what the minimum is at the moment. (A setting of 2 uses roughly 4GB of VRAM.)

On my end (it may be different for you), vsync works better than the in-game frame limiter.
Try both... then head to the Nvidia Control Panel and try its vsync/frame limiter.
Use the one that gives the best performance for your setup.

Yep, I'm gonna get negative feedback on this one, but it's actually true:
over 60 frames per second is pointless!
For the people who say otherwise, answer this simple little question before typing it:
can you see the flicker of an incandescent light bulb running at 60Hz?
99.999999% of humans can't, so over 60 FPS is pointless for normal humans!
Yes, by the way, they do flicker (shut off/turn on) 60 times per second (a fun-filled fact for ya).
If you answered yes, then you are not a normal human, so no need to point that out... lol

I have also noticed around 6.8GB of RAM being used while in this game,
so assuming all max settings (including resolution scale), 8GB is barely enough. Add more!

Check those page file settings; I've noticed mine hitting approximately 17GB while in-game.
If it isn't already, place it on your fastest drive and/or one that doesn't also contain the game.
Better yet, use a 32Gbps PCIe x4 NVMe M.2/U.2 drive if possible (5x+ faster than SATA).

Play with those in-game/Windows/Nvidia settings until you find that sweet spot for your setup.
Unoptimized? Debatable at best; I've never seen a perfectly optimized game.
But this game does a very stable 60 FPS for me, with 100+ FPS being possible.

As of writing this, it's the only game I've seen thus far that pushes that pair of 1080 Tis to their limits (yes, stock clocks), aside from GTA V, which can use up to 48GB of VRAM with frame scale at its max (which isn't even plausible with gamer-level GPUs currently available). The game is rather brutal on 'em! Witcher 3 doesn't even come close.

Just so ya know, I bought the game directly from Ubi.
IcyRhythms Jun 29, 2018 @ 2:35pm 
Originally posted by firestorm:
Game does take advantage of more than 4 cores/8 threads, dependent on settings.
Game also uses at or near 100% utilization of 2 1080ti's in sli at 1920x1080 @60fps!
[...]
Unoptimized? Debatable at best, I've never seen a prefectly optimized game.
But this game does do a very stable 60fps for me! With 100++fps being possible..

I could name a few perfectly well-optimized games, about as close to perfect as you could ask for. The Left 4 Dead games, other Valve games, GTA V, Rainbow Six Siege, and DOOM all come to mind. Mad Max was excellent.
FeilDOW Jun 30, 2018 @ 7:29am 
Originally posted by firestorm:
Game does take advantage of more than 4 cores/8 threads, dependent on settings.
Game also uses at or near 100% utilization of 2 1080ti's in sli at 1920x1080 @60fps!
again, dependent on settings.
(10core/20threads with games settings at ultra with resolution scale at 2 for the above results)

This may/may not help you?
First off,
Attempting anything more than 1920x1080 with max settings is no more than a pipe dream with this game (2 1080ti's running near full tilt at 1920x1080 is proof enough of that for me!)
What the hell are you talking about? A stock 8700K with SLI 1080 Tis will be at 40% utilization each, and I play at 4K with max settings at 60 FPS, with drops to 54 FPS.

Ah, I see now: you're not at 1080p with the res scale at 2, you're actually rendering at 4K, and in that case, yes, my 1080 Tis will be at 90-100% usage, and only one is needed.
Last edited by FeilDOW; Jun 30, 2018 @ 7:30am
Vacant Zaay Jul 1, 2018 @ 7:32am 
It doesn't make sense how I can play on Ultra and mostly see 60 FPS with a single 980M and a laptop CPU, but you guys are having problems.

The only odd thing I have noticed with the game, which firestorm mentioned he had changed, is the screen scale. In a game like PUBG I can max the screen scale to 120, a 20% increase, and lose 5 FPS, but if I try to do that in Far Cry and set the screen scale to 1.2, I'll lose an insane 24 FPS on average. My 980M is the 8GB VRAM variant, so it's not like I'm running out of VRAM.
Kaldaien Jul 1, 2018 @ 9:06am 
Originally posted by IcyRhythms:
Originally posted by Secret Squirrel:
What leads you to believe Far Cry 5 does not leverage all available cores? Because my own profiling of this game evidences that it does, as do the findings in this testing: https://www.tomshardware.com/reviews/far-cry-5-performance-benchmark-ultra,5552-6.html

The performance says otherwise. I'm getting little to no utilization on most cores/threads on my end.
Which just goes to show how low-overhead the graphics engine is.

In fact, it's so well optimized that there's literally a thread that never sleeps, and it constitutes about 40% of all measured CPU time in the game. That thread just spins in a loop adding values (it serves no purpose). You can suspend said thread and your CPU usage will go even lower.

https://www.youtube.com/watch?v=WfdhG-LDXwA

That's a way more detailed thread analysis than you're probably ever going to find anywhere else on this engine. You're welcome :)
Last edited by Kaldaien; Jul 1, 2018 @ 9:07am
Secret Squirrel Jul 1, 2018 @ 10:41am 
I think it has been well demonstrated that the claim implied by this thread's subject is false. There is nothing about this game that makes it not leverage all available cores/threads, or not behave as expected on a Ryzen CPU platform. The claim is not based on measurable evidence, but on comparison to other titles, which are older and designed for less GPU power but "look better."

As part of my confirmation, I have seen one system where the Windows 10 Task Manager performance view does not accurately show GPU usage. But other tools that have long been the go-to tools for gamers and tweakers do show it accurately, and using those tools one can see clearly that the GPU is the most significant factor in frame rate. These tools include HWiNFO, OCCT, and GPU-Z, to name a few.

The OP's main complaint is that the frame rate drops significantly when streaming/recording. It's hard to say what causes this, but I would lean toward the capture app being inefficient or placing load on the GPU.

Using Nvidia GeForce Experience, I was able to verify that streaming/recording with it does place the encoding workload on the GPU, and while doing this I saw a frame rate drop of about 10%. But after disabling stream/record in GeForce Experience and using Open Broadcaster Software (OBS) with "software" encoding, the encode workload is not placed on the GPU, and there is a negligible FPS drop of about 1-2 FPS. It's hard to imagine hooking into video output without any FPS drop at all.

Regarding the 60Hz light bulb: you can't see a bulb running on AC flicker because the filament takes longer than 1/60th of a second to extinguish. The light is the result of heat, which doesn't immediately go away as the power alternates at 60Hz.

In the context of gaming, this is why it isn't terribly effective to have a rig that can do 100 FPS outputting to a monitor with a slow response time.
IcyRhythms Jul 1, 2018 @ 11:04am 
Originally posted by Secret Squirrel:
I think it has been well demonstrated that the claim implied by the subject of this thread is a false claim. There is nothing about this game that makes it not leverage all available cores/threads, or not behave as expected on a Ryzen CPU platform.
[...]
The OP's main complaint is that when streaming/recording the frame rate drops significantly. Hard to say what causes this, but I would lean more toward the capture app being inefficient, or placing load on the GPU.

I feel like you're accusing me of lying. I'm getting about 30-40% total CPU usage while playing this game. If I record (with the CPU in OBS, NOT NVENC in OBS or Shadowplay) or stream while playing, this shouldn't be much of a problem. If the CPU usage only kicks up to about 55-60% when recording, there is A LOT of headroom for the CPU to do anything it wants. Why such a slowdown, then? I'm still trying to figure that out lol

The Division is more demanding on the CPU, and I can record and stream at 30 FPS or 60 FPS with little to no slowdown. The Division actually does take advantage of more than 4 cores, pretty effectively I may add.
IcyRhythms Jul 1, 2018 @ 11:34am 
https://www.reddit.com/r/Amd/comments/8apajy/far_cry_5_is_too_much_cpu_limited/

There is a thread there with many people discussing how this engine has never really utilized more than 4 cores very well, and how OC'ing the CPU (because the engine is also somewhat dependent on core speed) can help as well.

I don't OC, so I guess I'm screwed on that front. It seems it's just a Ubisoft/engine problem and has been for some time.
Last edited by IcyRhythms; Jul 1, 2018 @ 11:42am
IcyRhythms Jul 1, 2018 @ 12:24pm 
Originally posted by Ironically Bad Player:
I dunno, man. I originally ran this on an i5-2500K on a ♥♥♥♥♥♥ old board I couldn't OC with, and I managed to maintain at least mid-to-high 40s in frames. Now I am running it on a Ryzen+ 2700X, and when I was viewing thread usage in the OSD (I don't now, because of the space it consumes on screen, but I did at first on some games out of curiosity), I saw a very even spread among all the threads with Far Cry 5. I held 60 FPS continuously and fully leveraged the power of my 1070 Ti because of that. The difference between FC5's spread of the load and something like ESO (even up until this very day, mind you) or the Defiance repuke, where the two literally only stress one core, is like night and day, both in stable frame rates and visibly on screen while playing.

I was also streaming (OBS, x264, CPU medium setting) the Vietnam pack a couple of times, once for about 5 hours and the other for around 2 hours, with no performance issues. There may be something wrong with your setup. Maybe you need some motherboard packages installed (I installed the chipset utilities and onboard WiFi/BT from my own, along with the audio, and recently just made sure the WiFi, after a Windows update broke it, and the chipset packages were up to date), or maybe you need some Windows or driver updates, I don't know. It says "optimized for Ryzen" right when you're starting the game up, and the RTSS evidence shows that to be true. It ran well on my old Sandy Bridge from Intel, and it runs well on my new Zen+ from AMD.

There has gotta be something on your end installed improperly or that needs to be installed, a runtime or something maybe. Who knows. Maybe change your power plan around and see if that makes a difference in how the game uses the cores on your end. Comparing to older games never helps much; Witcher 3 was made to run on hardware that existed 3 or 4 years ago by now, so of course it's gonna run better for everybody now. The Division shouldn't be compared to anything. Ever. Not even itself, it's so bad. Every time somebody calls up The Division for comparison, my brain tunes them out as either trolling or nonsensical. Like the guy complaining on the Hellbound boards the other day, referring to a 560 as a high-end GPU and using Dying Light (a 4-year-old game by now, I think) and The Division as reasons why a game released in 2018 should let his mid-range card run on full Ultra settings at full resolution at a steady 60 FPS; he refused to lower them, thinking he shouldn't have to, and refused to troubleshoot on his end as well.

You have one of the best CPUs around to run this game on, because it spreads the load so well. If it isn't doing that, then something could possibly be wrong on your end. My experiences with Win 10 have been full of ♥♥♥♥♥♥♥ messes ever since taking the last day of the free upgrade on the old Sandy Bridge in January, and especially after the new Ryzen build once I went to build 1803 to be able to use my WMR headset. For all you know, you might be missing one single Windows update that could be causing some of your problems; it's happened to me already. No ♥♥♥♥♥♥♥♥, Win 10 breaks something just about every other week. Courtesy of "we know what's best for your computer, now just lay back, ease your sphincter muscles, and let us do this gently; we'll reboot your PC afterward and you can forget all about it." The creepiest OS messages ever while they hack into your system and break your install (but they're not criminals, since they made the OS).

https://www.reddit.com/r/Amd/comments/8apajy/far_cry_5_is_too_much_cpu_limited/

Please read through that thread a little bit. This engine is known to not do well beyond 4 cores, and it can sometimes be dependent on core/thread speed.

This is (for the most part) a very clean install of Windows. I have all possible updates; I've run the update check many times in the last couple of weeks.

People always assume that because they're not having any issues, it MUST be something with the other person's setup, and nothing is wrong with the game.

Do me a favor: try RECORDING (not streaming) at about 16000 CBR while playing, and then see if you get any major drops in performance.

Edit: I record at 30 FPS as well. At least I have been as of late.

Last edited by IcyRhythms; Jul 1, 2018 @ 12:27pm
Kaldaien Jul 1, 2018 @ 2:13pm 
It's been discussed to death in my review of this game.

It uses D3D11 immediate contexts for rendering. You will always have a single thread (the one controlling the swapchain) doing most of the work.

The opposite extreme is Assassin's Creed: Origins. It spawns 8 deferred contexts and batches rendering commands across 8 threads, no matter how MANY CPU cores you have. AC:O is BAD, BAD, BAD. It does not scale properly, and you have a bunch of lunatics insisting Denuvo is the reason, when in fact it just needs a @#$%load more CPU cores than most people have.
Last edited by Kaldaien; Jul 1, 2018 @ 2:15pm
IcyRhythms Jul 1, 2018 @ 2:35pm 
Originally posted by Kaldaien:
It's been discussed to death in my review of this game.

It uses D3D11 Immediate Contexts for rendering. You will always have a single thread (the one controlling the swapchain) doing most of the work.

The opposite extreme is Assassin's Creed: Origins. It spawns 8 deferred contexts and batches rendering commands across 8 threads. No matter how MANY CPU cores you have. AC:O is BAD, BAD, BAD. It does not scale properly and you have a bunch of lunatics insisting Denuvo is the reason, when in fact, it just needs a @#$%load more CPU cores than most people have.

Yeah. Those who keep blaming Denuvo for performance issues in games haven't done enough research. DOOM also had it and ran like a dream; once they took it out, there was little to no benefit. Some also blamed RIME's issues on Denuvo, but the developer admitted their water shaders were very resource-hungry, and even they struggled to understand why. I'm not sure if that was ever resolved.

Date Posted: Jun 26, 2018 @ 10:44am
Posts: 43