Exactly. GTA is old, but the PC version is only three years old. It's still used in almost every benchmark. It makes great use of many threads/cores as well. That's my point. I can stream that game and play it, both at 60 FPS with no slowdown in sight.
I'm a huge fan of GTA V... but in what ways did they upgrade the graphics in the game? I have seen NO evidence of that, and I've played it through about half a dozen times since it was released.
Yes, they've added new models, places, guns, etc... but I haven't seen any 'update' to the graphics. Fixing performance issues is one thing; updating the graphics is another.
As I said, not trying to pick a fight - but I just want to clarify...
I know that many people love to ask/tell those having issues to make sure they're on the latest drivers and that their apps are all up to date, but this has to stop.
You can't assume everyone with an issue hasn't already done this and taken other basic troubleshooting steps.
Motherboard and GPU have all the updates they can currently receive on my end.
Someone else mentioned that just because person A thinks game X looks better than other titles doesn't mean more is coming down the hardware pipeline than in game Y. That's true. I do think an exception can be made in this case. There is just far more happening in a game like GTA or The Division, and they look A LOT better than Far Cry 5. This isn't to say it looks bad. It looks beautiful at times. It's my favorite-looking Far Cry game by far. I just think there is more going on here than "hey! check your drivers. there is more coming down the hardware pipeline in a game like Far Cry than some of the other games you mentioned".
There has to be more to it than that, and, as I stated, I think there are probably just more well-optimized games than Far Cry 5. It isn't the worst we've seen. It's not even the worst we've seen from Ubi, but sometimes games just aren't optimized all that well.
The game also runs at or near 100% utilization on two 1080 Tis in SLI at 1920x1080 @ 60 FPS!
Again, dependent on settings.
(10 cores/20 threads, with game settings at Ultra and resolution scale at 2, for the above results.)
This may or may not help you:
First off,
Attempting anything more than 1920x1080 with max settings is no more than a pipe dream with this game (two 1080 Tis running near full tilt at 1920x1080 is proof enough of that for me!).
Reduce the resolution scale under advanced settings;
the default is 1, the max is 2, and I have no idea what the minimum is at the moment. (A setting of 2 uses roughly 4 GB of VRAM.)
On my end (may be different for you?), vsync works better than the in-game frame limiter.
Try both... then head to the NVIDIA Control Panel and try its vsync/frame limiter.
Use the one that gives the best performance for your setup.
Yep, gonna get negative feedback on this one, but it's actually true...
Over 60 frames per second is pointless!
For the people who say otherwise: before typing your reply, answer this simple little question first.
Can you see the flicker of an incandescent light bulb running at 60 Hz?
99.999999% of humans can't, so over 60 FPS = pointless for normal humans!
Yes, by the way, they do flicker (shut off/turn on) 60 times per second! (A fun-filled fact for ya.)
If you answered yes, then you are not a normal human, so no need to point that out.. lol
I've also noticed around 6.8 GB of RAM being used while in this game.
So, assuming all max settings (including resolution scale), 8 GB is barely enough; add more!
Check those page file settings; I've noticed mine hitting approximately 17 GB while in-game.
If you haven't already, place it on your fastest drive and/or one that doesn't also contain the game.
Better yet, use a 32 Gbps PCIe x4 NVMe M.2/U.2 drive if possible (5x+ faster than SATA).
Play with those in-game/Windows/NVIDIA settings until you find that sweet spot for your setup.
Unoptimized? Debatable at best; I've never seen a perfectly optimized game.
But this game does do a very stable 60 FPS for me! With 100+ FPS being possible.
As of writing this, aside from GTA V, which can ask for up to 48 GB of VRAM (frame scale at its max),
which isn't even plausible with gamer-level GPUs currently available,
it's the only game I've seen thus far that pushes that pair of 1080 Tis to their limits.
(Yes, stock clocks.) The game is rather brutal on 'em! Witcher 3 doesn't even come close.
Just so ya know, I bought the game directly from Ubi.
I could name a few very well-optimized games, about as close to perfect as you could ask for. The Left 4 Dead games, other Valve games, GTA V, Rainbow Six Siege, and DOOM all come to mind. Mad Max was excellent.
Ah, I see now: you're not really at 1080p. With the res scale at 2 you're actually rendering at 4K, and in that case, yes, my 1080 Tis will be at 90-100% usage and only one is needed.
The only odd thing I've noticed with the game, which firestorm mentioned he had changed, is the screen scale. In a game like PUBG I can max the screen scale at 120, a 20% increase, and lose 5 FPS, but if I try to do that in Far Cry and set the screen scale to 1.2, I lose an insane 24 FPS on average. My 980M is the 8 GB VRAM variant, so it's not like I'm running out of VRAM.
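Some quick back-of-the-envelope math helps here. Assuming the resolution scale multiplies each axis (which matches the "scale 2 at 1080p is really 4K" observation above), the pixel count grows with the square of the scale, so a 1.2 scale is already ~44% more pixels to shade, not 20%. A minimal sketch of the arithmetic:

```cpp
// Hypothetical illustration only: assumes the in-game resolution scale
// multiplies both axes of the base resolution.
#include <cstdio>

int main() {
    const int baseW = 1920, baseH = 1080;
    const double scales[] = {1.0, 1.2, 2.0};
    const long long basePixels = 1LL * baseW * baseH;
    for (double s : scales) {
        int w = static_cast<int>(baseW * s + 0.5);  // round to nearest
        int h = static_cast<int>(baseH * s + 0.5);
        long long pixels = 1LL * w * h;
        std::printf("scale %.1f -> %dx%d, %lld pixels (%.0f%% of native)\n",
                    s, w, h, pixels, 100.0 * pixels / basePixels);
    }
    return 0;
}
// scale 1.0 -> 1920x1080, 2073600 pixels (100% of native)
// scale 1.2 -> 2304x1296, 2985984 pixels (144% of native)
// scale 2.0 -> 3840x2160, 8294400 pixels (400% of native)
```

If the scale does work that way, a ~24 FPS drop at 1.2 would line up with pushing ~44% more pixels per frame, which is a much bigger jump than the slider number suggests.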
In fact, it's so well optimized that there's literally a thread that never sleeps and this constitutes about 40% of all measured CPU time in the game. That thread just spins a loop adding values (it serves no purpose). You can suspend said thread and your CPU usage will go even lower.
https://www.youtube.com/watch?v=WfdhG-LDXwA
That's a way more detailed thread analysis than you're probably ever going to find anywhere else on this engine -- you're welcome :)
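For anyone wondering what a "thread that never sleeps" actually means in practice, here's a minimal, purely illustrative C++ sketch (not engine code) contrasting a spin loop with a properly blocking worker. The spinner shows up as a full core of CPU time in any profiler even though it produces nothing useful, which is the pattern described above and why suspending such a thread drops measured CPU usage:

```cpp
#include <atomic>
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <thread>

std::atomic<bool> quit{false};
std::mutex m;
std::condition_variable cv;

// A "busy" worker: never sleeps, so it pins one core near 100% even though
// the work (adding values in a loop) produces nothing useful.
void SpinWorker() {
    volatile long long sink = 0;
    while (!quit.load(std::memory_order_relaxed)) {
        sink += 1;  // burns CPU time; a profiler attributes a full core to this thread
    }
}

// A well-behaved worker: blocks on a condition variable, so it costs
// essentially zero CPU time while there is nothing to do.
void BlockingWorker() {
    std::unique_lock<std::mutex> lock(m);
    cv.wait(lock, [] { return quit.load(); });  // sleeps until woken
}

int main() {
    std::thread spinner(SpinWorker);
    std::thread sleeper(BlockingWorker);
    std::this_thread::sleep_for(std::chrono::seconds(5));  // watch per-thread CPU time here
    {
        std::lock_guard<std::mutex> lock(m);
        quit = true;
    }
    cv.notify_all();
    spinner.join();
    sleeper.join();
    return 0;
}
```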
As part of the verification I've done, I have seen one system where the Windows 10 Task Manager performance view does not accurately show GPU usage. But other tools that have long been the "go to" tools for gamers and tweakers do show GPU usage accurately, and using those tools one can clearly see that the GPU is the most significant factor in frame rate. These tools include HWInfo, OCCT, and GPU-Z, to name a few.
The OP's main complaint is that when streaming/recording the frame rate drops significantly. Hard to say what causes this, but I would lean more toward the capture app being inefficient, or placing load on the GPU.
Using NVIDIA GeForce Experience, I was able to verify that streaming/recording with it does place the encoding workload on the GPU, and while doing this I saw a frame rate drop of about 10%. But after disabling stream/record in GeForce Experience and using OBS Studio (Open Broadcaster Software) with "software" encoding, the encode workload is not placed on the GPU, and there is only a negligible FPS drop of about 1-2 FPS. It's hard to imagine hooking into the video output not causing any FPS drop at all.
Regarding the light bulb at 60 Hz: you can't see an incandescent bulb running on AC flicker because the filament takes longer than 1/60th of a second to stop glowing. The light is the result of heat, which doesn't immediately go away as the power alternates at 60 Hz.
In the context of gaming, this is why it isn't terribly effective to have a rig that can do 100 FPS outputting to a monitor that has a slow response time.
I feel like you're accusing me of lying. I'm getting about 30-40% total CPU usage while playing this game. If I record (WITH THE CPU IN OBS, NOT NVENC in OBS or Shadowplay) or stream while playing, this shouldn't be much of a problem. If the CPU usage only kicks up to about 55-60% when recording, there is A LOT of headroom for the CPU to do anything it wants. Why such a slowdown, then? I'm still trying to figure that out lol
The Division is more demanding on the CPU, and I can record and stream at 30 FPS or 60 FPS with little to no slowdown. The Division actually does take advantage of more than 4 cores, pretty effectively I might add.
There is a thread with many people discussing how this engine has never really utilized more than 4 cores very well, and how OC'ing the CPU (because the engine is also a little dependent on core speed) can help as well.
I don't OC, so I guess I'm screwed on that front. Seems it's just a Ubisoft/engine problem and has been for some time.
https://www.reddit.com/r/Amd/comments/8apajy/far_cry_5_is_too_much_cpu_limited/
Please read through that thread a little bit. This engine is known to not do well beyond 4 cores, and can sometimes be dependent on speed of cores/threads.
This is (for the most part) a very clean install of Windows. I have all possible updates; I've run the update check many times in the last couple of weeks.
People always assume that because they're not having any issues, it MUST be something with the other person's setup and nothing wrong with the game.
Do me a favor. Try RECORDING (not streaming) at about 16000 CBR while playing, and then see if you get any major drops in performance.
Edit: I record at 30 FPS as well. At least I have been as of late.
It uses the D3D11 immediate context for rendering. You will always have a single thread (the one controlling the swapchain) doing most of the work.
The opposite extreme is Assassin's Creed: Origins. It spawns 8 deferred contexts and batches rendering commands across 8 threads, no matter how MANY CPU cores you have. AC:O is BAD, BAD, BAD. It does not scale properly, and you have a bunch of lunatics insisting Denuvo is the reason, when in fact it just needs a @#$%load more CPU cores than most people have.
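For context on the two approaches being contrasted, here's a rough, hypothetical sketch of the D3D11 deferred-context pattern (nothing here is actual engine code): worker threads record commands into deferred contexts, but only the immediate context, on a single thread, ever submits to the GPU, which is why one thread still ends up carrying most of the load either way:

```cpp
// Minimal sketch of D3D11 deferred-context recording. Illustrative only;
// the real draw/state calls are omitted where the comment indicates.
#include <d3d11.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;   // the single "swapchain" thread's context
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
        return 1;

    const int kWorkers = 8;   // the "8 deferred contexts" count mentioned above
    std::vector<ID3D11DeviceContext*> deferred(kWorkers, nullptr);
    std::vector<ID3D11CommandList*> lists(kWorkers, nullptr);

    for (int i = 0; i < kWorkers; ++i)
        device->CreateDeferredContext(0, &deferred[i]);

    // Each worker records its share of rendering commands on its own thread.
    std::vector<std::thread> workers;
    for (int i = 0; i < kWorkers; ++i) {
        workers.emplace_back([&, i] {
            // ... deferred[i]->IASetVertexBuffers(...), Draw(...), etc. would go here ...
            deferred[i]->FinishCommandList(FALSE, &lists[i]);  // bake recorded work into a command list
        });
    }
    for (auto& t : workers) t.join();

    // Only the immediate context actually submits work to the GPU,
    // so the submitting thread remains the bottleneck.
    for (int i = 0; i < kWorkers; ++i) {
        immediate->ExecuteCommandList(lists[i], FALSE);
        lists[i]->Release();
        deferred[i]->Release();
    }
    immediate->Release();
    device->Release();
    return 0;
}
```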
Yeah. Those who keep blaming Denuvo for performance issues in games haven't done enough research. DOOM also had it and ran like a dream; once they took it out, there was little to no benefit. Some also blamed RIME's issues on Denuvo. The developer admitted their water shaders were very resource-hungry, and even they struggled to understand why. Not sure if that was ever resolved.