****************************
Totally concur with this! Whoever is defending this poor performance needs to get their head out of the ground (to put it politely). I've attempted to play this game on 2 different systems with acceptable to respectable specs... at every resolution... I've followed a couple of different optimization guides... reinstalled my GPU drivers... I've even tried capping the fps to 60, 50, 45, 35, 30, & 27 (1080p). Even at 27 fps I still get horrendous stutter - we're talking 1% lows of 1 fps. It tends to happen less often in areas with few to no NPCs, but one of this game's primary mechanics is interviewing pedestrians. I hope a patch is released soon to address this, as, aside from this glaring issue, I really feel this is a great game.
P.S. My game is installed on an M.2 SSD... I don't think the drive is the problem.
****************************
Most people are obviously unwilling to take a hint.
One thing is absolutely obvious by now - the people who whine the most are completely unable to:
a. describe their issue adequately
b. take any reference given by others and compare it against their own issue
c. describe the problem (with one exception)
d. understand that what's already out there points to what's causing the issue - and that it's not a very large issue at all
e. and they never post specs.
As a result of them playing "give me attention" in the first week after release, they dominated perception for a little bit - and might have hurt sales. So at this point it has become a self-fulfilling prophecy, where the studio might be better off reacting.
Is there an issue there? Yes. Is it "horrible"? Hell no. It's horrible on the level of watching Digital Foundry videos, understanding nothing of what you are shown, but getting a sense of over-entitlement out of it - that every game has to run with no jitter whatsoever. Is that a position you can take? Yes. Is it a viable one? Well, it depends: in terms of prioritization for going gold, this (the current state) would be my hundredth item on a list of things to fix.
Maybe that subjective evaluation is setup-dependent, but the people "suffering" NEVER write what their setups are, so even that is hard to qualify.
So, what we have so far. Maybe extra shader compilation overhead, from multithreading not being optimized or shaders not being prioritized - non-trivial to fix, depending on what breaks. But that's not even that likely, because in several areas the stutter seems to be related to asset streaming, not shader compilation per se. (It still could be mostly shader compilation, if the devs messed up royally - depending on when they flush already compiled shaders, I presume...) The devs have hinted at this (asset streaming as the issue) as well, by stating that a (faster) SSD might help. Along the same lines, so might RAM speed - more so than changing resolution without changing texture quality.
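(To make the streaming point a bit more concrete, here is a toy sketch - not the engine's actual code, every name and timing below is invented - of why a cell transition shows up as one long frame when the load blocks the game thread, and mostly doesn't when it runs on a background thread:)

# Toy illustration only - invented cell names and timings, not engine code.
import threading
import time

def load_cell_assets(cell_id):
    """Pretend to read textures/meshes for a city cell from disk."""
    time.sleep(0.15)  # 150 ms of simulated disk + decompression work

def frame_with_blocking_load(cell_id):
    """Naive version: the whole frame waits for the disk -> one 160+ ms frame."""
    start = time.perf_counter()
    load_cell_assets(cell_id)
    time.sleep(0.016)  # the usual 16 ms of simulated frame work
    return (time.perf_counter() - start) * 1000

def frame_with_streamed_load(cell_id):
    """Streamed version: the load runs on a worker thread, the frame stays short."""
    start = time.perf_counter()
    threading.Thread(target=load_cell_assets, args=(cell_id,)).start()
    time.sleep(0.016)
    return (time.perf_counter() - start) * 1000

print(f"blocking load frame: {frame_with_blocking_load('scaladio_market'):.0f} ms")
print(f"streamed load frame: {frame_with_streamed_load('scaladio_market'):.0f} ms")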
(I have to make an aside at this point, because some people were complaining that the game doesn't run 4K @60Hz on their 3090s, and that's the general response to that: no current game that doesn't target the shooter market will usually produce a stable 60 fps on those kinds of cards, even though Nvidia marketing suggested it. So taking resolution down to 1440p would be "the solution" for those people in every other game - this one still drops frames on zone (cell) transitions. That doesn't mean you should now drop resolution even further and expect greatness for doing so. Or that anybody would be interested in your results, really - because it's a threshold thing, not something that gets better the lower you set it.)
Then we probably have a group of people with differing sustained SSD loading speeds. (Nothing to add here, except that I still have occasional cell-transition stutters on a Gen3 NVMe SSD - but not horrible ones at any rate (though this has a subjective element to it).)
Then we have people with older Intel systems that should have far worse world-streaming performance than AMD systems of the time, because AMD allowed the RAM paired with their consumer CPUs to be overclocked while Intel didn't - so there is literally a difference of as much as 2600 vs 3200+ MHz in DDR4 RAM speed out there on systems people are likely to use, not even talking about RAM latency.
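(To put rough numbers on that gap - back-of-the-envelope theoretical peak bandwidth for dual-channel DDR4, ignoring latency and real-world efficiency, and using 2666 as the stand-in for the 2600 figure above:)

# Theoretical peak bandwidth for dual-channel DDR4 at different transfer rates.
# Peak = transfers/s * 8 bytes per transfer per channel * number of channels.
def ddr4_peak_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * 1_000_000 * bytes_per_transfer * channels / 1e9

for speed in (2666, 3200, 3600):
    print(f"DDR4-{speed}: ~{ddr4_peak_gbs(speed):.1f} GB/s peak")

# DDR4-2666 -> ~42.7 GB/s, DDR4-3200 -> ~51.2 GB/s: roughly 20% more headroom
# for world streaming, before even considering latency differences.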
Then there are the people who try to run the game on the equivalent of a 4790K on DDR3 RAM, which doesn't even meet minimum spec. But some of the people complaining the most also have AM4 AMD processors with 8 and more cores (so 8 full cores), so core count alone isn't the issue, though it certainly contributes.
It might also be your CPU or GPU cooling setup, because both dynamically throttle these days. I can add that my CPU is undervolted, so throttled by default (to produce less noise in a fairly open small-form-factor case, on a Noctua fan), so if you have an 8-core AMD AM4 CPU, CPU throttling probably isn't your issue.
--
Then there is the issue of microstutters and G-Sync - basically, if a game microstutters and you display it without waiting a little bit (framebuffer), so without vsync, the severity of the stutter or artifacting you see on screen might be higher. This is a "the devs should fix it" issue - on a game you expect to run at a sustained 60Hz/fps, but then again, with open-world games you don't always expect that at the highest resolutions. At 1440p you would expect the top-end video cards to be sufficient for 60 fps at least.
On this issue it might be beneficial to turn G-Sync off and vsync on - although mileage may vary. (Never really tested that one; just technically, it might make sense...) Again, this is only if you are dealing with microstutters. (Not if you are dealing with frame drops down to single-digit 1% lows.)
Single-digit 1% lows also don't tell you much about the "usual" performance of a game (quick sketch below). This is why you usually post videos with frame data to underline issues. We have one of them out there (on YouTube) showing the PS5 version of the game - and it's somewhat representative of the stutter I also see in my game, except for the last scene (the most severe stutter), where the person who made the video went to the market area in Scaladio and reproduced the worst cell-loading stutter that's in the game, repeatedly (running back and forth). So while that stutter is in the game, it's not representative of the overall experience. The other instances the person shows in that video are (usually microstutters, but occasionally more than that). You can also watch the video I posted (with a timecode of me running through the city at framerates around 50 fps average, occasionally dropping to 45, almost never below) - sadly captured at 30 fps, but "horrible" is not a word I would use for that video, nor for what I see on screen.
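(To make the "1% lows alone don't describe the usual experience" point concrete, here is a quick sketch with invented frametime numbers, using the common "average fps of the slowest 1% of frames" definition:)

# Invented frametime trace: one minute at ~60 fps with three streaming hitches.
frametimes_ms = [16.7] * 3600
frametimes_ms[500] = 250    # three isolated cell-load hitches
frametimes_ms[1800] = 400
frametimes_ms[3000] = 300

avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))

# "1% low" here = average fps over the slowest 1% of frames.
worst_1pct = sorted(frametimes_ms, reverse=True)[: len(frametimes_ms) // 100]
low_1pct_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))

print(f"average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")
# -> the average stays around 59 fps while the 1% low drops to roughly 24 fps,
#    driven by just 3 of 3600 frames - so the 1% low alone says little about
#    how the game feels most of the time.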
Which brings us to subjectivity in assessment, and there's really nothing you can do to solve that.
But then again, the people who are most vocal about how horrible this is are also the ones that fall into "well, it might be good for you, but it's not for me, so the devs should fix it" logic, which is always fun.
Then there is another "compensation" option available to people with OLED TVs (or TVs at a similar price level), at the cost of slightly increased latency (not game-crippling), and that is to use frame interpolation on "clear", with all other image enhancers disabled, on a true 120Hz panel. (Brands usually don't advertise the native refresh rate of their panels; you have to look that up in panel databases (not Newegg or similar).) This increases motion clarity (= reduces smearing) by calculating in-between images from the signal the TV is fed, which might also compensate for microstutters. I mention that because I have it turned on.
With all that said, in the video I posted ( https://www.youtube.com/watch?v=zqz5UoO9t34 ), I count around 9 or so stutters (again, the video is 30 fps, the game usually 45-55 fps at 1440p) in about 4 minutes of running through several different districts of the city, including Scaladio (which is the most prone to show them). Half of them I don't notice during gameplay if I don't look for them specifically - again, subjective - but they are microstutters, three or four of them severe, yet never so severe that I really started to mind them during the 80 hours or so of playing the game (3700X, 3200 MHz DDR4 CL16 RAM, WD Blue SN550 Gen3 NVMe SSD, 1660 Ti - 60 fps cap, vsync on, and motion interpolation "clear" on my OLED). And by running through the city, I mean running through the city - sustained, not stopping and looking, which usually produces fewer stutters per minute... But that's judging from a controller gameplay profile, so I'm not rotating the camera in quick 320° pans while running.
So the reality lies somewhere in between this and the game being borderline "horrible and unplayable".
Also, for everyone shouting "devs, fix this" - my response would actually be: please don't, if it means reducing the visual quality of the open world in any significant manner, because that is still one of your main assets in this game (and with stutter we usually talk about the open-world portion of gameplay, not indoor areas). Also, please don't, if it would require many man-hours to rewrite stuff and you don't have the manpower for it. The game isn't broken or unplayable or distracting because of it by any (ok, my...) reasonable standards.
But then I don't watch Digital Foundry videos to get annoyed by frametime drops, I don't have a low-latency G-Sync monitor close to my nose, and I don't have mouse acceleration set to three full camera rotations around my character every time I move the mouse by two inches.
Stop blaming the user. It is an issue that the developers need to fix, and it is obvious to anyone who plays with their eyes open. We don't do the developers or ourselves any favours by meekly accepting it, because they'll keep doing it in subsequent games - someone mentioned this issue is carried over from The Sinking City, for example.
By the way, I am using a Turtle Beach Recon wired Xbox controller, not keyboard and mouse.
I have a Corsair MP510 960GB NVMe PCIe Gen 3 SSD with high sequential (3500 MB/s) and random read speeds (2500 MB/s+ at 4K block size, queue depth 32).
I also have 32 GB of 3600 MHz CL16 DDR4 RAM, which is hardly slow.
My AMD 3900X 12-core CPU hotspot temperatures barely go over 60°C, and my Nvidia RTX 3080 FE GPU core temperature stays below 75°C. I know how to look after my hardware.
I literally play all my games at 4K, as that's my TV's resolution, and the majority work absolutely fine. As I said, I've even lowered the resolution for this game and it still happens.
This isn't an insult to the game - I love the game AND the devs - however, there is no excuse for the shocking performance, especially considering it's been out for a while now.
Again, this is not saying that those stutters don't exist - they do. This is merely taking a hammer to the argument that you "should be able to play games at 4K60, because I paid money".
"I can play the majority of games at 4k, because thats the resolution of my TV" - is what? A mantra?
You understand that there is a console-cycle switchover (once the new consoles have populated enough homes that sales figures reach that point - which, because of the chip crisis, will be pushed back a little; so good on you for having bought this generation's "best graphics card ever" (oh no... the 3090 Ti got announced), you are the best person in the universe, I'm so proud of you...)? You understand that with that switchover comes a new design paradigm, one that will create worlds that are "harder to run" at four times the resolution of 1080p? Great, thank you for following me on that baseline.
Next is to have you name one Assassin's Creed that runs at 4K60 on a 3090.
https://samagame.com/blog/en/assassins-creed-valhalla-doesnt-run-in-4k-and-60-fps-even-on-an-nvidia-rtx-3090/
https://www.reddit.com/r/pcgaming/comments/jpxoay/rtx_3090_cannot_maintain_60_fps_in_assassins/
https://www.techspot.com/community/topics/assassins-creed-valhalla-cant-hit-60fps-4k-even-with-the-rtx-3090.266155/
https://wccftech.com/assassins-creed-valhalla-cant-keep-4k60fps-on-an-rtx-3090-gpu/
And the next thing is to realize that current open worlds will never be designed to run perfectly at 4K60 maxed out on a 3090, because 3070 performance is "comfortable 1440p60 at maxed settings" in current games (3060 performance is not), and 3090s don't even offer double the theoretical TFLOPS performance of a 3070. Now, while it's still going to take a while for 3070s to become standard, development is already heading that way.
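(Rough numbers behind that - the pixel counts are exact; the FP32 figures are Nvidia's published boost-clock numbers, so treat them as ballpark:)

# Pixel-count scaling from 1440p to 4K, and theoretical FP32 throughput ratio.
pixels_1440p = 2560 * 1440      # 3,686,400 pixels
pixels_4k = 3840 * 2160         # 8,294,400 pixels
print(f"4K / 1440p pixel ratio: {pixels_4k / pixels_1440p:.2f}x")   # 2.25x

tflops_3070 = 20.3              # published FP32 boost figures, approximate
tflops_3090 = 35.6
print(f"3090 / 3070 FP32 ratio: {tflops_3090 / tflops_3070:.2f}x")  # ~1.75x

# So the 3090 brings ~1.75x the raw shading throughput of a 3070, while 4K asks
# for 2.25x the pixels of 1440p - that is the gap being pointed at above.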
When Sony, for example, specced the PC port of God of War, it came out to reaching 1440p60 with minor dips (57 fps) on a 3090, with DLSS enabled to scale the 1440p up to 8K. DLSS is a post-processing effect (mostly) handled by the tensor cores, so it would not normally impact core pixel fill rate.
https://www.techradar.com/news/you-wont-need-a-ps5-to-run-god-of-war-at-8k-youll-need-an-nvidia-rtx-3090
At 4K60 with max settings, that game also never reaches a stable 60 fps on a 3090:
https://www.pcgameshardware.de/God-of-War-2018-Spiel-57369/Specials/PC-Benchmark-Anforderungen-Test-Review-1386875/2/ (And that's a PS4 port from 2018. Granted, a souped-up one - but with new open-world designs, guess what happens...)
Basically, any game with open-world character that reaches the visual fidelity of a current-gen Assassin's Creed, God of War, or Sherlock Holmes Chapter One (?) already has no chance in hell of running at 4K60 (sustained) natively at maxed-out settings on a 3090. (It does with DLSS, which is a fancy post-processing scaling effect - one that's only available if Nvidia runs render passes for the developer on their GPU render farms, pre-release.)
And "but most games do" doesnt matter, if you had a generational switchover, and studios are targeting 3070 performance as 1440p60 with maxed out settings as a default. (No one tagets 3090 owners, because 1% ownership rate doesnt matter if you try to build the next generation of video games).
3090 owners should still be able to run games at 4K60 at PS5-level visual fidelity - but only as long as PS5 games target a locked 1440p@60. They already don't. In the future, they won't. And PC players will have the expectation that their games should look better than on console, which brings us to: no way in hell is the 3090 a competent 4K60 (with maxed-out settings) card for new-generation titles.
If you sold your left kidney for one, that's on you. The only thing I can say to make it hurt less is that Nvidia marketing really tried to sell this thing as a comfortable 4K60-performance-level card for the next (current) generation of gaming, when it was not.
https://www.dsogaming.com/news/modder-fixes-batman-arkham-knights-stuttering-framepacing-issues-via-directx-11-hooking/
edit: That game was developed on UE3, so it's a bit of a long shot, but hey... :)
edit2: Best video that shows off the frame pacing issues in this game so far:
https://www.youtube.com/watch?v=IwDICThngRk
Run on a Ryzen 5 3600. Running it on a 3700X or a 5600 might improve things a bit, or not. But here you can see that it's a frame pacing issue (frametimes), while average framerates are not impacted that much.
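(A tiny numeric example of what "frame pacing issue, averages barely impacted" means - both traces below are invented and have the same average frametime, but one is evenly paced and one is not:)

# Two invented 10-frame traces with the same average frametime.
even_ms = [20] * 10                                     # steady 50 fps
uneven_ms = [12, 12, 12, 12, 90, 12, 12, 12, 12, 14]    # same total, one 90 ms spike

for name, trace in (("even", even_ms), ("uneven", uneven_ms)):
    avg_fps = 1000 / (sum(trace) / len(trace))
    print(f"{name:>6}: avg {avg_fps:.0f} fps, worst frame {max(trace)} ms")

# Both runs average 50 fps, but the uneven one shows a visible hitch every time
# the 90 ms frame lands - that is the pacing problem, not the average framerate.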
edit3: From the first comment of that video:
"P.S: The game runs on dx12, although the system requirements only indicate dx11. During unsuccessful attempts to find a way to improve the stability of Sherlock Holmes, I discovered that the game also has a hidden version of dx11. To launch dx11, you need to use the command in the launch option steam: -nosettingslimit -dx 11. Dx 11 version works worse than dx 12. And it looks like it was hidden for this reason."
Thanks! It worked and now I can finally play the game!