Even though it is doing some pre-caching on first launch, it was all of 10 seconds' worth, so there's no way it compiled all of the textures.
I'd rather wait 10 minutes and have it go through and do that once on first launch, and there would be no stutter. Same with the latest Jedi game.
Comparing FSR3 quality mode and frame gen vs DLSS quality and frame gen; I'll see if FSR is any better.
It seems to be a major issue with EA PC ports using Unreal engine.
Trying now with E cores disabled to see if that helps, and maybe disabling Nvidia Reflex too.
Sigh.
I mean I won't even get to play the game in the 2-hour test period as it will all be troubleshooting LOL
I now have the following settings:
Ultra everything, even though it claims it needs 900-and-something points and my GPU is underpowered at 800-and-something. Nope, works rock solid.
4K borderless window
DLSS quality mode
DLSS frame gen on
88 FPS average with a 4090M, always over 77 ish.
With Frame Gen off, the average is 62 and the lowest I saw was 57. If I drop everything to high at 4K with DLSS quality, it would be fine and well over 60 at all times. I tried 1440p out of curiosity and the difference was massive, and it still looked WAY better than the console versions; I was in the hundreds and at points maxed out my refresh rate of 144.
Just followed the lady in the first act and went a few minutes further than before with no issue (went further to see if new areas would cause stutter).
A bit of a ramble about consoles vs PC if you care to read on:
I find it interesting how a current console cannot maintain 60 FPS in this game, with much lower than ultra settings (a mix of medium and high) and FSR2 performance mode, literally a 720p base resolution. And it's not even upsampling to 4K all the time; it's dynamic, and FPS can go from 40 to 60.
Now yes, my laptop was a lot more expensive than the current consoles, but I did some digging and you can get a laptop that matches the consoles in every way performance-wise (never lower performance, often better) for not *that* much more, if you are willing to run 16GB of RAM, a 1TB SSD and say a 4070M, or for even less money if you get a 4060M and say an 8-core CPU rather than a 16-or-more-core one like mine, which is 24 cores/32 threads.
The 4060M is around 3 to 5% more powerful overall than a desktop 6650 XT, which is the equivalent of what the consoles are using. Even the 4060M can do everything a PS5 and Series X can do. I was able to match every setting and never have worse performance than the PS5 with my previous 3070M Legion laptop too, and that was outdated when I bought it.
BUT... for that extra cash, you also get a full computer, an included display, true portability, choice of any controller you like, a gargantuan game library, STEAM, and so on. The benefits of a computer are huge vs a console. And the current consoles have proven themselves to be SO grossly underpowered for any AAA title from 2021 onwards (none do a native 4K 60 except for maybe one or two titles in the entire AAA library LOL, and 4K 60 was supposed to be the norm for them). They were only really useful for 4K/60 gaming in previous-gen console games, as an upgrade.
And this seems to be the trend: next gen will claim 4K 120 or 8K 60, but it will only do it for current-era games at best, and so on.
I hadn't bought a console since the PS3 many, many moons back, and then suddenly got a Switch, used it for a year, and now I don't even know if the battery charges anymore; once I moved to Windows from Mac, I never turned the Switch on again. I have ended up double dipping on almost every game I had on Switch, as the experience is so much better on Steam, and for the exclusives I use emulation anyway (as long as I own the original cart, obviously, or eShop license) so I can play them in 4K, and many 30 FPS games have 60 FPS mods. Emulation for Switch is a far, far superior way to play any emulation-compatible game than the real console. Go figure.
Then like an idiot, 4 years later in 2021, I bought the PS5 and Series X. However... the Series X has Rare Replay, which is a huge thing to me, not available via cloud streaming, and something I will never ever want to lose. So for that, it's worth it. And I can almost justify the PS5 for Demon's Souls and a few other rarities, but even those I remote stream to the laptop. Seriously. And there's not many of them. I mean even PS5 exclusives that I get for "free" in PS Plus, I bought on PC anyway (if there is a PC version, obvs) as they play so much better.
Was playing Sackboy earlier; it was far superior to the PS5 version as I was able to lock 120 FPS and 4K (yes, with DLSS quality mode), while the PS5 version has no ray tracing and is 1600p-ish. No contest. Just an example.
My whole point is that I am struggling to see the worth in the current consoles. I think if they were say, 399 AUD, tops, instead of 800 AUD, yeah, I can see it. But I see laptops on sale for 1200 AUD that can match or beat them and it doesn't make sense to me at all.
Then I saw one the other day for 1299 that would obliterate the series X even.
I mean when you are spending 800 bucks on a console, it's not that much more to spend 1300 and get an entire turnkey computer, IMHO anyway.
Why do people still go crazy for consoles like they do? Is it because they make silly decisions like I did when I bought them? Or maybe the plug and play nature perhaps?
Anyway, way off on a tangent there, but seeing this game, Aveum, in particular, and just how horribly aliased and truly awful it looks on consoles vs PC, I had to get it all out!
I'm lazy and on Big Picture; if the fix isn't conveniently in Properties (like Atlas Fallen) I'm screwed lol. It's my day off, I'm not leaving this couch.
-force-d3d11
In the launch options.
From my understanding it switches the game to DirectX 11.
Well, the game is running smooth now, and I turned my graphics settings back up. Had the same issue with Atlas Fallen and it fixed it; guess DirectX 12 is doodoo for some stuff.
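For anyone wondering where that goes: in the desktop client it's right-click the game in your Library > Properties > General > Launch Options, and you paste the flag (-force-d3d11) into that box. Treat that as a rough pointer, since the exact menu labels can shift between Steam client versions. You can reach the same Properties screen from Big Picture with a controller and type the flag with the on-screen keyboard, though I'm less sure of the exact path there.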
The joke's on you. The vast majority of stutter posts that I've seen on the internet (Steam reviews or any other site) contain an AMD CPU. I've had only Intel CPUs and I've never had stuttering on any of my systems.
Not really true as a general rule with Intel vs AMD.
I haven't had an AMD CPU ever, except for my handheld Ally, and I've only had that for all of 2 weeks.
I currently have a CPU nearly 3x as powerful as yours, with all 8 P cores (13th gen, DDR5, 64GB) sitting at 5.3GHz under load with this game running, and all 16 E cores sitting at 4GHz without a hint of power or thermal throttling. GPU-wise we are tied, 4090M vs your 4070 Ti desktop, but I do have more CUDA cores and RAM, and in this particular game my score is equal to a desktop 4080 (I have the 4090M), as I have tested it against multiple 4080 benchmark videos for Immortals of Aveum.
Watch the Digital Foundry video on super powerful hardware that makes both of our systems look humble, and how this game stutters. Not egregiously, and not shader compilation stutter like typical UE games, but it's there in certain scenarios, and after more exhaustive testing I am experiencing it the exact same way they are. It's not game breaking by any means, but it's there.
Unreal 4 and 5 games are notorious for stutter, on even the most powerful PCs. There are also some games, e.g. Jedi Survivor, that cannot run smoothly on ANY PC at any setting.
If you want to get technical, AMD CPU, GPU and chipset combos generally have way lower DPC latency than Intel/Nvidia combos, and low DPC latency is actually a big plus for stutter control: it stops any one driver taking away so much time from another that there is a momentary dropout lasting a microsecond or so. You can use LatencyMon to see how yours fares; at least half an hour of running in the background is needed for it to be reasonably accurate.
PCs are fickle. Sackboy was supposed to be one of the worst stutter fests on PC, to the point it was unplayable and not worth having over the PS5 version even with the ray tracing upgrades, yet for me it's been butter since day one. YMMV. It's components. It's PCs. ♥♥♥♥ happens.
All that said, you were clearly being baited with a troll post specifically trying to get a rise out of Intel users, so I probably would have answered the same. I have learned to just mute and ignore TBH.
13900k and RTX 4090. Getting framerate drops/stutter a LOT.
Intel 13700K(F)
32GB RAM
Nvidia RTX 4080
My tips: have Nvidia ReBar (Resizable BAR) on; that fixed it for one user.
Don't let your CPU be the bottleneck. Use e.g. DLSS, or a frame rate cap, if your CPU is maxing out its threads.
Make absolutely sure V-sync is turned off in the settings menu. In Settings > Display, double check V-sync is off. V-sync somehow manages to turn itself back on after I try frame generation.
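For the frame rate cap, if the game itself doesn't offer one, the NVIDIA Control Panel has a Max Frame Rate option under Manage 3D Settings (the naming may differ slightly between driver versions), and RTSS can do the same thing.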
Well, you see, the issue with that is I have a Ryzen 9 and I'm having the same issue.
This helped, thanks, running smooth now
If you force DX11, does that turn off ray tracing? Or is software RT not specific to DX12?
I thought IoA had RT on all the time, even on lowest settings.
For example, I only see RT options when I launch the DX12 versions of games that have it and that run with both DX APIs.
This is why I run Ghostrunner 2 in DX11, cause DX12 has stuttering, especially if RT is on. DX11 is smooth as silk and I don't want to be tempted to turn on RT lol.
What do you mean you used the controller and typed it in from the launch options? What launch options? I'm trying to find out what you mean here but I'm not following.