I don't have the game on Steam; I played the DRM-free version (GOG - should run better, if anything). My PC is a 6700K build: i7-6700K (OC @ 4.7GHz), 16GB RAM @ 3200 CL16, 950 Pro NVMe SSD. When I first played the game, I was using my 1080 Ti Amp Extreme, which I've used to play just about ALL of my games at 4K/60/vsync for the last 5-6 years (I don't play many AAA games - mainly A/AA indies, but a few bigger ones like Witcher 3/RDR2/GoW). After seeing the "Recommended" specs of an i5-6400 (non-K) @ 2.7GHz plus a freakin' 1050 Ti, I was like, "Hell yeah! For SURE I'll be able to play this one at 4K as well! If I can play Witcher 3 next-gen edition at 4K/high/60fps + vsync with Quality FSR, as well as Pathfinder 1 at 4K/60/ultra and Pathfinder 2 at 1440p/60/ultra, then surely I can play THIS at 4K... right?"
I set the game to 4K/high (minus reflections) with Quality FSR - it looked great for a few moments, until the camera panned down to the huge room and the framerate briefly dropped to the 40s. Up on the first platform I was getting 60fps, and it looked fantastic walking around. Then, boom, as soon as we get to a "cut-scene" (basically a brief automatic camera movement minus the HUD), it drops to 30. Once combat starts: constant drops to 45/30/50, on each turn swap and each action taken. I turned settings down incrementally until I was at 1440p medium with FSR. No dice - 30-50fps whenever there's any action whatsoever or combat starts. I stopped playing, as I was getting a new card soon.
Fast forward to last night. I installed my 7800 XT in the same system (replacing the 1080 Ti). Oh, boy. Doubled my FPS (almost literally) in every game I tried so far - TW:WH2, W3 next-gen, 3DMark TimeSpyExtreme, RDR2, etc. There were some games I was SURE I was probably going to hit a CPU issue in, but nope, even with vsync off and at 4k my fps was going into the 100-180 range in stuff like W3 nge. I was SURE I was gpu capped in Rogue Trader, as it was HAMMERING my 1080 ti while cpu was just getting hit avg, like in most games I play. NOPE -> Same exact issue as OP. SAME. EXACT. Result/reaction.
What is going on? I am anal about keeping my system squeaky clean and lean. All updates are done. Drivers, etc. All good. What is it about this game, and WHY on earth are the requirements so LUDICROUSLY low in comparison to most other modern games of the same type? BG3 recommends an 8700K + 5700XT/2060Super and my 1080Ti+6700K was playing, with FSR, at 4k/60/high (minus a few tweaks like reflections).
This is a crazy long post, and I apologize for writing a dissertation on the subject, but I'm still new to WH and especially WH40K, and I really, really wanted THIS game to introduce me to the world. At this point I'm convinced there must be some 3rd-party tool or obscure (seemingly innocuous) setting that's interfering, and I cannot figure it out. It MUST be... right? Both the OP's system and my own TROUNCE the recommended specs, yet we can't play at 1440p/high WITH upscaling and expect a solid 60 in a game that (as both the OP and I have described it to others) is similar-looking to Torchlight 2 in many ways? (Literally - I said this to my brother weeks ago when I first played it and hit these issues.) I don't have these issues with either of their Pathfinder games, even at much higher settings - and at least the 2nd one is better-looking.
EDIT: I'll quickly explain my reasons for not going to 1080p. First off, I use a 55" (4K) television as my display, and 1080p does not look right in 99% of the games in which I've tried it. Secondly, I have a freakin' 7800 XT and a CPU well above the recommended power. ;)
They are just there to help people considering a game purchase to rule out any game that will perform badly on their computer - and when I say "badly" I don't mean lower fps than desired or hot temperatures, I mean unable to display the game or GPU crashes.
If your GPU is better than the recommended GPU, then you should have a better-than-average experience with the game, but if the game is just not running well on any GPU, then you might still have an experience that you do not find satisfactory.
I didn't notice any posts lauding the optimization of this game... but then I don't read reviews, so maybe that's why.
If I had read them, I would have responded to laugh, because no, this game is not well optimized.
That said, I'm playing and enjoying it just fine on an 11-year-old computer with a 4-year-old GPU. I don't have the graphics settings maxed, and I'm not trying to play in 4K, because I know my system can't handle those settings. People with newer computers and newer GPUs will probably be able to handle higher settings and resolutions than I can, but there's going to be a limit no matter what hardware you've got.
The key is to dial in on that limit and play under it. Trying to play above what your system can handle is futile.
It is fair to give Owlcat feedback on the performance of their game, but I wouldn't hold your breath for performance optimizations in the short-term. They are still trying to fix game-breaking bugs and broken quests.
Also, both I and PROBABLY the OP are well aware of what "recommended" means (or can mean)... but I have been playing games for quite a while, and I have HUNDREDS of games on Steam and many, MANY more off of Steam. I have a really good idea of how a game "should" play on my system based on recommended specs, reviews, footage, and other factors (engine used, developer, etc.). There is something wrong here.
You mentioned not noticing the plethora of comments praising this game's optimization. Those comments are the main reason I am even posting this. That, and, again, the recommended hardware the devs listed. This isn't a case of being borderline - those specs are WAY below what we're using.
And players certainly CAN do something. If it IS some 3rd party tool or obscure setting that happens to be causing this (very rare) type of issue, then anyone can provide a "fix". I'm not saying that's what's going on here, but I do at least somewhat suspect it. You never know, and it can't hurt to ask. Maybe there IS a "fix" and we/I just haven't seen it yet...
Amazing. Specs?
As I said, the most recommendations can tell you is that the game won't run, or will run badly. They can't tell you that the game will run well, nor can they tell you how well it will run.
16GB RAM, Nvidia 1080 6GB
I can run Titanfall 2, but 20 exploding Nurgle cultists and the game is running like a potato.
If it helps, I don't use GeForce Experience, just the drivers. I have a sense that it causes more problems than it fixes, but YMMV. I stopped using it very soon after it was first introduced, because it was causing some complication for me (though I don't recall what, since it has been many years).
I'm now on the 7800 XT, though, and I AM using their FULL driver software package (in all of its ~2.5GB glory - and that's in active memory). I'm in a bit of a bind: in no way, shape, or form do I want to keep using this going forward, but without installing the FULL suite, there seems to be no way to get many of the ACTUALLY useful options that program includes. I'll just have to find alternatives, or use a driver slimmer and continue with Afterburner.