https://youtu.be/BHEXhiRZuuE
Unfortunately, I have a 1080p monitor, so while I can run 1440p using AMD VSR, I can't trigger RSR without setting the in-game resolution to 1024 or something. So it doesn't work in my situation, but it's cool nonetheless.
Yes, the game doesn't have that option, so I had to use the driver option. Maybe try setting 720p in-game and see if that works? I remember someone upscaled Cyberpunk from 720p to 1440p and it still looked tolerable.
How is performance on the campaign map?
Yes, campaign FPS at ultra averages 40-45-ish, but with a lot of fast scrolling it easily drops to 20-25 here and there. I've said it before, but separate settings for campaign and battle would've been nice. I'd happily have the former at low-mid and the latter at high-ultra (tweaked), where it's at least 60 FPS and more stable at ultra, etc.
I actually also play this on a 3070 Ti laptop at 1440p, and there, while the FPS counter doesn't register those campaign map drops as much (a monitoring run with something like HWiNFO would probably tell more), there is noticeable judder even down at medium settings with the framerate capped. Again, not much of a surprise, as the 6800 XT uses a whopping 13-14 GB of VRAM... and I'll imagine/assume usage can easily land between that and the 3070 Ti's 8 GB at 1440p. But it's a gaming laptop, and having had a few before, I'd be tweaking settings and capping FPS anyway to keep heat and noise down. Even more so here, though, separate settings would be great.
I guess I'll try RSR on the 6800 XT (when I have the time), though tbh I've not used any upscaler yet, having made sure to buy a GPU with more than adequate VRAM for the resolution. Nor do I know whether RSR (or even Sapphire's TriXX) will scale OK with an ultrawide 21:9 resolution, as all the options I've seen are 16:9 ones. But yeah... 60 FPS or so at ultra for battles is fine and dandy (I had half that for some of the lifetime of WH2), and high is as good for 10 FPS more, but that campaign performance is awful. Even if it were 30 FPS and at least as stable as the battle side I wouldn't mind half as much, but jumping up and down like it does on the desktop, or juddering on the laptop... ugh.
I also tried RSR with the game but that has the problem of not allowing Radeon Chill at the same time so the temperatures spike up quite a lot on the campaign map. With the built-in FSR 1.0 you can use Radeon Chill.
Though I ended up running the game at max settings at native 1440p with Radeon Chill set to 30-60 to cool things down.
I haven't used Radeon Chill yet with this one (can't recall if I have for anything else, but week to week I only tend to have a couple of games on the go). Still, average temps in AAA gaming run 65-75C on the 6800 XT, usually not more than 70C at ~99% usage, with 12 GB of VRAM used on average so far (at 3440x1440).
However, haha... this game will run up to 80C (100C or so hotspot/junction) with 13.5-14 GB used (and near the same for RAM, fwiw) at ultra. Meanwhile the 5800X is sitting at 20-30% usage at 4.85 GHz and around 70C, cooled by a 240mm AiO with pump and all fans on quiet. Well, so much for the old idea that TW games are more CPU-heavy than GPU hogs, I guess. I do know from earlier testing that there's nearly no difference between my gf's 5600X and my 5800X, whether paired with my 6800 XT or her 3070. For most games the CPU and GPU match pretty evenly, for temps at least if not % of utilisation, though I've had what you'd assume are less CPU-/more GPU-heavy games hit more than twice the CPU use of this one.
Tbf, I don't mind the temps overall, though between 2021 and now they've gone up some 5C or so across the average of games I play, probably as usage has risen to match requirements. Never mind, of course, that most AAAs, bar well-known outliers, run some 20-40 FPS faster. Again, I'll try Radeon Chill as well as RSR and other options, but it's not a dire necessity just yet.
I can mitigate the desktop temps easily enough: upping the fans/AiO pump to normal, capping FPS, etc. No biggie, though ideally (for this or any other game that's not such a heavy load/poorly optimised) 75C would usually be my limit. I've decided to tweak settings down from straight ultra and cap FPS a little more to see how it goes; 45-60 FPS and/or high/mixed settings should be OK and drop temps to 70C or lower.
However, getting temps to stick to 75C on the laptop takes way more cutting back.
On the bright side, it's a 1440p 15.6" screen, so amazingly sharp and still better than, say, the 1080p 17.3" panel the last laptop had. Tbh, had this model (a Lenovo Legion 5, btw) come with a 1080p screen, it would've been a better fit for the 3070 Ti (the full-power laptop version, yes, but still more like a 3070 at best). However, I got this model (6800H, 3070 Ti, 16 GB DDR5-4800, 1440p 165Hz) for up to £300 less than any other Legion 5 model that had a lesser CPU, GPU, screen, or a mix of all three, so it is what it is.