And I locked the FPS at 90.
It never dropped below 88 FPS, and that's only because of the cap; it would have been running higher otherwise.
Specs are a 12700K, 3080 Ti, and 32 GB of RAM.
Oddly enough, running at a regular 60 FPS on high graphics worked fine for me for a few hours, but the moment I got to Apollo's Temple I started getting black screens because the display driver stopped responding. Now it seems to happen in every cutscene. The only upside is that I no longer have to reset my computer after adding the TdrDelay value in regedit, but the game does indeed appear to have a lot of issues :(
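If anyone wants to try the same tweak without clicking through regedit by hand, here's a rough sketch of it in Python (my own example, not taken from the post above): it assumes Python 3 on Windows run as administrator, and the 10-second delay is just a placeholder value you can change.

```python
# Rough sketch of the TdrDelay tweak mentioned above.
# Assumptions: Python 3 on Windows, run from an elevated (admin) prompt;
# 10 seconds is only an example value, not something recommended in this thread.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    # TdrDelay = how many seconds the GPU may stay unresponsive before Windows
    # resets the display driver (the Windows default is 2 seconds)
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 10)

print("TdrDelay set; reboot for the change to take effect")
```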
Even the MSI Afterburner graphs clearly show the game isn't straining my GTX 1060 at all.
I knew I wasn't the only one, but it's nice to get confirmation - every AC game runs like ass for me too (I:FR does as well), yet Valhalla runs like butter.
Every AC game, from Unity to Valhalla, uses the same engine. So if Valhalla runs so well, I really don't understand what they did to this one.
I think most of you already use dual channel, so I don't know what's going on, but for me, as a laptop user, it feels like I'm playing on a completely different PC. So if you're a laptop owner and you have an empty RAM slot, think about upgrading.
Does that change what I said? Not really. I still think Anvil isn't a very well-optimized engine, and Ubisoft isn't well known for its PC ports.
I'll probably test AC in the future, but that game is too big to download again, so I'm not sure if I'll test it soon.
Edit: system specs are a Ryzen 9 5950, 64 GB RAM, and a 3070 Ti on the Linux system,
and an Intel 10850K, 64 GB RAM, and a 3080 12 GB on the Win10 system.
Settings are Ultra on both systems. Performance is comparable: definitely smoother on the higher-refresh monitor at 1440p as far as FPS goes, but I'm not seeing lag or tearing at 60 FPS on the 4K screen at all.
My take on this game is that it scales fairly well across the Switch and PS4/Pro/PS5, and it really can tax a 2nd-gen RTX card at 4K unless it's a 2070/80-series part. It's definitely making the 3080 work hard at 4K. I wouldn't expect my 3070 Ti to do as well as the 3080 at 4K; the sweet spot for these cards really seems to be 1440p with high refresh.
As far as previous AC games go: I have zero issues with Origins (it runs smoother) and Odyssey (more refined than Origins) on both Linux and Windows. I can't get Valhalla to run on my Linux box; I'm not sure if it just can't handle the Nvidia driver or if it's making calls to video card libraries that simply aren't in Proton 7. On Windows 10, Valhalla runs very well.
I think so too. Every Ubisoft game is CPU intensive because they're mostly open-world games. Like I said, after putting more RAM in my laptop in dual-channel mode, the FPS drops stopped. I always used Afterburner to monitor my GPU and CPU usage, but I never saw how much RAM these Ubisoft games use. After adding more RAM, games like Assassin's Creed Odyssey, Origins, Far Cry 5, God of War, Cyberpunk and Ghostwire Tokyo all improved in performance.
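For what it's worth, if you want actual numbers on how much RAM a game is grabbing (something the default Afterburner overlay doesn't show), here's a quick sketch using Python and psutil; "Game.exe" is just a placeholder, swap in whatever the game's executable is called.

```python
# Quick sketch: log overall RAM usage plus the game's own memory use every 5 seconds.
# Assumptions: Python 3 with psutil installed; "Game.exe" is a hypothetical process name.
import time
import psutil

GAME_EXE = "Game.exe"  # placeholder, replace with the real executable name

while True:
    mem = psutil.virtual_memory()
    game_bytes = sum(
        p.info["memory_info"].rss
        for p in psutil.process_iter(["name", "memory_info"])
        if p.info["name"] == GAME_EXE
    )
    print(f"system RAM: {mem.percent:.0f}% used | {GAME_EXE}: {game_bytes / 1024**2:.0f} MB")
    time.sleep(5)
```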
This likely has more to do with your memory stability than with your GPU load.