Thing is, I've looked into it a bit, and the games around this one's launch that demanded the most from the CPU were strategy games like Total War: Warhammer, you know? For that generation, I very much doubt you'd have any problem at all with a Ryzen 5 3600 if you aren't GPU-bottlenecked, and you're not. These games put fairly low load on each core; the only thing that changed since the PS3 generation is the multi-threading trend, and even that doesn't need a high clock speed, so it's definitely not trash for gaming purposes, far from it actually.
The one I have is a bit more dated: a Ryzen 5 1600 with 16 GB of RAM (fairly fast sticks). Its clock speed is similar to my previous CPU, an Ivy Bridge i5-3470, which totally rocked in the PS3 era; it wasn't even close to the fastest of its time, but it still ran games from AC Valhalla's generation pretty well, just so you have an idea.
So, lacking any other suspects, it's pretty safe to say it's badly ported to PC. I came here to reply right after it crashed, in fact. It wasn't a file-integrity issue, because I verified the files recently and left them intact since. I tried deleting the cache folder in Documents, erased most of the save games, and disabled the Ubisoft overlay, and it still crashed at 30 FPS, without any noticeable struggle for the whole (relatively short) play session. I was monitoring it with RivaTuner, so temps, CPU and GPU usage, VRAM, and RAM were all working like a charm at around 40%.
Since there's no fix for bad game design, at least given this game's mod compatibility (which is none at all), I'm even considering uninstalling it. To be honest, it's not that great a game. It seems Ubisoft felt the need to merge AC, Dark Souls, The Witcher 3, and weird Monster Hunter mechanics into one mess of a game that doesn't even work and has no soul of its own. With the repetitiveness the series has always had anyway, it's feeling more and more like a waste of my time.
Venting aside, hope it helped.
Ryzen 7 5800X3D
32 GB DDR4 3200 MHz
RTX 3080
This game runs better than Starfield, and also better than Witcher 3 "next gen".
Uh, a 1660 is not a good GPU. That's a several-years-old low-end card.
I play with a 1660 Ti on max without any issues.
64 GB of RAM
i7-9700
So yes, your CPU is trash, and get at least 32 GB of RAM.
CPU dependency is low in most games; it's common knowledge at this point. I very much doubt you'd need an i7 if this were a decently optimized port, seeing as it doesn't rely on AI any more than what you'd see in GTA V.
I can agree on the GPU if the OP is using a >60 Hz monitor and not playing at 1080p, but as for the CPU, an i7 is literally overkill. Any decent GPU is far better at processing graphics than the CPU, so CPU load comes down in large part to the AI. Anyone can test that for themselves: if lowering shadows and anti-aliasing doesn't help in a game with low per-core CPU usage, the bottleneck isn't the CPU.
People can set it to whatever FPS they want; many animations and effects were made capped at 60 Hz, and some run at even lower framerates. Just look at a flock of birds flying at 10 FPS in the sky, cloth physics at 30 FPS, and the fire animations.
It's already been discussed, and there's a Digital Foundry video explaining it in detail, but it's always funny to see some shill clowns here say the game works for them at 100+ FPS. They need new glasses.
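For anyone wondering how an effect can stay at 10 or 30 FPS while the game renders at 100+: here's a minimal Python sketch of the usual fixed-timestep accumulator pattern. The tick rates and function names here are made up for illustration; this shows the general technique, not Ubisoft's actual code.

```python
import time

# Hypothetical tick rates; the game's real internal rates aren't public.
CLOTH_RATE = 30.0   # cloth physics stepped at a fixed 30 Hz
BIRD_RATE = 10.0    # distant bird flocks animated at 10 Hz

def step_cloth_sim():
    pass  # stand-in for the real cloth update

def step_bird_flock():
    pass  # stand-in for the real flock update

def render_frame():
    pass  # stand-in for rendering; runs as fast as the GPU allows

def game_loop(duration=1.0):
    cloth_acc = bird_acc = 0.0
    prev = start = time.perf_counter()
    while time.perf_counter() - start < duration:
        now = time.perf_counter()
        dt, prev = now - prev, now

        # Fixed-timestep effects only advance when their accumulator
        # crosses the tick interval, so they look identical whether
        # the renderer is producing 60 or 160 frames per second.
        cloth_acc += dt
        while cloth_acc >= 1.0 / CLOTH_RATE:
            step_cloth_sim()
            cloth_acc -= 1.0 / CLOTH_RATE

        bird_acc += dt
        while bird_acc >= 1.0 / BIRD_RATE:
            step_bird_flock()
            bird_acc -= 1.0 / BIRD_RATE

        render_frame()

game_loop()
```

That's why uncapping the framerate can't make those effects smoother: their updates are decoupled from the render loop entirely.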
And yeah, 120 FPS and the like mostly exist to meet responsiveness demands in competitive games; it probably started with FPS games on PC, while console gaming stayed practically the same performance-wise, targeting 30 or 60 like you said.
But you can see it with AC IV, for example: when games are clearly PC ports that were 100% designed for consoles, publishers tend to want to hard-lock the FPS at 30 or 60, and they get launched on PC because "why not?". Not that it's a bad job in general, but with Assassin's Creed it was always a pain in the ass, the more console-like it got in that respect.
The same goes for crowded places and environment-heavy scenes: you could see it in Origins (Alexandria is often brought up) and Odyssey, where the framerate drops to 45 FPS. The "your machine isn't powerful enough" argument only reinforces the point, because if they were aiming at a console-like experience at 60 FPS and 1080p, you'd hardly need a better GPU than a GTX 1060 or equivalent, and you don't need to max everything out. And if the public really cared about hyper-quality more than performance, the games wouldn't have dropped the super-sampling AA option, and Valhalla's default AO technique is clearly plain SSAO, despite not being changeable.
Even early next-gen games are playable at 60 FPS on lower settings if they're optimized well enough, like The Witcher 3 after its performance patches. I haven't played Baldur's Gate 3, but it's a similar story, and you can still have fun without ray tracing. This sounds more and more like toxic PC-master-race garbage-posting to me, a crowd wank.
I've streamed the game for more than 150 hours on Twitch, and the most stable setup was with the game capped at 55 FPS at 1080p, using the performance settings described here.
https://www.youtube.com/watch?v=m9LgfTr517c&t=767s
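In case it helps anyone reproduce the cap: limiting slightly below a round number (55 instead of 60) presumably leaves headroom so frame pacing stays flat. Here's a rough Python sketch of what a sleep-based frame limiter does; this is illustrative only, not how RTSS or the in-game limiter is actually implemented.

```python
import time

def run_capped(frame_fn, target_fps=55.0, duration=1.0):
    """Sleep-based frame limiter sketch. Real external limiters are
    more precise (e.g. busy-waiting the last fraction of a ms), but
    the idea is the same: absorb frame-time variance with idle time
    so each frame lands on an even schedule."""
    frame_time = 1.0 / target_fps
    start = time.perf_counter()
    deadline = start
    while time.perf_counter() - start < duration:
        frame_fn()                          # render one frame
        deadline += frame_time
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # wait out the rest of the slot
        else:
            deadline = time.perf_counter()  # fell behind; resync

run_capped(lambda: None)  # stand-in frame function for the sketch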