The PlayStation 5 is roughly equivalent to an RX 6700 XT or a 3060 Ti/3070, all of which are a little better or worse than the 1080 Ti, give or take a few percent. The 1080 and 1080 Ti were monstrous cards.
I still love the game and I'm still playing it.
But I have a 9850X3D with a 4080 Super and 32GB of DDR5.
I still have massive performance issues compared to how well it SHOULD run: I'm only able to get 90-120 FPS with Frame Gen and DLSS Quality, ray tracing OFF, at 1440p.
Meanwhile, I can play Cyberpunk with over 450 mods adding 4K textures to everything, path tracing maxed, and some of the best-looking graphics I've ever seen in a game, and I can still get 120+ FPS at 1440p.
This game is very poorly optimized: muddy, shiny, with very bad frame-time issues and stutters. They're lucky it plays great; otherwise this would be even more unacceptable.
The problem isn't just how it runs. It looks bad, runs poorly, and basic things are missing, bugged out, and not functioning properly.
Not true. I have a 4080 Super and the high-res texture DLC does not fill the 16GB of VRAM. And even when I drop my textures to Low, I gain maybe 10 FPS.
If you record your screen at 4K in base camp while rotating the camera, the FPS counter will say XXX, but the REAL frame rate, with frame gen off, is lower.
You can see microstutters on the highest DLC textures with a 4080 Super, even if the framerate counter isn't showing it, again at 4K resolution.
10 FPS is roughly a 10-40% frame-rate increase, by the way...
That's the biggest performance boost you can get from the settings.
I also have a 4080 super.
Stay delusional though, the game is perfectly optimized and their own anti-cheat really is not the issue, as you can see. /s
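For what it's worth, the "10 FPS is roughly 10-40%" math above checks out: the same absolute gain is a different relative gain depending on the baseline frame rate. A quick sketch (the baseline values are illustrative, not measurements from the game):

```python
def relative_gain(baseline_fps: float, gain_fps: float = 10.0) -> float:
    """Percentage frame-rate increase from an absolute FPS gain."""
    return gain_fps / baseline_fps * 100

# The same +10 FPS is +10% at 100 FPS but +40% at 25 FPS.
for baseline in (100, 60, 25):
    print(f"{baseline} -> {baseline + 10} FPS: +{relative_gain(baseline):.0f}%")
```

So the 10-40% range corresponds to baselines between about 100 FPS and 25 FPS.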
Nice try!
Just compare it with Forza Horizon 4, a 7-year-old game,
running on, for example,
an i7 3770 / Xeon E3 1230v2, a 13-year-old CPU architecture,
and a GTX 1070, a 9-year-old architecture,
doing 1080p at a minimum of 60 FPS, all settings maxed,
and looking better than MH Wilds at all settings maxed,
while Wilds needs around 3x as powerful hardware, plus frame gen and upscaling, to even get to 60 FPS at 1080p with similar looks, and even that doesn't equal Forza's max settings.
Oh, and have fun comparing the open world of FH5 at maxed settings with MH Wilds.
I do not mind whether it is unoptimised code,
faulty code,
paranoia-mode copy protection,
or paranoia-mode anti-tamper to protect the sales of freighter-loads of cosmetic DLCs...
Oh, whatever it is
that ruins a game that could otherwise be a superb one:
stop treating PC gamers as criminals, as enemies, if you want to sell games to them.
Learn from Hello Games, just as one example.
Of course the OP is, they're just farming for points... don't feed them...
Here I am with a 2080 Ti, getting 60 FPS on high settings with ray tracing at 1440p, because I spent the time modding the game to actually run properly... DLSS4, DirectStorage DLLs, REFramework, OptiScaler, etc.
This crap is inexcusable from Capcom, and it's clearly not optimised, since I can basically resolve most of the issues with these tweaks.
Also, regardless of all this, the textures and environment detail in this game still look like crap, so your shilling that it's a modern game and needs a modern GPU is BS.
Think, just because it's a new year with no increase in fidelity or graphical quality,
that all this hardware just "becomes outdated" 😂
I swear the human race is getting dumber with each generation.
Any difference between using the DLSSG-to-FSR mod compared to OptiScaler?
For the latter, the dxgi.dll hinders me from using ReShade packs that use their own DLL under that name too.
Nvm, I am stupid; you can choose to use a differently named .dll for OptiScaler.
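For anyone else hitting this: the workaround above amounts to renaming OptiScaler's proxy DLL so that dxgi.dll stays free for ReShade. A minimal sketch, assuming OptiScaler can also be loaded under another common proxy name (winmm.dll is used here purely as an illustration, and the game folder is simulated with a temp directory so this is safe to run anywhere):

```python
import tempfile
from pathlib import Path

# Stand-in for the game's install folder (illustrative, not the real path).
game_dir = Path(tempfile.mkdtemp())

# Pretend this file is OptiScaler's dxgi.dll.
(game_dir / "dxgi.dll").touch()

# Rename it to another proxy name so ReShade can ship its own dxgi.dll.
(game_dir / "dxgi.dll").rename(game_dir / "winmm.dll")

print(sorted(p.name for p in game_dir.iterdir()))
```

Check OptiScaler's own documentation for which alternate filenames it actually supports before renaming anything in a real install.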