RDR2 is completely different and looks way more stunning than any other video game, except maybe Star Citizen, because it pumps all the effects up to 11.
No, that's terrible. Some of us buy nice hardware because we want a good gameplay experience. I know my rig isn't top of the line anymore, but it was top of the consumer line one generation ago, and I get around 100 fps in all the latest AAA titles set to ultra at 1440p.
If RDR2 AUTO DETECTS that settings should be on ultra, but the game is running at 30-50 fps, then something is wrong with the game.
You just described my system, except with 64 GB of RAM and the i9 9700, so I'm not having any trouble at all.
The game was designed with the future in mind; the hardware to run it at max settings at 4K 60 fps doesn't exist yet.
Where does it say that in-game or in the menus?
Why would any game say "hey, these settings can't be run well yet, the hardware doesn't exist"? That's just the way it is.
Someone else posted a good article by people who thoroughly tested every single setting and came to the conclusion that the game is so advanced graphically that it can't be run with everything maxed out on current hardware, and that it was designed with the future in mind. I doubt you will read it, and even if you do, you will still deny that the game is well optimized, but in case you do decide to actually read it, here's the link again.
https://www.eurogamer.net/articles/digitalfoundry-2019-what-does-it-take-to-run-red-dead-redemption-2-at-60fps
I'm sure you have tested the game as much as these guys have and are just an expert on optimization.
Seriously though, you guys who bought 1080s and 2080s and are mad you can't just ram ultra on this game like you do on every other game are just morons. Actually look at the graphics, stop and think about what it takes to make a game look this crazy, and then think about what it takes to make it still run well on top of that.
Don't want to read? Here's a video by the same people.
https://www.youtube.com/watch?v=RUrbciPfX6k
The assumption is that any game is designed to work with the current generation of hardware. So if a game breaks that convention, there should be some kind of notification saying *hey, some of these settings are experimental and are not designed for current-gen hardware*.
I know I've seen it in at least a couple of games; Kingdom Come: Deliverance comes to mind.