Yes, I paid attention. You simply did not understand me. So I will rephrase my statement.
They ask for an RTX 3060 Ti for 1440p 60 fps with no RT,
an RTX 3070 for 1440p 60 fps with RT High,
and
an RTX 4080 for 4K 60 fps with RT Very High.
The 3070 is about 15% faster than the 3060 Ti. The 4080 is about 94% faster than the 3070.
Why do they ask for a 4080 GPU to enjoy a 2-year-old game?
Why are the RTX 3090 or RTX 3090 Ti, which are also extremely strong GPUs, not good enough?
Do you get my point now?
I wonder how much of that is genuine system needs that were really tested, and how much is propaganda to sell RTX 4080s.
I am sure techpowerup.com will test the game on a wide range of cards, and then we will see. I think the 4080 is overkill for this game, but we will see.
To be fair, 4K is a lot more pixels to render than 2K, about 4x more lol. To speak on the 3090 Ti, a 4080 is about 20-30% faster depending on the benchmark used. You can run the game at max with a 3090 Ti, possibly even on a 3090. Maybe a 3080 Ti? Though we are talking true 4K, no upscaling, at 60 fps with every setting maxed. With upscaling you can get the same results with lesser hardware.
Not 4x, but about 2.2 times more. It's the difference in pixels between 1080p and 2160p that is four times.
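For anyone who wants to check the math, here is a quick back-of-the-envelope sketch (assuming the usual 16:9 resolutions: 1920x1080 for 1080p, 2560x1440 for 1440p/"2K", 3840x2160 for 4K):

```python
# Raw pixel counts per frame (assumed standard 16:9 resolutions)
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_2160p = 3840 * 2160   # 8,294,400 (4K)

print(pixels_2160p / pixels_1440p)  # 2.25 -> 4K is ~2.25x the pixels of 1440p
print(pixels_2160p / pixels_1080p)  # 4.0  -> 4K is exactly 4x the pixels of 1080p
```

So the "4x" figure only holds if you compare 4K against 1080p; against 1440p the jump is roughly 2.25x.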
It's not just resolution. The raytracing efficiency of 4XXX series hardware is much improved over that of 3XXX series hardware, just like 3XXX versus the 2XXX hardware of yesteryear. The real question is whether or not the very high/ultra settings will actually be noticeable enough in-game to justify the framerate hit, versus needing side-by-side uncompressed screen grabs at 500% magnification and a dedicated pixel hunter to notice. Too many games' ultra settings don't justify the framerate hit versus running just one notch down to high and getting a higher, more consistent framerate at your chosen resolution.
-A 4090 owner
RTX 4080 RT cores are considerably more powerful than RTX 3080 RT cores. Approximately 50% faster gen on gen.
Also take into account that developers' system requirements are usually inaccurate. You generally need more GPU power than they quote.
You're also asking, "So they ask gamers for a 3070 for "amazing RT" ...but if you want to play with "ultimate RT" you need a 4080??? Why such a *HUGE* gap in requirements?"
It's going from 1440p High RT to 4K Very High RT. Why are you even confused by that? You're going to need way more RT power just to jump from 1440p to 4K. Added to that, the RT setting is higher in the 4K recommended specs.
It's because Raytracing High offers raytraced reflections.
Raytracing Very High offers reflections at a higher resolution AND additional raytraced shadows and raytraced ambient occlusion.
Therefore High MINIMUM 3070 (4K@30), Very High MINIMUM 4080 (4K@60).
NIXXES gives fully understandable information.
SELF-EXPLANATORY
Remember, the spec sheet is one thing and in-game feel is another.
The game is made to scale with current hardware and future hardware too.
We are entering an era of full console ports to PC with graphics that scale to future hardware.
Either way, if they wanted to squeeze the absolute most out of today's available hardware, speccing the ultimate settings at 4K to a current-gen GPU doesn't sound too crazy to me. I am glad people get to utilize their new toys. Even if it does seemingly come out of left field, I am sure previous-gen GPUs will be doing just fine as well; if you have to dial down a setting or two from "Ultra" to "Very High", you'll have a hard time finding differences anyway.
I remember how an insanely great card like a 1080 Ti couldn't run Assassin's Creed Origins at 4K 60, yet Gen 8 consoles did it at 30 fps. The 1080 Ti ran circles around the Jaguar APUs and is still an amazing card to this day.
This round "the gamers" have already been conditioned with the latest unoptimized/DRM riddled releases like Jedi Survivor and Hogwarts, which are technically and visually worse than games like RDR2 (not mention far less processes happening in the world) - yet are far more demanding.
Somehow companies suddenly needed massive resources for DRM and forgot how to make games look great without RT.