But RTX 2060 Super is? Why even comment brah.
Stop spamming this thread with nonsense.
What's odd about that is that Indiana Jones TGC lists Intel Arc A580, and DA is by all accounts using the same engine. Maybe it'll get listed in the future. It'll be interesting to see if it does. Hopefully.
Not quite the same engine. This is IdTech 8, and the first game we've seen using it. Indy is on a branch of IdTech 7 called Motor. But yeah, I'd honestly be surprised if we don't eventually see it mentioned. And if nothing else, once benchmarks emerge, I'm sure someone will put it through its paces on an Intel card and see where things stand.
You don't even understand what's going on do you? Arc B580 outperforms the RTX 4060 in most cases, so saying DA can run on RTX 2060 Super because it's more powerful but not on Arc B580 is total nonsense. You just commented on something you know nothing about. But thanks for all the jester awards, they are nice.
It's possible they just haven't tested them internally enough to include them in the specs, or don't yet feel comfortable with the game's XeSS implementation, or something like that. (Which is a point worth considering: we don't know whether the configurations in the official specs assume native rendering or upscaling. If it's the latter, they might be trying to nail XeSS in the game before listing Intel cards as supported.)
Who knows. But yeah, on paper there's no reason in principle they shouldn't be able to. As I said before, hopefully benchmarks will confirm that before release. Ahead of Eternal and 2016 we got some really nice benchmark suites from various outlets, so hopefully that'll be true this time around too. People usually love benchmarking new IdTech branch games extensively because they're such good scaling showcases, so I expect we will lol.
Thanks to Nvidia, GPU drivers have for a very long time been full of game-specific hacks, optimizations, and workarounds for common mistakes and less-than-optimal solutions left in by lazy game devs.
Because of this, even when all three brands support the same API versions, a game can contain code that technically triggers undefined behavior or just does something very slowly. Nvidia and AMD mostly paper over these cases in their drivers; Intel still has to catch up.
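The per-game handling described above can be pictured as the driver keeping an application-profile table and switching workarounds on when it recognizes an executable. A toy sketch of that idea (all names and flags here are hypothetical, for illustration only; real drivers ship binary app-detection logic, not anything like this):

```python
# Hypothetical illustration of per-application driver profiles.
# Map of detected executable name -> workaround flags the driver enables.
APP_PROFILES = {
    "game_a.exe": {"ignore_redundant_state_changes": True},
    "game_b.exe": {"patch_out_of_bounds_reads": True,
                   "replace_slow_shader": True},
}

def workarounds_for(exe_name: str) -> dict:
    """Return the workaround flags for a recognized executable.

    Unknown titles get no special handling -- which is roughly the
    position a newer driver vendor is in for the existing back catalog.
    """
    return APP_PROFILES.get(exe_name, {})
```

For example, `workarounds_for("game_b.exe")` would enable two fixes, while an unrecognized title gets an empty profile and has to survive on correct API usage alone.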