DOOM: The Dark Ages

No mention of Intel® Arc in the GPU requirements?
Kind of odd it isn't listed, when most new AAA games do now, like Stalker 2 and the just-released Assassin's Creed. You guys think it's cause for concern about support?
Showing 1-13 of 13 comments
󠀡󠀡 Jan 27 @ 6:57pm 
It's not powerful enough to run this game.
Zapmaster Jan 27 @ 7:02pm 
Originally posted by 󠀡󠀡:
It's not powerful enough to run this game.

But RTX 2060 Super is? Why even comment brah.
Last edited by Zapmaster; Jan 27 @ 7:13pm
There's no reason in principle they couldn't run it. It may simply be that they haven't tested the cards internally enough, or don't have driver support they're confident enough in, to recommend them at the performance levels their officially listed use cases target.
󠀡󠀡 Jan 27 @ 7:13pm 
Originally posted by Zapmaster:
Originally posted by 󠀡󠀡:
It's not powerful enough to run this game.

But RTX 2060 Super is? Why even comment brah. XD
yup
Zapmaster Jan 27 @ 7:15pm 
Originally posted by 󠀡󠀡:
Originally posted by Zapmaster:

But RTX 2060 Super is? Why even comment brah. XD
yup

Stop spamming this thread with nonsense.
Zapmaster Jan 27 @ 7:18pm 
Originally posted by Defective Dopamine Pez Dispenser:
There's no reason in principle they couldn't run it. It may simply be that they haven't tested the cards internally enough, or don't have driver support they're confident enough in, to recommend them at the performance levels their officially listed use cases target.

What's odd about that is that Indiana Jones TGC lists Intel Arc A580, and DA is by all accounts using the same engine. Maybe it'll get listed in the future. It'll be interesting to see if it does. Hopefully.
Last edited by Zapmaster; Jan 27 @ 7:19pm
Originally posted by Zapmaster:
Originally posted by Defective Dopamine Pez Dispenser:
There's no reason in principle they couldn't run it. It may simply be that they haven't tested the cards internally enough, or don't have driver support they're confident enough in, to recommend them at the performance levels their officially listed use cases target.

What's odd about that is that Indiana Jones TGC lists Intel Arc A580, and DA is by all accounts using the same engine. Maybe it'll get listed in the future. It'll be interesting to see if it does. Hopefully.

Not quite the same engine. This is id Tech 8, and the first game we've seen using it. Indy is on a branch of id Tech 7 called Motor. But yeah, I honestly would be surprised if we don't eventually see mention of it. And if nothing else, once benchmarks emerge, I'm sure someone will put it through its paces on an Intel card and see where they stand.
Bonus Jan 28 @ 11:55am 
Hopefully, if there are launch issues with the card, it doesn't take long to patch them.
󠀡󠀡 Jan 28 @ 5:30pm 
Originally posted by Zapmaster:
Originally posted by 󠀡󠀡:
yup

Stop spamming this thread with nonsense.
Your post is nonsense, but you're right, I will stop replying to your nonsense posts and replies. Even better, I'm blocking you, so I won't see your nonsense posts in the future.
Originally posted by 󠀡󠀡:
Originally posted by Zapmaster:

Stop spamming this thread with nonsense.
Your post is nonsense, but you're right, I will stop replying to your nonsense posts and replies. Even better, I'm blocking you, so I won't see your nonsense posts in the future.

You don't even understand what's going on, do you? The Arc B580 outperforms the RTX 4060 in most cases, so saying DA can run on an RTX 2060 Super because it's more powerful, but not on an Arc B580, is total nonsense. You just commented on something you know nothing about. But thanks for all the jester awards, they're nice.
Last edited by Zapmaster; Jan 28 @ 6:53pm
Yeah, again, there really is no reason in principle the Arc GPUs (above a certain tier, at least) shouldn't be capable of running the game, other than API or driver issues. How well is another story, but they have hardware ray-tracing acceleration and are reasonably comparable to the GPUs in the specs.

It's possible they just haven't tested them internally enough to include them in the specs, or don't feel comfortable with the game's XeSS implementation yet or something. (Which is a point worth considering. We don't know if the use cases in the official specs are native or use upscaling methods. If it's the latter, they might be trying to nail XeSS in the game before listing the cards as supported.)
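
[To put rough numbers on that native-vs-upscaled point: a minimal sketch of the arithmetic, assuming per-axis scale factors of about 1.5x Quality / 1.7x Balanced / 2.0x Performance for XeSS-style presets. The factors are illustrative assumptions, not official values.]

    #include <cstdio>

    int main() {
        // Listed target resolution (example: 1440p).
        const int outW = 2560, outH = 1440;
        // Per-axis scale factors below are assumptions for illustration.
        const struct { const char* name; double factor; } presets[] = {
            {"Quality", 1.5}, {"Balanced", 1.7}, {"Performance", 2.0},
        };
        for (const auto& p : presets) {
            const int w = static_cast<int>(outW / p.factor);
            const int h = static_cast<int>(outH / p.factor);
            // Shading cost tracks pixel count, so the gap vs. native is big.
            std::printf("%-11s -> %dx%d (%.0f%% of native pixels)\n", p.name,
                        w, h, 100.0 * w * h / (double(outW) * outH));
        }
    }

[At a Performance-style preset the game would shade only about a quarter of the native pixels, which is why it matters whether the official tiers assume upscaling.]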

Who knows. But yeah, no reason in principle they shouldn't be able to on paper. As I said before, hopefully benchmarks will confirm that before release. Before Eternal and 2016 we got some really nice benchmark suites from various outlets, so hopefully that'll be true this time around. People usually love benchmarking new id Tech branch games extensively because they're usually such good scaling showcases, so I expect we will lol.
Last edited by Defective Dopamine Pez Dispenser; Jan 28 @ 8:30pm
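
[Side note on the benchmarking point: average FPS alone hides stutter, which is why outlets also report 1% lows. A minimal sketch of that math in C++, assuming a hypothetical capture file with one frame time in milliseconds per line; the filename is made up for illustration.]

    #include <algorithm>
    #include <cstdio>
    #include <fstream>
    #include <vector>

    int main() {
        std::ifstream in("frametimes_ms.txt"); // hypothetical capture file
        std::vector<double> ms;
        for (double t; in >> t; ) ms.push_back(t);
        if (ms.empty()) return 1;

        double total = 0;
        for (double t : ms) total += t;
        const double avgFps = 1000.0 * ms.size() / total;

        // "1% low" FPS: the frame rate implied by the slowest ~1% of
        // frames, taken here as the 99th-percentile frame time.
        std::sort(ms.begin(), ms.end());
        const double p99 = ms[static_cast<size_t>(ms.size() * 0.99)];
        std::printf("avg: %.1f fps, 1%% low: %.1f fps\n", avgFps, 1000.0 / p99);
    }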
djcarey Jan 28 @ 9:32pm 
none
Krówka Jan 29 @ 2:31am 
It's all about driver support. Intel is in try-hard mode about improving their situation, but there are still problems, and it's just safer not to include their cards in the requirements.

Thanks to Nvidia, GPU drivers have for a very long time been full of game-specific hacks, optimizations, and workarounds for common mistakes and less-than-optimal solutions left by lazy game devs.

Because of this, even if all three brands implement the same API versions, there can be code in a game that should produce undefined results or that just runs very slowly. Nvidia and AMD mostly deal with it in their drivers; Intel still has to catch up.
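
[To make the "undefined results" point concrete, here's a sketch in C++/Vulkan of the synchronization the spec requires between rendering to an image and sampling it later. Skipping this barrier is undefined behavior per the spec, but drivers tuned for a specific game will often paper over it, which is exactly the per-game workaround situation described above. Illustrative fragment, not from any actual game.]

    #include <vulkan/vulkan.h>

    // Transition a color attachment so a later pass can sample it.
    // Omitting this is the classic "works on my driver" bug: validation
    // layers flag it, yet many drivers happen to render correctly anyway.
    void recordRenderToSampleBarrier(VkCommandBuffer cmd, VkImage image) {
        VkImageMemoryBarrier barrier{};
        barrier.sType = VK_STRUCTURE_TYPE_IMAGE_MEMORY_BARRIER;
        barrier.srcAccessMask = VK_ACCESS_COLOR_ATTACHMENT_WRITE_BIT;
        barrier.dstAccessMask = VK_ACCESS_SHADER_READ_BIT;
        barrier.oldLayout = VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL;
        barrier.newLayout = VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL;
        barrier.srcQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
        barrier.dstQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
        barrier.image = image;
        barrier.subresourceRange = {VK_IMAGE_ASPECT_COLOR_BIT, 0, 1, 0, 1};

        vkCmdPipelineBarrier(cmd,
            VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT, // producer stage
            VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT,         // consumer stage
            0, 0, nullptr, 0, nullptr, 1, &barrier);
    }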

Date Posted: Jan 27 @ 4:31pm
Posts: 13