DOOM: The Dark Ages

I have an RX 6650 XT, meaning that I am capable of running the game... (OUTDATED. REPLACED THE GPU WITH AN RTX 3060)
...but why on God's green earth are you forcing raytracing again? We've been telling you this since Indiana Jones dropped its system requirements.
Why is Id Tech forcing so much raytracing and over 100 gigs of storage? I meet the criteria, since I built my PC last year, but when a company tells me that 8 gigs is only good for 1080p low and anything above that is probably a no-go, there's a problem.
Update before someone locks the discussion for being an old one: I hit the green screen of death with my RX 6650 XT and got myself an RTX 3060 12 GB with the money I got back.
Last edited by Schorl; Apr 28 @ 2:14pm
KRON Jan 23 @ 12:00pm 
RTX GPUs came out in 2018. That's over 6 years ago. Just upgrade already.
Schorl Jan 23 @ 12:03pm 
Originally posted by KRON:
RTX GPUs came out in 2018. That's over 6 years ago. Just upgrade already.
Tell me you can't read without telling me you can't read. RX 6000 series came out in late 2020. Just over 4 years ago. You tell me that a 4 year old GPU being the minimum to run the game is okay? What is wrong with you?
Last edited by Schorl; Jan 23 @ 12:06pm
Thiran Jan 23 @ 12:05pm 
Originally posted by KRON:
RTX GPUs came out in 2018. That's over 6 years ago. Just upgrade already.
His GPU can technically run the game since it does support RT, but since it's an entry-level card (except in price) it's probably going to run very badly due to the enforced RT with no option to turn it off. Considering that nothing in the footage released today looked like anything that couldn't be achieved with baked lighting, it's understandable people are getting annoyed by it. It's a gimmick that came 10 years too early into the "mainstream".
Last edited by Thiran; Jan 23 @ 12:06pm
Centuryt Jan 23 @ 12:05pm 
Originally posted by KRON:
RTX GPUs came out in 2018. That's over 6 years ago. Just upgrade already.
He has upgraded, just not to an Nvidia card.
The 6650 XT is a good card; it just doesn't perform as well as it should, because every lazy dev thinks RT is easier to implement than baked lights.
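Since the baked-vs-RT point keeps coming up, here's a rough sketch of the difference (purely illustrative Python with made-up names, not code from idTech or any real engine): baked lighting is solved offline and sampled from a stored table or texture at runtime, while forced ray tracing re-checks light visibility against the scene every single frame.

```python
# Purely illustrative: made-up functions, not any real engine API.
from dataclasses import dataclass

@dataclass
class Light:
    position: tuple
    intensity: float

def shade_baked(lightmap: dict, texel: tuple) -> float:
    # Offline tools already computed the lighting; runtime cost is a lookup.
    return lightmap.get(texel, 0.0)

def shade_ray_traced(point: tuple, lights: list, occluded) -> float:
    # Per-frame cost: one shadow-ray visibility test per light,
    # which is the work dedicated RT hardware is meant to accelerate.
    total = 0.0
    for light in lights:
        if not occluded(point, light.position):
            total += light.intensity
    return total

# Same lighting result, very different runtime cost profile.
lightmap = {(4, 7): 0.8}
lights = [Light(position=(0.0, 5.0, 0.0), intensity=0.8)]
print(shade_baked(lightmap, (4, 7)))                                  # table lookup
print(shade_ray_traced((1.0, 0.0, 1.0), lights, lambda p, q: False))  # traced every frame
```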
Centuryt Jan 23 @ 12:07pm 
Originally posted by Schorl:
Tell me you can't read without telling me you can't read. RX 6000 series came out in late 2020. Just over 4 years ago. You tell me that a 4 year old GPU being the minimum to run the game is okay? What is wrong with you?
These are the people who turn on PT and think "wow, shiny lights, beautiful, I don't care that my fps is 1/8th of what it should be," then turn on DLSS to get 1/4th of the fps they should be getting.
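Taking those fractions at face value (they are the poster's figures, not benchmarks), the arithmetic looks like this, using a hypothetical 120 fps raster baseline:

```python
# Worked arithmetic from the fractions quoted above; the 120 fps baseline is hypothetical.
raster_fps = 120
pt_fps = raster_fps / 8        # "1/8th of what it should be"   -> 15 fps
pt_dlss_fps = raster_fps / 4   # "1/4th the fps they should get" -> 30 fps
print(pt_fps, pt_dlss_fps)     # 15.0 30.0
```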
KRON Jan 23 @ 12:08pm 
Originally posted by Schorl:
You tell me that a 4 year old GPU being the minimum to run the game is okay? What is wrong with you?
Yes, I do. Because I remember times when even 4 year old GPUs may not have been supported at all by games.
Schorl Jan 23 @ 12:16pm 
Originally posted by KRON:
Originally posted by Schorl:
You tell me that a 4 year old GPU being the minimum to run the game is okay? What is wrong with you?
Yes, I do. Because I remember times when even 4 year old GPUs may not have been supported at all by games.
Yeah, when 3D games were barely a thing. Times have changed, and graphics haven't evolved that much compared to last gen. Doom Eternal was cross-gen, let me remind you, and it required 4 gigs minimum. Why should the jump be this big when it's only been 5 years since the last game?
KRON Jan 23 @ 12:23pm 
Originally posted by Schorl:
Originally posted by KRON:
Yes, I do. Because I remember times when even 4 year old GPUs may not have been supported at all by games.
Yeah, when 3D games were barely a thing. Times have changed, and graphics haven't evolved that much compared to last gen. Doom Eternal was cross-gen, let me remind you, and it required 4 gigs minimum. Why should the jump be this big when it's only been 5 years since the last game?
Crysis 3, released in 2013, didn't support the GTX 200 series (released in 2008-2009) at all, due to being DX11-only. I wasn't talking about the dark ages here.
As for the jump, that's specifically because Eternal was a cross-gen game and The Dark Ages is not.
Last edited by KRON; Jan 23 @ 12:28pm
Why are you shocked that game graphics are advancing? Ray-tracing was bound to become the default considering the time it saves developers.

Stop being a stick in the mud and take out a loan for a new graphics card already /s
Schorl Jan 23 @ 12:27pm 
Also, one more thing: 8 cores and 16 threads minimum? And it HAS to be a Ryzen 7? Someone must be really parading around in a clown nose to think that a third-generation Ryzen 5 shouldn't cut it. I have a Ryzen 5 4600G, and combined with my GPU I get over 100 FPS in Doom Eternal at 1080p max. There's no justification for why my build shouldn't cut it for The Dark Ages. None!
Last edited by Schorl; Feb 9 @ 4:52am
Doom 3 could barely hit 40+ fps at UXGA resolution on Ultra with uncompressed textures, on top-of-the-line, bleeding-edge hardware at launch. While the ability to rely solely on raster and disable RT would be welcome, I will happily take a game of this fidelity and scope being able to handle 1080p 60Hz on 6-year-old, now below-entry-level, sub-270 USD GPUs, and 1440p 60Hz at High presets on 5-year-old, now considered mid-range, 300 USD GPUs.

Also, while we won't know for a while obviously, and this is a new IdTech iteration... it's still a modern IdTech engine branch. I will be surprised if these specs aren't at least somewhat of an overestimate. So far since IdTech6 these games have run shockingly well on potatoes. We'll see if that's true this time around of course.
Last edited by Defective Dopamine Pez Dispenser; Jan 23 @ 12:33pm
Schorl Jan 23 @ 12:32pm 
Originally posted by KRON:
Originally posted by Schorl:
Yeah, when 3D games were barely a thing. Times have changed, and graphics haven't evolved that much compared to last gen. Doom Eternal was cross-gen, let me remind you, and it required 4 gigs minimum. Why should the jump be this big when it's only been 5 years since the last game?
Crysis 3, released in 2013, didn't support the GTX 200 series (released in 2008-2009) at all, due to being DX11-only. I wasn't talking about the dark ages here.
Crysis was the most unoptimised series of its time though. Everyone knows that. And I like these games.
KRON Jan 23 @ 12:34pm 
Originally posted by Schorl:
Originally posted by KRON:
Crysis 3, released in 2013, didn't support the GTX 200 series (released in 2008-2009) at all, due to being DX11-only. I wasn't talking about the dark ages here.
Crysis was the most unoptimised series of its time though. Everyone knows that. And I like these games.
They were the most graphically advanced. That doesn't mean they were unoptimized. At least 2 and 3 weren't. The first game had CPU usage issues, yes.
Schorl Jan 23 @ 12:36pm 
Originally posted by Defective Dopamine Pez Dispenser:
Doom 3 could barely hit 40+ fps at UXGA resolution on Ultra with uncompressed textures, on top-of-the-line, bleeding-edge hardware at launch. While the ability to rely solely on raster and disable RT would be welcome, I will happily take a game of this fidelity and scope being able to handle 1080p 60Hz on 6-year-old, now below-entry-level, sub-270 USD GPUs.

Also, while we won't know for a while obviously, and this is a new IdTech iteration... it's still a modern IdTech engine branch. I will be surprised if these specs aren't at least somewhat of an overestimate. So far since IdTech6 these games have run shockingly well on potatoes. We'll see if that's true this time around of course.
I've seen Indiana Jones run decently in benchmarks with my current build, but it drops to 40. And with The Dark Ages being faster paced, dropping from 60 to 40 is gonna be a mess to get through.
Originally posted by Schorl:
Originally posted by Defective Dopamine Pez Dispenser:
Doom 3 could barely hit 40+ fps at UXGA resolution on Ultra with uncompressed textures, on top-of-the-line, bleeding-edge hardware at launch. While the ability to rely solely on raster and disable RT would be welcome, I will happily take a game of this fidelity and scope being able to handle 1080p 60Hz on 6-year-old, now below-entry-level, sub-270 USD GPUs.

Also, while we won't know for a while obviously, and this is a new IdTech iteration... it's still a modern IdTech engine branch. I will be surprised if these specs aren't at least somewhat of an overestimate. So far since IdTech6 these games have run shockingly well on potatoes. We'll see if that's true this time around of course.
I've seen Indiana Jones run decently in benchmarks with my current build, but it drops to 40. And with The Dark Ages being faster paced, dropping from 60 to 40 is gonna be a mess to get through.

I don't think we can make apples-to-apples comparisons between the two games. Different devs, different engine (yes, the IdTech7-derivative Motor engine in Indy's case, but still), being tasked with doing very different things. As I said though, we'll have to see what performance actually looks like.

You might intuitively think, "Well, more is going on in this than that, so it'll obviously be heavier," but that's not always how it works. We have to wait and see benches. I also - with all due respect to Machine Games - have a little more faith in Id than them in this regard based on track record. But we'll see. If it's abysmal obviously that's not a good thing, but the specs in and of themselves are reasonable imo.
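On the 60-to-40 drop mentioned a few posts up: in frame-time terms that's the gap between roughly 16.7 ms and 25 ms per frame, which is a rough sketch of why it feels so much worse in a fast-paced game.

```python
# Frame-time arithmetic behind the 60 -> 40 fps drop discussed above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))                       # 16.7 ms per frame
print(round(frame_time_ms(40), 1))                       # 25.0 ms per frame
print(round(frame_time_ms(40) - frame_time_ms(60), 1))   # 8.3 ms longer every frame
```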