ELDEN RING
What GPU should I get for 4k120fps in Elden Ring?
I played a fair bit of Elden Ring at launch on a 3050 Ti laptop, but by the time Shadow of the Erdtree came out, I decided I didn't want to suffer through more 1080p60fps with frame drops, so I got a 4K monitor with the intent of buying a 50-series card when they came out.

Now that the 50-series has been revealed... whoof. Tons of AI buzzwords, but pure raster performance seems to be only a 20%-30% gen-on-gen uplift. However, Elden Ring, one of the games I've been most eager to play on a new GPU, relies almost purely on raster performance. If Elden Ring is what I consider a "graphical powerhouse" rather than a more typical choice like Alan Wake 2, what sort of GPU should I be looking at if all I want is the best possible raster performance? Should I get a 50-series regardless, an older gen, or even go AMD?
Showing 1-7 of 7 comments
tfa Jan 7 @ 6:35pm 
No GPU can do 4K120 with ray tracing. A 5080 should be able to do it without ray tracing. You'll need a good CPU as well.
Last edited by tfa; Jan 7 @ 6:36pm
Going without ray tracing is probably my decision, since in my experience ray tracing is only truly impressive in a handful of games. Crystal-clear 4K is far preferable. Is 4K60fps a more realistic target?
An Irate Walrus (Banned) Jan 7 @ 7:10pm 
The game is restricted to 60fps without mods in the first place. You can unlock the framerate with mods, but if you're going vanilla, 4k60 is pretty much it.

Also remember that a new card doesn't always mean significant performance jumps. The 50 series of Nvidia cards look beastly, but at those price points, they damn well better be.
Originally posted by An Irate Walrus:
The game is restricted to 60fps without mods in the first place. You can unlock the framerate with mods, but if you're going vanilla, 4k60 is pretty much it.

Also remember that a new card doesn't always mean significant performance jumps. The 50 series of Nvidia cards look beastly, but at those price points, they damn well better be.

That's why I've been concerned: it seems that the 50-series cards are going all-in on "AI" and offer only a marginal improvement in pure raster performance. As someone who mostly plays old games such as Team Fortress 2, non-raytraced games such as Elden Ring, "AA" games such as Deep Rock Galactic, and strategy games such as Total War: Pharaoh, I just want to play everything at 4K60fps (or 120fps, if I can get away with it) without any muss or fuss.
tfa Jan 7 @ 7:31pm 
Originally posted by Crossbreed Priscilla:
Originally posted by An Irate Walrus:
The game is restricted to 60fps without mods in the first place. You can unlock the framerate with mods, but if you're going vanilla, 4k60 is pretty much it.

Also remember that a new card doesn't always mean significant performance jumps. The 50 series of Nvidia cards look beastly, but at those price points, they damn well better be.

That's why I've been concerned: it seems that the 50-series cards are going all-in on "AI" and offer only a marginal improvement in pure raster performance. As someone who mostly plays old games such as Team Fortress 2, non-raytraced games such as Elden Ring, "AA" games such as Deep Rock Galactic, and strategy games such as Total War: Pharaoh, I just want to play everything at 4K60fps (or 120fps, if I can get away with it) without any muss or fuss.
The 4080 can do about 100+ fps, which is almost indistinguishable from 120 if you have a VRR display. The 5080 appears to be about 20% faster than the 4080. Although I would worry about 16 GB of VRAM long term; it may not be enough for 4K max settings in a few years. Plenty of performance videos with unlocked fps on YouTube.
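A quick sanity check on those numbers (a rough sketch; the ~100 fps baseline and ~20% uplift are the estimates from the post above, not measurements):

```python
# Back-of-the-envelope math using the figures quoted above
# (estimates, not benchmarks).
baseline_fps = 100          # rough 4080 figure at 4K with the framerate unlocked
gen_uplift = 0.20           # claimed gen-on-gen raster improvement for the 5080

projected_fps = baseline_fps * (1 + gen_uplift)
print(f"Projected 5080: ~{projected_fps:.0f} fps")  # ~120 fps, right at the target
```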
Originally posted by Crossbreed Priscilla:
That's why I've been concerned: it seems that the 50-series cards are going all-in on "AI" and offer only a marginal improvement in pure raster performance.
Most AI stuff you can run locally doesn't even use the dedicated AI hardware, just the regular ol' CUDA cores. The slides shown at CES are also pure BS, to the point where I feel like it must surely violate a law. Like, they say the 50-series is 2x faster with AI by comparing two different AI image-gen models, with the heavier version running on the slower 40-series card. I'm impressed at just how stupid they think people are. But then again, people will probably still buy + CONSUME.

But to give you an idea of just how shameless their comparisons are: it's like comparing a game running at 1080p on a 50-series card to the same game running at 1440p on its 40-series equivalent and saying "Look how much more FPS the 50-series gets!". It's even more blatant than what they already do, comparing native res and native framerate against upscaled + framegen.
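To put numbers on that hypothetical comparison (illustrative only; the resolutions are the ones from the example above):

```python
# Pixel counts for the hypothetical cross-resolution comparison above.
px_1080p = 1920 * 1080   # ~2.07 million pixels
px_1440p = 2560 * 1440   # ~3.69 million pixels

workload_ratio = px_1440p / px_1080p
print(f"1440p pushes {workload_ratio:.2f}x the pixels of 1080p")
# => ~1.78x, so the newer card starts with ~44% less work per frame
# before any genuine hardware improvement is measured at all.
```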

It's a downright grift, with the AI buzzword used as an upsell. Just try to get a used higher-end 40XX-series card or the AMD equivalent, if possible.

The game's locked at 60 FPS, and 4K60 should be perfectly possible on something like a 4070 or a 6800 XT from AMD. You'll more likely run into memory or CPU bottlenecks, because ER just isn't that well optimized in that regard.
Unless you want to use RT; then you'd definitely get GPU-limited at 4K. RT is only used for shadows and AO, haphazardly tacked on so that Bamco can use the "ray tracing" buzzword. Shadows sure look nicer, but it's just not worth the performance hit imo. There's a reason why games with RT also come with features like FSR or DLSS: it's just that computationally intensive to run in real time.
Last edited by JellyPuff; Jan 7 @ 9:14pm
And that's the crux of it: I don't care for the "AI" stuff nearly as much as I thought I would, because I realized that the majority of games I play have light or no implementations of ray tracing, while a lot of "next-gen" games just use upscaling and frame gen to cover up poor optimization. Having perfectly realistic lighting at the cost of half my performance is not as important as being able to see the rust on the khopeshes of my little Egyptian dudes in Total War: Pharaoh. For me, 4K60fps with maxed conventional graphics and no ray tracing or upscaling is ideal.

Funnily enough, I actually use upscaling far more when playing old games than when playing new ones. An app called Lossless Scaling allows upscaling and frame gen to be applied to almost any game, which is perfect for older games like Resident Evil Revelations that do not natively support 4K. Probably my biggest bugbear these days is older games that "support" 4K but do not have proper UI scaling, like Total War: Shogun 2, forcing me to use upscaling as a workaround.
https://store.steampowered.com/app/993090/Lossless_Scaling/
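As a rough illustration of why 1080p-era games map so cleanly onto a 4K panel (generic scaling math, not how Lossless Scaling actually works internally):

```python
# Generic integer-scaling check; shows why 1080p sources upscale cleanly
# to 4K while 1440p sources need fractional resampling.
def scale_factor(src, dst):
    sw, sh = src
    dw, dh = dst
    fx, fy = dw / sw, dh / sh
    clean = fx == fy and fx.is_integer()
    return fx, clean

for src in [(1920, 1080), (2560, 1440)]:
    fx, clean = scale_factor(src, (3840, 2160))
    note = "clean integer scale" if clean else "fractional, needs resampling"
    print(f"{src} -> 4K: {fx:g}x ({note})")
# (1920, 1080) -> 4K: 2x   (clean integer scale)
# (2560, 1440) -> 4K: 1.5x (fractional, needs resampling)
```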
Last edited by Dark Sun Gwyndolin; Jan 8 @ 5:27am

Date Posted: Jan 7 @ 6:25pm
Posts: 7