Monster Hunter Wilds

DLSS 4
I heard you can now force games to use DLSS 4. Does it give better performance/image quality? Is it just a matter of replacing the DLLs in the game folder like usual to get it to work?
Gaidax Mar 21 @ 4:39pm 
You can either do it in the Nvidia app, or DLSS Swapper, or Nvidia Profile Inspector.

Choose your poison.
DLSS Swapper is as easy as a couple mouse clicks. For sure recommend it over the other options.

Using swapper, the version you want to swap in is 310.2.1

Note that DLSS 4 looks WAY better but also gives less of a performance boost, so you usually want to drop quality down to something like Balanced. Even when lowering the quality it still looks better, because DLSS 4 is so much better visually.
Last edited by Honorable_D; Mar 21 @ 4:49pm
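For reference, the manual version of what DLSS Swapper automates looks roughly like this. The Steam install path, game folder name, and download location below are assumptions, so adjust them for your setup:

```python
# Minimal sketch of a manual DLSS DLL swap (what DLSS Swapper automates).
# GAME_DIR and NEW_DLL are assumptions -- point them at your own install
# and at wherever you extracted the 310.2.1 nvngx_dlss.dll.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Program Files (x86)\Steam\steamapps\common\MonsterHunterWilds")
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss_310.2.1\nvngx_dlss.dll")

target = GAME_DIR / "nvngx_dlss.dll"
backup = target.parent / (target.name + ".bak")

if not backup.exists():        # keep one pristine copy of the shipped DLL
    shutil.copy2(target, backup)
shutil.copy2(NEW_DLL, target)  # drop in the newer DLL
print(f"Swapped {target.name}; original backed up to {backup.name}")
```

The .bak copy means the shipped DLL can always be restored if the game objects to the swap.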
Originally posted by Honorable_D:
DLSS Swapper is as easy as a couple mouse clicks. For sure recommend it over the other options.

Using swapper, the version you want to swap in is 310.2.1

Note that DLSS 4 looks WAY better but also gives less of a performance boost, so you usually want to drop quality down to something like Balanced. Even when lowering the quality it still looks better, because DLSS 4 is so much better visually.

Got it. So if I wanted to use the Quality preset at 4K, I would actually be getting worse performance? How much worse?
Is using the latest frame gen .dll also advised?
Gaidax Mar 21 @ 4:57pm 
Originally posted by ΜΣ†ΛĿ:
Got it. So if I wanted to use the Quality preset at 4K, I would actually be getting worse performance? How much worse?

You don't get worse performance; you simply don't get as much of a performance increase as you would with DLSS 3 at a similar quality level.

DLSS 4 is better on visuals, but at the cost of a smaller performance increase at the same quality level.
Honorable_D Mar 21 @ 4:57pm 
From what I've read, DLSS 4 without frame gen is about a 10% frame loss compared to DLSS 3 versions.

WITH frame gen, meaning you need a 4000-series card or higher, it becomes about 5% better FPS. So the key here seems to be enabling frame gen. I have a 3090, so I can't use Nvidia's frame gen and instead use Lossless Scaling's frame gen.
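To put rough numbers on those percentages, here is the arithmetic against a hypothetical 100 FPS DLSS 3 Quality baseline (illustrative only, not measured):

```python
# Rough arithmetic on the figures quoted above (illustrative, not measured).
dlss3_fps = 100.0                       # hypothetical DLSS 3 Quality baseline

dlss4_no_fg = dlss3_fps * (1 - 0.10)    # ~10% frame loss without frame gen
dlss4_with_fg = dlss3_fps * (1 + 0.05)  # ~5% gain once frame gen is on

print(f"DLSS 4, no FG:   {dlss4_no_fg:.0f} FPS")    # -> 90 FPS
print(f"DLSS 4, with FG: {dlss4_with_fg:.0f} FPS")  # -> 105 FPS
```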
Thank you. I have a 4090, so I'm expecting 5% better performance and a better image.
Jewce86 Mar 21 @ 5:25pm 
See, I don't get this: why should a 4090 user have to use ANY DLSS to get good performance at NATIVE resolution? It blows my mind, because I swear DLSS was first pitched as a way for older cards to render 4K at a decent clip. Now it's the crutch that all Nvidia cards use to get any performance. AI the world, I guess. I'm not at all angry at the OP, I just don't get this dogwater optimisation.
Gaidax Mar 21 @ 5:43pm 
Originally posted by Jewce86:
See, I don't get this: why should a 4090 user have to use ANY DLSS to get good performance at NATIVE resolution? It blows my mind, because I swear DLSS was first pitched as a way for older cards to render 4K at a decent clip. Now it's the crutch that all Nvidia cards use to get any performance. AI the world, I guess. I'm not at all angry at the OP, I just don't get this dogwater optimisation.
Because the games are very detailed and complex nowadays, no matter what various morons may be crying about.

Yes, optimization can be better; it can always be better.

But full-blown multi-ray ray tracing, 60 gigs of high-definition textures, high-quality materials, and environmental physics effects across massive environments with long draw distances are not a freebie that's easy to run.

People forget that even a decade ago, simple ray tracing in a real-time game was science fiction, and now you just flick it on and get it at over 60 FPS.
Originally posted by Jewce86:
See, I don't get this: why should a 4090 user have to use ANY DLSS to get good performance at NATIVE resolution? It blows my mind, because I swear DLSS was first pitched as a way for older cards to render 4K at a decent clip. Now it's the crutch that all Nvidia cards use to get any performance. AI the world, I guess. I'm not at all angry at the OP, I just don't get this dogwater optimisation.
I could play without DLSS at native 4K, but I would get drops into the 50s in foliage-heavy areas. That's with settings maxed out and RT on high, without FG; with FG I would get 100-130 FPS at native 4K.
With DLSS Quality (1440p internal render) I stay above 60 FPS at all times and average around 80 FPS without FG; with FG I could get 130-180 FPS. My CPU is outdated, and with frame generation turned on it puts more stress on the CPU and I get more frametime stutters. IMO the game plays smoother on my PC without frame generation because I need all the CPU headroom I can get for DirectStorage; turning FG off largely eliminates the DirectStorage streaming stutter and frametime spikes, at least on my PC with an outdated CPU. So DLSS lets me have decent performance without frame generation.

This is my result at 4K, DLSS quality, maxed settings, RT, no FG
https://steamcommunity.com/sharedfiles/filedetails/?id=3448031418

This is my result at 4K Native, maxed settings, RT, no FG
https://steamcommunity.com/sharedfiles/filedetails/?id=3448061017
Last edited by ΜΣ†ΛĿ; Mar 21 @ 6:17pm
Lyote Mar 21 @ 6:00pm 
Originally posted by Jewce86:
See, I don't get this: why should a 4090 user have to use ANY DLSS to get good performance at NATIVE resolution? It blows my mind, because I swear DLSS was first pitched as a way for older cards to render 4K at a decent clip. Now it's the crutch that all Nvidia cards use to get any performance. AI the world, I guess. I'm not at all angry at the OP, I just don't get this dogwater optimisation.

You are correct, a 4090 should be able to do native 4K at 60+ FPS.
Hell, my 9070 XT can almost manage it (hovers between 50 and 60 depending on the locale), so a 4090 definitely should.

The only reason I can think of to want upscaling on a 4090 at 4K would be if he had a 144Hz 4K monitor or something and wanted to push the framerate closer to the refresh rate.

DLSS and Frame Gen are absolutely being used as crutches by devs though.
IIRC, Remnant II left all the optimisation to the baked-in UE5 stuff and DLSS.
And it ran worse than Wilds presently does XD

Figures that a useful tool for low-end PCs to get better image quality ended up abused as a crutch to avoid making games run well at native resolutions.
Got it working. Honestly, I don't see any quality difference; maybe it can only be seen at lower DLSS quality settings? I ran the benchmarks too and did get slightly worse performance compared to DLSS 3, but only 2-3 FPS less. Possibly DLSS Quality looks a bit closer to 4K native.

Which settings would you guys play at?
Base settings: Ultra, RT high, 4K resolution.
A: DLSS Quality, FG OFF - average 80 FPS with dips to the low 60s (locking to 60 FPS is possible)
B: DLSS Quality, FG ON - average 130-160 FPS with dips into the 100s
C: DLAA, FG ON - average 100-130 FPS with dips into the 80s

All these options look the same to me graphics-quality-wise. Also, I'm playing on a 240Hz OLED.
Last edited by ΜΣ†ΛĿ; Mar 21 @ 9:22pm
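Since all three options look the same visually, converting the quoted averages and dips into frametimes is one way to compare them against the 240Hz panel's ~4.2 ms refresh budget. A minimal sketch using the numbers from the post above (the B and C averages are taken as range midpoints):

```python
# Convert the quoted FPS figures into frametimes for comparison.
# Averages for B and C are midpoints of the quoted ranges; a 240 Hz
# panel refreshes every 1000/240 ~= 4.17 ms.
options = {
    "A: DLSS Quality, FG off": (80, 60),    # (average FPS, worst dip)
    "B: DLSS Quality, FG on":  (145, 100),
    "C: DLAA, FG on":          (115, 80),
}

for name, (avg, dip) in options.items():
    print(f"{name}: {1000/avg:.1f} ms avg, {1000/dip:.1f} ms at worst dip")
```

On a VRR OLED all three should sit comfortably inside the variable-refresh window, so the practical trade-offs are frame-gen latency and the extra CPU load rather than visible smoothness.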
Originally posted by ΜΣ†ΛĿ:
DLSS 4
I heard you can now force games to use DLSS 4. Does it give better performance/image quality? Is it just a matter of replacing the DLLs in the game folder like usual to get it to work?

Sort of.

Originally posted by ΜΣ†ΛĿ:
Got it working. Honestly, I don't see any quality difference; maybe it can only be seen at lower DLSS quality settings? I ran the benchmarks too and did get slightly worse performance compared to DLSS 3, but only 2-3 FPS less. Possibly DLSS Quality looks a bit closer to 4K native.

Which settings would you guys play at?
Base settings: Ultra, RT high, 4K resolution.
A: DLSS Quality, FG OFF - average 80 FPS with dips to the low 60s (locking to 60 FPS is possible)
B: DLSS Quality, FG ON - average 130-160 FPS with dips into the 100s
C: DLAA, FG ON - average 100-130 FPS with dips into the 80s

All these options look the same to me graphics-quality-wise. Also, I'm playing on a 240Hz OLED.

It really depends on the settings and your resolution. Personally, I think it's overhyped for most players; most don't even understand how it works, so it doesn't matter much.

Don't forget to delete the shader cache before testing, otherwise your results might be a tad off.
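Clearing the NVIDIA driver shader caches by hand looks roughly like this. The paths are the usual Windows locations, but treat them as assumptions and verify them on your own system (the game may also keep its own shader cache in its install folder):

```python
# Clear the NVIDIA driver shader caches before re-running benchmarks.
# These are the usual Windows cache locations; verify on your own system.
import os
import shutil
from pathlib import Path

local = Path(os.environ["LOCALAPPDATA"])
cache_dirs = [
    local / "NVIDIA" / "DXCache",   # DirectX shader cache
    local / "NVIDIA" / "GLCache",   # OpenGL/Vulkan shader cache
    local / "D3DSCache",            # Windows D3D shader cache
]

for d in cache_dirs:
    if d.exists():
        shutil.rmtree(d, ignore_errors=True)
        print(f"Cleared {d}")
```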
Foe Mar 21 @ 11:29pm 
For me it changed nothing, but it crashed my PC for the first time ever.
TuskSpin Mar 22 @ 12:24am 
Originally posted by Jewce86:
See, I don't get this: why should a 4090 user have to use ANY DLSS to get good performance at NATIVE resolution? It blows my mind, because I swear DLSS was first pitched as a way for older cards to render 4K at a decent clip. Now it's the crutch that all Nvidia cards use to get any performance. AI the world, I guess. I'm not at all angry at the OP, I just don't get this dogwater optimisation.
Even the consoles use upscaling in this game (and from 720p, no less). Devs simply refuse to optimize.
TuskSpin Mar 22 @ 12:27am 
Updated through the Nvidia app, and it's definitely a noticeable difference using DLAA at 1440p, though there's a slightly bigger performance hit on 40-series GPUs, and it's absolutely terrible on anything prior.
