Hogwarts Legacy

Zenmue Feb 17, 2023 @ 9:37am
3080 10Gb?
Without RT, everything on Ultra at 1440p with DLSS Quality, and DLSS is awesome. With RT it's only good on High settings with RT Medium and DLSS Quality; anything above that stutters. Will RT performance be improved, or is 10GB just not enough anymore? Is the 12GB 3080 the same?
Showing 1-15 of 42 comments
margalus Feb 17, 2023 @ 9:42am 
10GB isn't enough and RT is broken
Perverius Feb 17, 2023 @ 9:43am 
Originally posted by Zenmue:
Without RT, everything on Ultra at 1440p with DLSS Quality, and DLSS is awesome. With RT it's only good on High settings with RT Medium and DLSS Quality; anything above that stutters. Will RT performance be improved, or is 10GB just not enough anymore? Is the 12GB 3080 the same?

Don't have a 3080, but I can say that maxed out at 1440p the game uses around 12GB VRAM on my 4080, and I get only occasional stutter that is very likely CPU or engine related. DF did a video on this, and IIRC they said 10GB is not cutting it.
Dobke Feb 17, 2023 @ 9:44am 
It's well known that Unreal Engine handles ray tracing very poorly.
Zenmue Feb 17, 2023 @ 10:59am 
Yeah, but RT looks so good, and it's still not perfectly smooth on High settings with RT Medium. Without RT, Ultra is very crisp and fluid, but with my HDR monitor RT is just so nice. Some improvement would be good.
MugHug Feb 17, 2023 @ 11:07am 
Originally posted by Zenmue:
Yeah, but RT looks so good, and it's still not perfectly smooth on High settings with RT Medium. Without RT, Ultra is very crisp and fluid, but with my HDR monitor RT is just so nice. Some improvement would be good.

The RT in this game is not the best, really, and turning it off does not ruin the looks.

With only 10GB of VRAM, it does not bode well for your experience if you turn on RT. It looks like it cuts your FPS in half.

https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/
Lemiru Feb 17, 2023 @ 11:13am 
Yea, RT in this game is quite lacking. Even on Ultra, mirror reflections are incredibly blurry, and some light sources are not included in RT shadow generation.
Supgad Feb 17, 2023 @ 11:16am 
I can count on one hand how many games have implemented raytracing well. If raytracing performance is your litmus test for whether or not you'll enjoy a particular title, I assume you'll be disappointed with the overwhelming majority of titles.

Truly a technology that came before its time.
Tommysonic5 Feb 17, 2023 @ 11:20am 
Haven't checked post-patch, but my 10GB 3080 uses about 91% VRAM max at 4K@60 with DLSS Performance (1080p render) and texture settings maxed (RT on or off). I think 10GB should be enough unless you're going for 1440p or 4K native.
Performance for me is really bad after the last patch though; it would probably be best to wait for them to optimize the game.
Last edited by Tommysonic5; Feb 17, 2023 @ 11:21am
MugHug Feb 17, 2023 @ 11:24am 
Originally posted by Tommysonic5:
Haven't checked post-patch, but my 10GB 3080 uses about 91% VRAM max at 4K@60 with DLSS Performance (1080p render) and texture settings maxed (RT on or off). I think 10GB should be enough unless you're going for 1440p or 4K native.
Performance for me is really bad after the last patch though; it would probably be best to wait for them to optimize the game.

Seems to agree with you.
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/
Zenmue Feb 17, 2023 @ 11:27am 
Originally posted by Eiswolfin:
Originally posted by Zenmue:
Without RT, everything on Ultra at 1440p with DLSS Quality, and DLSS is awesome. With RT it's only good on High settings with RT Medium and DLSS Quality; anything above that stutters. Will RT performance be improved, or is 10GB just not enough anymore? Is the 12GB 3080 the same?

I have the 12GB version.
1440p
ultra settings
High settings on RT

I get no stutters
Using Afterburner, I did see the game use up to 10.8 GB of video memory.

Lucky you, maybe I'll buy a 4080 lol. My CPU and memory are enough: 9900K and 32GB. Some here said RT is broken and other things, but I think it straight up looks better, and I'm happy to trade off even Ultra's sharpness and fluidity. Just a tad bit of optimization please :D lol, 10GB of GPU memory, ahh.
Depends on what resolution you play at, and with all the bells and whistles 10GB isn't enough. DLSS sucks, as I prefer to play at native resolution, and the game needs more optimization. The game runs fine with my current settings at 3440x1440, but that doesn't mean it doesn't still need more optimization.

i9-9900KF
EVGA 2080 Super hybrid
32GB DDR4 trident OCed
Tommysonic5 Feb 17, 2023 @ 11:34am 
Originally posted by MugHug:
Originally posted by Tommysonic5:
Haven't checked post-patch, but my 10GB 3080 uses about 91% VRAM max at 4K@60 with DLSS Performance (1080p render) and texture settings maxed (RT on or off). I think 10GB should be enough unless you're going for 1440p or 4K native.
Performance for me is really bad after the last patch though; it would probably be best to wait for them to optimize the game.

Seems to agree with you.
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/
Oh interesting: 9838 MB for 1080p with RT off, and in their case 1080p with RT uses 14143 MB.
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html

I remember the German reviewer GameStar mentioned an early RT patch that was supposed to come around launch. I wonder if that is still being worked on.
Last edited by Tommysonic5; Feb 17, 2023 @ 11:34am
TimmyP Feb 17, 2023 @ 11:47am 
This is the fix for the 3070: Ultra settings / RT Ultra, DLSS Performance (for now), 1440p.

Just add this to the ini, and only this:

[SystemSettings]
r.Streaming.PoolSize=2048

[/script/engine.renderersettings]
r.Streaming.HLODStrategy=0

For other cards, adjust the pool size. These are the only lines from the other ini posts that do anything tangible.

*This'll probably work fine as-is on a 3080 10GB. Maybe set the pool size 512 higher.
Last edited by TimmyP; Feb 17, 2023 @ 11:49am
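For anyone unsure where those lines go: in a standard Unreal Engine 4 game they belong in the user-side Engine.ini. The path and the scaled pool size below are assumptions based on the usual UE4 layout and TimmyP's "512 higher" suggestion, not tested values.

```ini
; Assumed location (standard UE4 user config layout; not confirmed for this game):
;   %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini

[SystemSettings]
; Texture streaming pool in MB. 2048 was suggested for the 8GB 3070;
; a rough guess for a 10GB 3080 would be ~2560 (512 higher, as noted above).
r.Streaming.PoolSize=2560

[/script/engine.renderersettings]
; 0 disables the hierarchical LOD streaming strategy.
r.Streaming.HLODStrategy=0
```

Back up the original Engine.ini before editing, since the game may regenerate or ignore malformed entries.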
Kizuna Dragon Feb 17, 2023 @ 11:51am 
Originally posted by Perverius:
Originally posted by Zenmue:
Without RT, everything on Ultra at 1440p with DLSS Quality, and DLSS is awesome. With RT it's only good on High settings with RT Medium and DLSS Quality; anything above that stutters. Will RT performance be improved, or is 10GB just not enough anymore? Is the 12GB 3080 the same?

Don't have a 3080, but I can say that maxed out at 1440p the game uses around 12GB VRAM on my 4080, and I get only occasional stutter that is very likely CPU or engine related. DF did a video on this, and IIRC they said 10GB is not cutting it.

There is a difference between how much a game allocates and how much it actually uses. Your average game actually USES about 4-6GB of VRAM; it will allocate much more, however.
TimmyP Feb 17, 2023 @ 12:10pm 
It's resource management, not the amount of VRAM. Look at my last post, it will fix it. I've been testing since the beginning and know when and where to look for stutters!

*What would it say about PC architecture? The PS5's GPU is roughly equal to a 2070. It has 16GB of shared VRAM/memory. PC can't handle that, just because of an 8GB VRAM buffer? We aren't even close to that scenario yet.
Last edited by TimmyP; Feb 17, 2023 @ 12:13pm

Date Posted: Feb 17, 2023 @ 9:37am
Posts: 42