Don't have a 3080, but I can say that maxed-out 1440p uses around 12GB of VRAM on my 4080, and I only get occasional stutter that is very likely CPU- or engine-related. Digital Foundry did a video on this, and IIRC they said 10GB is not cutting it.
The RT in this game really isn't the best, and leaving it off doesn't ruin the looks.
With only 10GB of VRAM, turning on RT doesn't bode well for your experience. Looks like it cuts your FPS in half.
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/
Truly a technology that came before its time.
Performance for me is really bad after the last patch, though; it would probably be best to wait for them to optimize the game.
Seems to agree with you.
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/
Lucky you, maybe I'll buy a 4080 lol. My CPU and memory are enough: a 9900K and 32GB. Some here say RT is broken, among other things, but I think it straight up looks better, and I'm happy to trade off even ultra sharpness and fluidity. Just a tad bit of optimization please :D lol, 10GB of GPU memory, ahh.
i9-9900KF
EVGA 2080 Super hybrid
32GB DDR4 Trident, overclocked
https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html
I remember the German outlet GameStar mentioning an early RT patch that was supposed to come around launch. I wonder if that is still being worked on.
Just add this to the ini, and only this:

[SystemSettings]
r.Streaming.PoolSize=2048

[/script/engine.renderersettings]
r.Streaming.HLODStrategy=0

For other cards, adjust the pool size. These lines are the only ones from the other ini posts that do anything at least tangible.
*This will probably work fine as-is on a 3080 10GB. Maybe set the pool size 512 higher.
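For context, here's a minimal sketch of what the full edit might look like. Assumptions: the file is the usual UE4 per-user config (commonly %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini, so verify the path on your install), and the pool-size values below are illustrative guesses that scale with VRAM, not official recommendations:

```ini
; Hypothetical Engine.ini fragment - values are illustrative, not official
[SystemSettings]
; Rough guide (assumption): leave a few GB of VRAM headroom for the rest
; of the game and the OS:
;   8GB card  -> r.Streaming.PoolSize=2048
;   10GB card -> r.Streaming.PoolSize=2560
;   12GB card -> r.Streaming.PoolSize=3072
r.Streaming.PoolSize=2560

[/script/engine.renderersettings]
; 0 disables the hierarchical LOD streaming strategy
r.Streaming.HLODStrategy=0
```

r.Streaming.PoolSize is in MB; setting it too high can starve other allocations, so lower it if you see crashes or stutter getting worse.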
There is a difference between how much a game allocates and how much it actually uses. Your average game actually *uses* about 4-6GB of VRAM; it will allocate much more, however.
*What would that say about PC architecture? The PS5 GPU is roughly equal to a 2070, and it has 16GB of shared VRAM/memory. The PC can't handle that just because of an 8GB VRAM buffer? We aren't even close to that scenario yet.