Hogwarts Legacy

RΣMΣDY Feb 15, 2023 @ 8:35pm
Yeah, they broke the game on Nvidia with the recent patch
It literally cut my FPS in half on my 4080, and the game is now borderline unplayable, even though the patch notes claimed to have improved performance on Nvidia GPUs. Pretty frustrating.
Showing 1-4 of 4 comments
-Tex- Feb 15, 2023 @ 8:42pm 
Yeah I've noticed the same with a 4080
WhIteDragem Feb 15, 2023 @ 9:58pm 
Yes, I'm measuring an extra GB of VRAM in use when running around Hogsmeade versus earlier game versions and drivers (AMD user).
The latest game build might use more VRAM (which may streamline things for people with system-side bottlenecks) but leaves VRAM-starved GPUs hurting a little more.

At 1080p it basically sits at 10GB of VRAM used, so 1080p with no ray tracing appears to be the best option for performance.
I run at 1440p with everything Ultra (and RT Ultra) and easily hit 15.5GB of VRAM within fifteen minutes of play.
I see it alternate between 12.5 and 14.5GB pretty consistently.

On load it can be just under 10GB, for about a second.
The game wants to use whatever we can give it.
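
For anyone who wants to log this themselves, here's a minimal Python sketch of the usual approach on Nvidia cards: poll `nvidia-smi` and parse its CSV output. (AMD users would need different tooling, e.g. the driver's own overlay.) The function names here are my own, not from any game or driver API, and the sample value is just illustrative.

```python
import subprocess

def parse_vram_csv(csv_text):
    """Turn nvidia-smi's one-number-per-line CSV output into a list of MiB values."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def query_vram_used_mib():
    """Query per-GPU VRAM usage in MiB via nvidia-smi (Nvidia cards only)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_csv(out)

# Hypothetical output for the ~15.5GB the post reports at 1440p Ultra + RT Ultra:
sample = "15872\n"  # MiB
print(parse_vram_csv(sample))  # -> [15872]
```

Run that in a loop every few seconds while playing and you can see exactly when the game crosses your card's VRAM limit.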

Many people are having a lot of success locking their page file to a fixed, larger size.
When my framerates kept bombing on the latest patch and drivers, raising my pagefile from 10GB (locked min and max sizes) to 30GB made the next frame drop very brief: it only fell halfway (about 16fps rather than 6fps) and bounced back super quickly.
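
For reference, Windows' virtual-memory dialog takes sizes in MB, so "locked" just means typing the same value into both the Initial and Maximum boxes. A trivial sketch of the conversion (the helper name is mine):

```python
def pagefile_mb(gigabytes):
    """Convert a pagefile size in GB to the MB value Windows'
    virtual-memory dialog expects (1 GB = 1024 MB here)."""
    return gigabytes * 1024

# The post's change: a locked 10GB min/max raised to a locked 30GB min/max.
initial = maximum = pagefile_mb(30)
print(initial, maximum)  # -> 30720 30720
```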

Definitely the streaming systems.
A lot of users on borderline CPUs won't understand that their Nvidia drivers want CPU overhead (that AMD drivers do not), and that a budget solid-state drive like a Samsung EVO puts burdens on the CPU for compression and decompression. (An older EVO that only moves 500 megabytes per second probably doesn't burden a CPU like newer ones that move thousands of megabytes per second.) PRO-model SSDs have mini CPUs that do the work for them; budget drives rely on the end user's CPU for much of the task. It doesn't look obvious in benchmarks, but open-world games are the hardest job for PCs to do (outside of massive databases and encryption/decryption maths).

Open-world RPGs tax memory (one of the only gaming genres that shows FPS improvements from faster RAM), and require constant data streaming nowadays due to texture sizes etc.

Hogwarts is amazing.
I play at 30fps and it is without doubt the best use of my GPU when it comes to gaming.
Sure, I can brute-force every other game at high framerates and 4K (except Greedfall, The Witcher, and Portal RTX, where I simply drop resolution).

Dropping resolution has always been the trick; I'm someone who ran Crysis on Ultra settings smoothly at launch on a mid-range GPU (an 8800 GTS at 35-45fps).

I get 35-45fps in Hogwarts during the heaviest moments, so I have simply capped at 30fps, and my video card sips power (runs quieter and cooler) half the time.

30fps brute-forced (i.e. no FSR/DLSS etc.) has none of the latency those techniques add.
I prefer this game at 30fps locked, with no frame-pacing issues or frame delays to input, to running at much higher fps.
With RT set to 'only' High, and if I drop resolution from 1440p to 1080p, I could get 60fps fairly easily, certainly if engaging FSR-type modes. But having tested high fps against a low framerate that is locked, stable, and without added delay, the 'no latency' locked 30 is the best way to play (by a country mile).
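
The locked-30 idea is just deadline-based frame pacing, the same thing RTSS or a driver-level cap does: after each frame, sleep off whatever is left of the ~33ms budget so frames arrive at an even cadence. A minimal Python sketch of that loop (the names are mine, and `render_frame` stands in for a real renderer):

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def run_capped(render_frame, n_frames):
    """Render n_frames at an even cadence: after each frame, sleep off any
    remaining budget so delivery stays locked to ~33 ms intervals."""
    next_deadline = time.perf_counter()
    for _ in range(n_frames):
        render_frame()
        next_deadline += FRAME_TIME
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # GPU would idle here: less power, less heat

start = time.perf_counter()
run_capped(lambda: None, 30)  # 30 trivial "frames"
print(round(time.perf_counter() - start))  # -> 1 (about one second)
```

Because the deadline is absolute rather than "sleep 33ms after each frame", small overshoots don't accumulate, which is what keeps frame pacing even.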

The PlayStation 5 peaks at higher framerates in some cutscenes (by the looks of it) versus my settings, but then I enjoy full high-quality reflections in mirrors.
And in the opening ocean shot with the cliffs, RT Ultra seems to add a subtle light bounce on the cusps of waves, and the seagulls appear to have more depth.

At 1080p resolution I still cannot see any pixels.
This game renders offscreen into a very high-quality (high-resolution) buffer, which makes the final screen render look better than 4K does in almost every other title I have.
As an example, Forza Horizon requires high levels of anti-aliasing even at high resolution to get the jaggies under control (methinks the new Fable game will have required a LOT of reworking).

Hogwarts is incredible. Even set to Low quality and low resolution, it would look better than just about every RPG I have ever played before (gaming since the seventies, RPGs being my fave genre).

The graphics bar has been raised. We just need to educate consumers that this isn't an old Elder Scrolls title from more than a decade ago, that 30-60fps is a very useful framerate range for an offline RPG, and that high resolution is for future systems (like gaming was for decades, until the last-gen consoles froze PC game evolution for SO LONG!).
PocketYoda Feb 15, 2023 @ 10:14pm 
It's fine on my RX 580 lol
FruitBat Man Feb 15, 2023 @ 10:15pm 
I use a 3080 and I'm OK at 1440p with no Vsync or DLSS

Date Posted: Feb 15, 2023 @ 8:35pm
Posts: 4