Hogwarts Legacy

airbus1221 Apr 7, 2023 @ 6:58am
Why does this game not use shared GPU memory?
I have a 3070 and 32GB of RAM. I know the graphics card only has 8GB of VRAM, so I experience stuttering in Hogsmeade due to the lack of VRAM. However, I wonder why this game does not use shared GPU memory. I play the game at 1440p.
Psyringe Apr 7, 2023 @ 8:02am 
Hmm, I'm not sure what you're referring to. "Shared memory" (as far as I understand the term) is a hardware thing; it's not something that a game can decide to use or not.

On current-gen consoles and most laptops, the CPU and the graphics card/chip both use the same memory. This is called "shared memory".

On desktop PCs, graphics cards come with their own memory directly on the card, while CPUs use the system memory, i.e. RAM sticks that are slotted into the mainboard.

The reason RAM and VRAM are separate is that graphics cards need to process a lot of data very quickly. Having dedicated memory on the graphics card makes it much faster. If your graphics card used the system memory, it would be very slow. Modern consoles and laptops get around this limitation by connecting their graphics chips to the system memory in a different (faster) way.

That said, even on desktops, graphics data is shared between VRAM and RAM all the time. Before any graphics data can be transferred into VRAM, it first must be loaded into RAM and decompressed by the CPU. The graphics card loads data from RAM into VRAM when it needs it, and deletes data from VRAM when it runs out of space. On desktops, "sharing" memory this way is a slow process which can cause stutters when the graphics card doesn't have data in VRAM and needs to pull it from RAM first.
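
Psyringe's description above is essentially a cache-eviction story, and a toy model makes the stutter mechanism concrete. The following is a simplified, self-contained C++ sketch (an illustrative model only, not the game's or any engine's actual code); the asset names, sizes, and millisecond costs are made up for the example:

// Illustrative sketch only: models a fixed-size VRAM pool with
// least-recently-used eviction. A "miss" stands in for a slow
// RAM-to-VRAM copy, which is where frame-time spikes come from.
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

struct VramCache {
    size_t capacityMb;
    size_t usedMb = 0;
    std::list<std::string> lru;                       // front = most recently used
    std::unordered_map<std::string, size_t> resident; // asset -> size in MB

    // Returns the simulated cost (in ms) of using an asset this frame.
    double use(const std::string& asset, size_t sizeMb) {
        if (resident.count(asset)) {                  // hit: already in VRAM
            lru.remove(asset);
            lru.push_front(asset);
            return 0.1;                               // cheap
        }
        while (usedMb + sizeMb > capacityMb && !lru.empty()) {
            std::string victim = lru.back();          // evict oldest asset
            lru.pop_back();
            usedMb -= resident[victim];
            resident.erase(victim);
        }
        resident[asset] = sizeMb;
        usedMb += sizeMb;
        lru.push_front(asset);
        return 5.0;                                   // miss: slow RAM->VRAM copy
    }
};

int main() {
    VramCache vram{8192};                             // pretend 8 GB card
    const char* assets[] = {"streetTex", "npcTex", "shopTex"};
    for (int frame = 0; frame < 3; ++frame)
        for (const char* a : assets)
            printf("frame %d, %s: %.1f ms\n", frame, a, vram.use(a, 3000));
    return 0;
}

Because the three 3GB assets don't all fit in the simulated 8GB pool, every frame evicts something the next frame needs again, so the slow miss path is hit constantly - the same thrashing pattern that shows up as stutter in Hogsmeade on an 8GB card.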
airbus1221 Apr 7, 2023 @ 5:56pm 
Originally posted by Psyringe:
Hmm, I'm not sure what you're referring to. "Shared memory" (as far as I understand the term) is a hardware thing; it's not something that a game can decide to use or not.

On current-gen consoles and most laptops, the CPU and the graphics card/chip both use the same memory. This is called "shared memory".

On desktop PCs, graphics cards come with their own memory directly on the card, while CPUs use the system memory, i.e. RAM sticks that are slotted into the mainboard.

The reason RAM and VRAM are separate is that graphics cards need to process a lot of data very quickly. Having dedicated memory on the graphics card makes it much faster. If your graphics card used the system memory, it would be very slow. Modern consoles and laptops get around this limitation by connecting their graphics chips to the system memory in a different (faster) way.

That said, even on desktops, graphics data is shared between VRAM and RAM all the time. Before any graphics data can be transferred into VRAM, it first must be loaded into RAM and decompressed by the CPU. The graphics card loads data from RAM into VRAM when it needs it, and deletes data from VRAM when it runs out of space. On desktops, "sharing" memory this way is a slow process which can cause stutters when the graphics card doesn't have data in VRAM and needs to pull it from RAM first.

Thank you for your helpful comment! If you don't mind, I have a question. If a game can't decide whether or not to use shared GPU memory, why does "The Last of Us Part 1" on PC use shared GPU memory? I noticed that the game uses up to 14GB of GPU memory at 1440p with ultra settings, with 7-8GB of VRAM and 6-7GB of shared GPU memory; I guess the RAM usage is about 20GB. Although shared memory is slower than VRAM, I played "The Last of Us" flawlessly, without any noticeable problems due to a lack of VRAM. I believe "The Last of Us" uses more VRAM and RAM than "Hogwarts Legacy." What could be the reason behind this?
Psyringe Apr 7, 2023 @ 6:16pm 
Originally posted by airbus1221:
Thank you for your helpful comment! If you don't mind, I have a question. If a game can't decide whether or not to use shared GPU memory, why does "The Last of Us Part 1" on PC use shared GPU memory? I noticed that the game uses up to 14GB of GPU memory at 1440p with ultra settings, with 7-8GB of VRAM and 6-7GB of shared GPU memory; I guess the RAM usage is about 20GB. Although shared memory is slower than VRAM, I played "The Last of Us" flawlessly, without any noticeable problems due to a lack of VRAM. I believe "The Last of Us" uses more VRAM and RAM than "Hogwarts Legacy." What could be the reason behind this?
I'm sorry, but this doesn't make sense. You still seem to be using the term "shared memory" in a different way than anyone else, and you're not providing enough data to clarify what you mean.

Could you explain why exactly you think that TLOU "uses shared memory" and Hogwarts Legacy doesn't? Please provide the respective numbers for both titles, and please explain how you measured them.
airbus1221 Apr 7, 2023 @ 6:24pm 
Originally posted by Psyringe:
Originally posted by airbus1221:
Thank you for your helpful comment! If you don't mind, I have a question. If a game can't decide whether or not to use shared GPU memory, why does "The Last of Us Part 1" on PC use shared GPU memory? I noticed that the game uses up to 14GB of GPU memory at 1440p with ultra settings, with 7-8GB of VRAM and 6-7GB of shared GPU memory; I guess the RAM usage is about 20GB. Although shared memory is slower than VRAM, I played "The Last of Us" flawlessly, without any noticeable problems due to a lack of VRAM. I believe "The Last of Us" uses more VRAM and RAM than "Hogwarts Legacy." What could be the reason behind this?
I'm sorry, but this doesn't make sense. You still seem to be using the term "shared memory" in a different way than anyone else, and you're not providing enough data to clarify what you mean.

Could you explain why exactly you think that TLOU "uses shared memory" and Hogwarts Legacy doesn't? Please provide the respective numbers for both titles, and please explain how you measured them.

I am outside now so I can't check right now, but you can find and check "Shared GPU memory" in Task Manager.
Psyringe Apr 7, 2023 @ 7:07pm 
Originally posted by airbus1221:
you can find and check "Shared GPU memory" in Task Manager
Thanks for the explanation. My apologies, I wasn't aware that this metric was added to Task Manager a while ago. If it's as reliable as Microsoft claims, then this ought to be very useful.

The shared memory that is being tracked there is a portion of your system memory that is reserved for situations when the graphics card runs out of VRAM.

So you're saying that in your experience, Hogwarts Legacy does not use this type of shared memory at all? That might indeed warrant further investigation.
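
For anyone who wants to cross-check Task Manager's numbers, the same dedicated/shared split is exposed programmatically on Windows through DXGI 1.4's IDXGIAdapter3::QueryVideoMemoryInfo, where the "local" segment group is dedicated VRAM and the "non-local" group is shared system memory. A minimal C++ sketch, assuming Windows 10 or later:

// Minimal sketch: reads the same dedicated/shared GPU memory numbers
// that Task Manager shows, via DXGI 1.4 (Windows 10+). Illustrative only.
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        IDXGIAdapter3* adapter3 = nullptr;
        if (SUCCEEDED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) {
            DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, shared = {};
            // LOCAL = dedicated VRAM; NON_LOCAL = shared system memory
            adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
            adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &shared);
            printf("adapter %u: dedicated %llu/%llu MB, shared %llu/%llu MB\n", i,
                   local.CurrentUsage >> 20, local.Budget >> 20,
                   shared.CurrentUsage >> 20, shared.Budget >> 20);
            adapter3->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}

Running something like this alongside the game while roaming Hogsmeade would show whether the non-local (shared) usage really stays near zero, as reported above.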
RustyNail Apr 7, 2023 @ 7:28pm 
It's been a LONG TIME since I did any low-level graphics programming, and that was all done with the CPU operating in "real mode". So yeah, we're talking a VERY LONG time ago.

The common meaning of shared RAM is simply a block of system memory that can be accessed by several processors (the CPU and GPU, for instance). In today's world, with virtual memory and whatnot, things have likely progressed a lot further than my horizon.

Even in the "old days", a GPU could use direct memory access to grab the contents of some portion of the shared memory and move a copy to VRAM without going through the CPU.

As to why one application might use shared memory, while a different app doesn't - that choice is made by the developers depending on what they need/want.
Last edited by RustyNail; Apr 7, 2023 @ 7:28pm
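
To RustyNail's last point: on a modern explicit API such as Direct3D 12 (which games like this typically sit on via their engine), where a resource lives really is a developer decision, made through the heap type. A minimal C++ sketch (illustrative only, not Hogwarts Legacy's actual code) that places a buffer in CPU-side memory the GPU can read, rather than in dedicated VRAM:

// Illustrative D3D12 sketch: the heap type is the developer's choice of
// where a resource lives (dedicated VRAM vs. CPU/shared system memory).
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_UPLOAD;   // CPU-visible system memory the GPU can read;
                                          // D3D12_HEAP_TYPE_DEFAULT would be VRAM

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = 64ull << 20;  // a 64 MB buffer
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* buffer = nullptr;
    HRESULT hr = device->CreateCommittedResource(
        &heap, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_GENERIC_READ,  // required start state for UPLOAD heaps
        nullptr, IID_PPV_ARGS(&buffer));
    printf("upload-heap buffer: %s\n", SUCCEEDED(hr) ? "created" : "failed");

    if (buffer) buffer->Release();
    device->Release();
    return 0;
}

Swapping D3D12_HEAP_TYPE_UPLOAD for D3D12_HEAP_TYPE_DEFAULT is the same kind of choice an engine makes when deciding what can spill into shared memory versus what must stay resident in VRAM.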
airbus1221 Apr 7, 2023 @ 7:35pm 
Originally posted by Psyringe:
So you're saying that in your experience, Hogwarts Legacy does not use this type of shared memory at all? That might indeed warrant further investigation.

Yes. In my experience, the game rarely utilizes shared GPU memory. If I'm right about this problem, would it be possible for the developers to address this issue with a new patch?
airbus1221 Apr 7, 2023 @ 7:54pm 
Originally posted by RustyNail:
As to why one application might use shared memory, while a different app doesn't - that choice is made by the developers depending on what they need/want.

Thank you for your comment!! So can they fix it with a new patch?
Psyringe Apr 7, 2023 @ 9:00pm 
Originally posted by airbus1221:
Originally posted by Psyringe:
So you're saying that in your experience, Hogwarts Legacy does not use this type of shared memory at all? That might indeed warrant further investigation.

Yes. In my experience, the game rarely utilizes shared GPU memory. If I'm right about this problem, would it be possible for the developers to address this issue with a new patch?
Theoretically yes, but a) we don't know how difficult such a change would be to implement with the codebase that the devs currently have, b) we don't know if the devs would be willing to do it, and c) we don't know if (and by how much) this would actually affect performance.

An expert might be able to make educated guesses about those 3 points, but this definitely goes beyond my area of expertise. The chance to meet such an expert on a Steam forum is probably pretty low.

Try to ask a reputable tech/game reviewer who has experience with Hogwarts Legacy as well as other recent triple-A releases. Alex Battaglia from Digital Foundry would be my first choice, since he also knows the intricacies of Unreal Engine 4 quite well. Otherwise, outlets like Hardware Unboxed, Gamers Nexus, or Brad Chacos from PC World are very competent when it comes to graphics, but they may not have the same expertise when it comes to coding.
Last edited by Psyringe; Apr 7, 2023 @ 9:01pm
RustyNail Apr 7, 2023 @ 9:22pm 
Originally posted by airbus1221:
Originally posted by RustyNail:
As to why one application might use shared memory, while a different app doesn't - that choice is made by the developers depending on what they need/want.

Thank you for your comment!! So can they fix it with a new patch?

I can't address that question. Sorry.
v.aurimas91 Apr 7, 2023 @ 10:48pm 
So, you're saying that this game does not use the extra memory it needs from the installed RAM? Interesting... that might explain the micro-stutters and random lags.
airbus1221 Apr 8, 2023 @ 12:30am 
Originally posted by v.aurimas91:
So, you're saying that this game does not use the extra memory it needs from the installed RAM? Interesting... that might explain the micro-stutters and random lags.
Yes, exactly
joridiculous Apr 8, 2023 @ 1:28am 
Originally posted by airbus1221:
Originally posted by v.aurimas91:
So, you're saying that this game does not use the extra memory it needs from the installed RAM? Interesting... that might explain the micro-stutters and random lags.
Yes, exactly
The game uses a lot of RAM, but not that much VRAM.
On my system the game hovers at ~25GB RAM and ~14GB VRAM pretty steadily.
(Installed: 32GB RAM, 24GB VRAM)
airbus1221 Apr 8, 2023 @ 7:07pm 
f
Wasp Apr 9, 2023 @ 2:32am 
The stuttering is so terrible I refuse to play until they patch this... I've been waiting a long time, since the March 8th patch did literally nothing to address it. Guess they don't give a damn. Diablo 4 is just around the corner, so maybe we can just move on and forget this mess. What a shame, the game looked kinda cool.