On current-gen consoles and most laptops, the CPU and the graphics card/chip both use the same memory. This is called "shared memory".
On desktop PCs, graphics cards come with their own memory directly on the card, while CPUs use the system memory, i.e. RAM sticks that are slotted into the mainboard.
The reason RAM and VRAM are separate is that graphics cards need to process a lot of data very quickly. Having dedicated memory on the graphics card makes it much faster. If your graphics card used system memory instead, it would be very slow. Modern consoles and laptops get around this limitation by connecting their graphics chips to the system memory in a different (faster) way.
That said, even on desktops, graphics data is shared between VRAM and RAM all the time. Before any graphics data can be transferred into VRAM, it first must be loaded into RAM and decompressed by the CPU. The graphics card loads data from RAM into VRAM when it needs it, and deletes data from VRAM when it runs out of space. On desktops, "sharing" memory this way is a slow process which can cause stutters when the graphics card doesn't have data in VRAM and needs to pull it from RAM first.
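To make that flow concrete, here is a rough C++/Direct3D 12 sketch of the CPU-side staging step. It's only illustrative, not what any particular game does: `device` is assumed to be an already-created ID3D12Device, and `Decompress` and `StageInSystemRam` are hypothetical names standing in for whatever codec and loader a given game actually uses.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstring>
#include <vector>

// Hypothetical decompressor, standing in for whatever codec the game uses.
std::vector<char> Decompress(const std::vector<char>& compressed);

// Stage decompressed asset bytes in an UPLOAD heap: CPU-visible system RAM
// that the GPU can later read across the PCIe bus.
ID3D12Resource* StageInSystemRam(ID3D12Device* device,
                                 const std::vector<char>& compressed) {
    std::vector<char> bytes = Decompress(compressed);  // the CPU-side work

    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_UPLOAD;                // system RAM, CPU-writable

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = bytes.size();
    desc.Height = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;      // required for buffers

    ID3D12Resource* staging = nullptr;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ,
                                    nullptr, IID_PPV_ARGS(&staging));

    void* mapped = nullptr;
    staging->Map(0, nullptr, &mapped);                 // CPU pointer into the buffer
    std::memcpy(mapped, bytes.data(), bytes.size());
    staging->Unmap(0, nullptr);
    return staging;  // the data now sits in RAM, not yet in VRAM
}
```

On a discrete card this buffer still has to be copied into VRAM before the GPU can read it at full speed; that copy is the DMA step discussed further down the thread.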
Thank you for your helpful comment! If you don't mind, I have a question. If a game can't choose whether or not to use shared GPU memory, why does "The Last of Us Part 1" on PC use shared GPU memory? I noticed that the game uses up to 14GB of GPU memory at 1440p with ultra settings, with 7-8GB of VRAM and 6-7GB of shared GPU memory; I guess the RAM usage is about 20GB. Although shared memory is slower than VRAM, I played "The Last of Us" flawlessly, without any noticeable problems due to a lack of VRAM. I believe "The Last of Us" uses more VRAM and RAM than "Hogwarts Legacy". What could be the reason behind this?
Could you explain why exactly you think that TLOU "uses shared memory" and Hogwarts Legacy doesn't? Please provide the respective numbers for both titles, and please explain how you measured them.
I am outside now, so I can't check right now, but you can find and check "shared GPU memory" in Task Manager.
The shared memory that is being tracked there is a portion of your system memory that is reserved for situations when the graphics card runs out of VRAM.
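If you're curious where Task Manager gets those numbers: the same counters are exposed through DXGI on Windows 10 and later. Here's a minimal standalone C++ sketch (not anything the game itself does). "Local" memory is the dedicated VRAM pool and "non-local" is the shared system-RAM pool that Task Manager labels "Shared GPU memory". One caveat: CurrentUsage is reported per calling process, so a tool like this only sees its own (tiny) usage, while Budget shows how much the OS would let it use.

```cpp
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;  // needs Windows 10+

        // Local = dedicated VRAM; non-local = shared system RAM.
        DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, shared = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL,
                                       &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL,
                                       &shared);

        wprintf(L"%s\n  dedicated: %llu MB used of %llu MB budget\n"
                L"  shared:    %llu MB used of %llu MB budget\n",
                desc.Description,
                local.CurrentUsage >> 20, local.Budget >> 20,
                shared.CurrentUsage >> 20, shared.Budget >> 20);
    }
    return 0;
}
```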
So you're saying that in your experience, Hogwarts Legacy does not use this type of shared memory at all? That might indeed warrant further investigation.
The common meaning of shared RAM is simply a block of system memory that can be accessed by several processors (the CPU and GPU, for instance). In today's world, with virtual memory and whatnot, things have likely progressed a lot further than my horizon.
Even in the "old days", a GPU could use direct memory access (DMA) to grab the contents of some portion of the shared memory and move a copy to VRAM without going through the CPU (sketched below).
As to why one application might use shared memory while a different app doesn't - that choice is made by the developers depending on what they need/want.
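For completeness, here is a matching hypothetical sketch of that DMA copy in Direct3D 12. A copy recorded on a COPY-type queue runs on the GPU's dedicated copy engine, so the CPU never touches the bytes. `staging` is the upload-heap buffer from the earlier sketch, and `vramBuffer` is assumed to be an equally sized buffer in a DEFAULT heap (VRAM on a discrete card), created elsewhere.

```cpp
#include <windows.h>
#include <d3d12.h>

void CopyRamToVram(ID3D12Device* device, ID3D12Resource* staging,
                   ID3D12Resource* vramBuffer, UINT64 numBytes) {
    D3D12_COMMAND_QUEUE_DESC qd = {};
    qd.Type = D3D12_COMMAND_LIST_TYPE_COPY;  // the GPU's DMA/copy engine

    ID3D12CommandQueue* queue = nullptr;
    ID3D12CommandAllocator* alloc = nullptr;
    ID3D12GraphicsCommandList* list = nullptr;
    device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue));
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COPY,
                                   IID_PPV_ARGS(&alloc));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COPY, alloc, nullptr,
                              IID_PPV_ARGS(&list));

    // The CPU only records the command; the copy engine moves the bytes.
    list->CopyBufferRegion(vramBuffer, 0, staging, 0, numBytes);
    list->Close();
    ID3D12CommandList* lists[] = { list };
    queue->ExecuteCommandLists(1, lists);
    // Real code would wait on a fence here before releasing 'staging'.
}
```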
Yes. In my experience, the game rarely utilizes shared GPU memory. If I'm right about this problem, would it be possible for the developers to address it with a new patch?
Thank you for your comment!! So can they fix it with a new patch?
An expert might be able to make educated guesses about those 3 points, but this definitely goes beyond my area of expertise. The chance to meet such an expert on a Steam forum is probably pretty low.
Try to ask a reputable tech/game reviewer who has experience with Hogwarts Legacy as well as other recent triple-A releases. Alex Battaglia from Digital Foundry would be my first choice, since he also knows the intricacies of Unreal Engine 4 quite well. Otherwise, outlets like Hardware Unboxed, Gamers Nexus, or Brad Chacos from PC World are very competent when it comes to graphics, but they may not have the same expertise when it comes to coding.
I can't address that question. Sorry.
On my system the game hovers at ~25GB RAM and ~14GB VRAM pretty steadily.
(installed, 32GB RAM, 24GB VRAM)