The higher you set the settings, the more graphics memory the game will require. It's using CryEngine, which I believe made a point of switching completely to dynamic lighting and global illumination. Those settings increase the amount of VRAM it needs.
You're also paying the price of two uncompressed images the size of your game's video resolution, no matter what graphics setting you're on. These are called the front and back buffers. At 1920x1080, the cost of these textures is 1920 x 1080 x 2 x 32 bits = 16.6 megabytes, rounded up. Not much of a price, really, but it's much larger at 4K.
Each shadow-casting light also requires some amount of texture memory to be allocated. An omni (spherical) light needs a total of six textures to form a cube map; a spot light needs only one texture.
Then, if they're using lightmaps, those require a good deal of texture space too.
Lastly come your high-resolution textures, which might go into the GPU compressed. Even then, they can easily be about 100 MB a pop.
Remember that the operating system will often reserve an amount of memory for its own functions.
And you might not actually have 2 gigabytes of memory, because hardware vendors will usually use the metric of 1000 MB = 1 gigabyte instead of 1024 MB = 1 gigabyte.
I'd be willing to bet they're streaming textures into memory as well. They start blurry, then quickly sharpen up.
A good example of this is Bayonetta, which tells you how much RAM will be used for your current graphics settings. On Ultra, it requires 1873 MB, or about 1.9 gigabytes of GPU memory.
Simple Explanation: You don't have enough GPU ram. You need to construct more pylons.
To give a short answer: your GPU is pretty old, and only has 2 GB of VRAM.
This is why some games require a minimum amount of VRAM, or the game will crash or run with extremely low performance. Prey took a different approach, unloading certain textures that would greatly exceed the available amount of VRAM.
Have you tried playing at a much lower internal resolution to see if it still happens?
Computer Science Graduate, doing game and graphics programming on the side.
Typically, what happens when a user runs out of memory is this: the application makes an allocation, the OS is unable to provide the memory for it, and the application gets back what is called a null pointer. It's good practice to check immediately afterward that the allocation succeeded. If it failed, it's the programmer's responsibility to crash the program as quickly as possible, or to handle it differently (yes, in most cases the crash is intentional). For games, this gives the game a chance to save the user's progress (the reason they usually don't is that you can't be sure the area the player was in wasn't the cause of the crash, so it's better to frustrate the gamer and send them back a few minutes than to ruin their experience by trapping their save permanently in a crash loop), quickly unload everything from the GPU and the computer's RAM (though they don't have to unload the RAM; the OS will handle that for them), and write a crash log with debug information that's only useful if you have the source code.
For a GPU, most of the time you won't be getting data back, so it's perfectly OK to handle the situation differently. What you do when you load the game up is upload your "NULL" objects to the GPU and leave them there permanently.
If you've ever modded a Source game and seen purple-and-black checkerboard textures, or a big flashing word like "MISSING MODEL" or "NULL", those are the Source engine's null objects.
For CryEngine, it looks like their NULL texture is actually a gray texture.
Now, a game engine COULD swap textures out of memory after rendering with them... but that's what will massively drop your frame rates. It's slow to pass anything from the computer's RAM to the GPU's RAM.
A good example of doing this is Second Life, a game of entirely user-driven content; everything in Second Life is made by users.
When you go to a new sim for the first time, you'll usually download the geometry faster than the textures. And because it's an MMO, the server returns control to you immediately after you connect. The world is gray at first, players are represented as clouds, and you can walk around. As the data downloads, the textures start appearing, and so do other players.
Told you before bro. Everything costs memory. You don't have enough.
Honestly there doesn't seem much of a difference between medium and high. Stuff is a little bit crisper and more detailed, such as your gun(s).
Higher presets increase the resolution of certain textures; in most cases you really can't tell much of a difference unless you're playing on a 4K monitor with your eyes hunting for discrepancies.
Even though I have a top-end computer that can handle 4K textures/resolutions, I tend to play at the high/medium preset in most games to get a rock-solid 120 fps at all times (120 Hz monitor).
I don't know how people with low-to-mid-end computers can play at ultra with 10-30 fps (e.g. Ark); I much prefer steady performance over a smidge more eye candy. Not to mention inconsistent fps gives me a headache (a real one), and TOO much eye candy tends to strain my eyes a bit too (pretty sunshafts etc.).
I know I don't have the most up-to-date machine, but it runs plenty of other recent games just fine.
What is your Vram? Resolution?
SSD or HDD??
What gpu?
Not sure about the VRAM, but resolution is 1920 by 1080 or whatever. I just know I'm running on a GeForce 750 Ti, and HDD? The only trouble I've had with resolution lately is with Arkane games... and it's this same "blurry textures for no reason" thing.