The 4070 Ti is very powerful, but has weirdly low memory for that level of performance.
Last year was the first time 8 GB of VRAM became the minimum requirement for SOME new releases.
This despite the fact that every card in the 3xxx and 4xxx lineups had 8 GB of VRAM or more,
and that all but the 2060 in the 2xxx series did too.
So since 2019 almost every card has had 8 GB, and only now are they starting to make it the requirement.
Before that it was 6 GB, and what do you know, the 1060 had 6 GB.
Games won't start demanding more until two series with more than 8 GB of VRAM have been released.
The 4060-class cards in this series still ship with 8 GB, so this series does not count.
So games will likely start demanding 12 GB of VRAM around the 6xxx series.
GPU series release every 1.5 to 2 years,
and the 5xxx series is due this autumn.
So expect games to start demanding 12 GB around December 2026.
Until then, 8 GB will do.
Games demanding 16 GB is likely even further away.
You could allocate 12+ GB to an Intel UHD or AMD iGPU;
it does not make it any stronger.
Dedicated GPUs ship with enough VRAM for their GPU core to handle.
You could also look at the Quadro or Radeon workstation GPUs, which have insane amounts of RAM:
192 GB on the AMD Instinct MI300X.
That's junk for games, but great for AI, rendering, and video editing tasks.
Currently there are only one or two games that need more than 12 GB of VRAM when running at 4K maximum settings with ultra textures.
12 GB of VRAM should be totally fine for 1440p gaming for at least the next couple of years. And after that, you can always lower a few graphics settings.
A 12 GB card should easily last for the next 5 years without any major issue. Throughout this PS5 console generation at least, developers won't change their games drastically, because their games need to work on the PS5 as well, which has 16 GB of combined memory in total.
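To put "ultra textures" in perspective, here's a rough back-of-the-envelope sketch of what a single uncompressed texture costs in VRAM (the figures are illustrative arithmetic, not measurements from any specific game):

```python
def texture_mib(width, height, bytes_per_pixel=4, with_mips=True):
    """Approximate VRAM cost of one uncompressed texture, in MiB.

    A full mip chain adds roughly one third on top of the base level
    (1/4 + 1/16 + ... converges to 1/3).
    """
    size = width * height * bytes_per_pixel
    if with_mips:
        size += size // 3
    return size / (1024 * 1024)

# One uncompressed 4096x4096 RGBA8 texture: 64 MiB base + ~21 MiB of mips.
print(round(texture_mib(4096, 4096), 1))  # → 85.3
```

Block-compressed formats (BC1/BC7 and the like) cut this by 4–8x, which is a big part of why hundreds of high-resolution textures can still fit inside a 12 GB budget.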
Yeah, I heard something about how the PS5 can only realistically use about 12 GB for games, and the other 4 GB is used for the OS etc.
The same RAM space is used for both its system and its GPU.
The CPU can write directly to that RAM and allocate it as VRAM, which speeds things up some.
There is no way to directly compare console and desktop performance.
Consoles don't work differently.
A console works like a PC with an integrated iGPU, where a part of system memory works as VRAM, in this case GDDR6.
Consoles are nothing but PCs with a different OS and a DRM chip. A console can't do anything groundbreaking that a PC can't.
The console's CPU can utilize super-fast VRAM as system RAM, which is a unique feature not yet available on PCs. The PS5 includes an additional 512MB of DDR4 for the operating system, but for some reason, consoles still prefer to use the much more expensive GDDR6 for that.
Yes. Consoles have the advantage of RAM speed, not some new groundbreaking technology.
Speed doesn't compensate for the amount of memory a console has.
At the end of the day it's still 16 GB of total memory, and a game developer needs to build their game around it. Which is the main subject of this thread and what the OP was asking.
Consoles do it at the hardware level.
The PS2 had a hardware video decoder for DVD that some games used to decompress game code: they could read compressed data off the disc, decode it with the video decoder, then put it in system RAM and run it from there.
No PC can do that at the hardware level, and it's part of why emulating the PS2 takes so much CPU, even though the PS2 hardware is extremely slow in comparison.
It's part of why consoles with inferior hardware can keep up with PCs:
a lighter, more optimized OS written for one exact hardware configuration, tricks like the one above to speed things up, and dynamic resolution scaling: rendering at a different resolution depending on how long frames take to draw, holding the FPS up by lowering the resolution for specific parts/layers.
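The dynamic resolution scaling mentioned above boils down to a simple feedback loop on frame time. A minimal sketch (the function name, step size, and thresholds are made up for illustration, not taken from any engine):

```python
def update_render_scale(scale, frame_ms, target_ms=16.7,
                        step=0.05, min_scale=0.5, max_scale=1.0):
    """Nudge the render resolution scale toward the frame-time target.

    If the last frame took too long, render the next one at a lower
    resolution; if there is clear headroom, raise it back toward native.
    """
    if frame_ms > target_ms * 1.05:      # over budget: drop resolution
        scale -= step
    elif frame_ms < target_ms * 0.85:    # plenty of headroom: raise it
        scale += step
    return max(min_scale, min(max_scale, scale))

# Simulated frame times for a 60 FPS (16.7 ms) target.
scale = 1.0
for frame_ms in (25.0, 22.0, 18.0, 14.0):
    scale = update_render_scale(scale, frame_ms)
print(round(scale, 2))  # → 0.9
```

Real implementations smooth the measurements and scale per-layer (e.g. keeping the UI at native resolution), but the control loop is the same idea.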
If you look online, some sources say high-end GPUs should still be able to play most games at low-end settings in 10 years or so. That's an extremely general and broad statement, but it gives you an idea, in case anyone with a 4090 or AMD equivalent was wondering.
Edit: I have a 12 GB card too and it's overkill for my purposes. But most if not all game devs are very much aware of general hardware trends. They want to hit as many birds with one stone as possible.
"Consoles" do work differently. They use a unified memory architecture, whereas a typical APU / CPU with an iGPU is simply allocating a portion of main memory to use as VRAM.
There are tradeoffs between different types of memory; GDDR, while significantly higher speed, also has significantly higher latency than main system memory.
It is not just "speed": the unified memory on a PS5 isn't faster than the VRAM on any current-generation dGPU. One of the primary benefits of the unified memory architecture on consoles is that the CPU cores and the GPU cores share the same memory address space. If the CPU needs to process something such as a texture, and then the GPU needs to process that same texture, it doesn't need to be copied from "main system memory" to "VRAM". The Xbox Series X|S are more like a traditional PC APU, though still a bit different. The Series X still has 16GB of GDDR6, like the PS5, however it is split into two separately addressable memory spaces: 10GB of it "GPU optimized" with 560GB/s bandwidth (but higher latency), and 6GB of it "system optimized" at 336GB/s bandwidth, slower but with lower latency.
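Those two bandwidth figures fall straight out of the bus width: the Series X runs its GDDR6 at 14 Gbps per pin, with the 10GB region striped across the full 320-bit bus and the 6GB region across only 192 bits of it. A quick sketch of the arithmetic:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: pins * Gbit/s per pin / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(320, 14))  # 10 GB "GPU optimized" region → 560.0 GB/s
print(bandwidth_gbs(192, 14))  # 6 GB "system optimized" region → 336.0 GB/s
```

The same formula gives the headline number for any GDDR card, e.g. a 192-bit bus at 21 Gbps works out to 504 GB/s.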
Concurred. Current gen consoles will be a limiting factor in regards to the OP's question. On a PC with a dGPU with 12GB VRAM and a reasonable amount of system memory you'll at least be able to run games launching for current gen consoles at similar or better graphics/performance to said consoles.
Will you be limited in regards to some PC graphics features / quality levels, sure. But you'll certainly be able to play the majority of games with decent quality on a dGPU with 12GB VRAM.
I think the better question for the OP is which specific dGPU with 12GB of VRAM they are looking at. If it is the 4070 Ti... then I'd say it would be nonsensical to get that GPU rather than a 4070 Ti Super.
So some developers have already tried out what a "huge amount of RAM in a video card" looks like.
If someone like AMD releases an APU with that same architecture in the near future, it will overturn the whole direction of the GPU market.
But right now, for PC, things look OK, if it stays the same swamp it has been.