Why are Nvidia and AMD so stingy with VRAM?
I get that with certain chips it can only be 8, 16, or 24 GB, but since there have been cards with 10 or 12, it's clearly possible to set a new baseline standard for minimum VRAM. After all, NO ONE thinks that selling a 4 GB VRAM card in 2025 would be "reasonable" by ANY measure, and yet 8 GB might as well be the new "4 GB". Yet Nvidia continues to have it as an option in the xx60 series. What, is VRAM THAT expensive?!
Showing 1-15 of 28 comments
C1REX Mar 14 @ 10:04am 
VRAM is not free, but it's still less expensive than the GPU die. Many people believe GPUs could be cheaper and games would feel better optimised if some of the compute silicon were traded for more memory, so games could be optimised to use more VRAM.
More of the cheaper silicon, less of the expensive GPU silicon.

The 5060 and 5060 Ti will potentially get obliterated in reviews if they get 8 GB. It's already not enough and will likely age horribly in 4 years. The PS6 is rumoured to be released in 2-3 years with up to 32 GB of VRAM.
it's to push you to buy at least a 70/80 class card.
A&A Mar 14 @ 10:08am 
When you see the same VRAM modules on AliExpress being dirt cheap.
Tiberius Mar 14 @ 10:08am 
The cheaper GPUs only use a 128-bit (4×32) memory bus, and each 32-bit memory channel can only take a 2 GB VRAM chip. Thus you get an 8 GB (4×2 GB) VRAM card.

I remember reading somewhere that the memory bus is what's mainly driving the cost up.
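The arithmetic in the post above can be sketched as a toy calculation, assuming the common 2 GB (16 Gb) GDDR die per 32-bit channel — an assumption about typical current parts, not the spec of any particular card:

```python
# Toy capacity math: total VRAM = (bus width / channel width) * chip size.
# Assumes one 2 GB GDDR chip per 32-bit channel, the common density today.
CHANNEL_WIDTH_BITS = 32
CHIP_CAPACITY_GB = 2

def vram_for_bus(bus_width_bits: int) -> int:
    channels = bus_width_bits // CHANNEL_WIDTH_BITS
    return channels * CHIP_CAPACITY_GB

print(vram_for_bus(128))  # 8  -> a 128-bit "60-class" card
print(vram_for_bus(256))  # 16 -> a 256-bit card
```

Getting more capacity on the same bus needs either denser chips or clamshell mounting with two chips per channel, which is how the 16 GB variants of 128-bit cards are built.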
Lixire Mar 14 @ 10:14am 
Simple.. money...
NVIDIA especially wants to upsell you to the higher tiers of cards, and uses VRAM as the driving force.
nullable Mar 14 @ 10:24am 
Well, some of it is that users overestimate or exaggerate the need for more VRAM. Yes, I know people can dig up examples where X game uses a lot of VRAM. And we spent a long time watching VRAM amounts increase steadily, and then that really tapered off.

Secondly, VRAM tends to need to be much higher performance than standard desktop/laptop DIMMs.

https://en.wikipedia.org/wiki/GDDR7_SDRAM 1.5 TB/sec

vs

https://en.wikipedia.org/wiki/DDR5_SDRAM 64 GB/sec

That high level of performance for GDDR7, or 1.1 TB/sec for GDDR6, probably means it's expensive and has a lower yield rate than something more modest like DDR5 or DDR4.
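To make the gap concrete: peak theoretical bandwidth is roughly per-pin data rate times bus width. A quick sketch — the data rates and bus widths below are illustrative figures for current parts, not exact specs for any one card:

```python
# Peak theoretical bandwidth (GB/s) = per-pin data rate (Gb/s) * bus width (bits) / 8.
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# DDR5-8000 on a standard 64-bit module: 8 * 64 / 8 = 64 GB/s
print(peak_bandwidth_gb_s(8, 64))    # 64.0
# GDDR7 at ~28 Gb/s per pin on a wide 512-bit bus: ~1.8 TB/s
print(peak_bandwidth_gb_s(28, 512))  # 1792.0
```

The order-of-magnitude gap comes from both faster signalling per pin and a much wider bus, and both cost money.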

I think there are a lot of competing values and needs happening here. And I understand the comfort of having an ample amount of VRAM versus only what's strictly necessary. Some customers want more VRAM every new generation; that was pretty typical up until the 10 series. And that launched about 9 years ago now.

Consoles may be having some impact there, dragging PC hardware progress down a bit. There's also a finite supply of VRAM, and since Intel threw its hat in, that might add a bit of strain to supply. Not to mention AI hardware scarfing up that RAM too. And I don't think PC gamers would be thrilled if the supply of GPUs were more limited to accommodate more VRAM per card. It would drive prices up, and that wouldn't go over well. And it's not like AMD or Nvidia are hoarding VRAM like dragons hoard gold and refusing to share it.

I think there's just a lot of details that explain what we're seeing: supply, demand, need, want.
Last edited by nullable; Mar 14 @ 10:26am
Originally posted by Bing Chilling:
it's to push you to buy at least a 70/80 class card.
The 5080 has only 16 GB of VRAM, right on the edge of not being enough for today's games, and plenty of games easily use more than 16 GB already. The 5070 straight up has too little VRAM and can't even take full advantage of DLSS 4. The only high-end GPU worth buying is the 5090. The RTX 4090 easily beats the 5080 in almost all games, with 10%-30% more performance. A 5080 playing demanding games with max settings & path tracing simply runs out of VRAM; DLSS 4 can't even save it. These are GPUs for 1080p & 1440p gaming, and even at 1440p they have too little VRAM to handle modern graphically demanding games at maxed-out settings.
The Grin Mar 14 @ 10:49am 
Originally posted by ΜΣ†ΛĿ:
Originally posted by Bing Chilling:
it's to push you to buy at least a 70/80 class card.
The 5080 has only 16 GB of VRAM, right on the edge of not being enough for today's games, and plenty of games easily use more than 16 GB already. The 5070 straight up has too little VRAM and can't even take full advantage of DLSS 4. The only high-end GPU worth buying is the 5090. The RTX 4090 easily beats the 5080 in almost all games, with 10%-30% more performance. A 5080 playing demanding games with max settings & path tracing simply runs out of VRAM; DLSS 4 can't even save it. These are GPUs for 1080p & 1440p gaming, and even at 1440p they have too little VRAM to handle modern graphically demanding games at maxed-out settings.

I am sorry, but where exactly do you find your "not enough with 16 GB VRAM" games? Because I must have an incredible graphics card then.

My RTX 3080 with 10 GB must have a "hidden dormant additional 6 GB", and still I find it strange to be able to play most of today's heavy games, fully modded and at 60 FPS if not more, without any problems.

The major problem is not the "VRAM pool"; the problem nowadays is optimization.
I see so many people on the forums, with the same games I have, having so many more problems with 16 or 24 GB VRAM cards, while it works flawlessly on my side.

But yeah, most of them also want to play on theatre movie screens and expect a stable 1000 FPS
because 60 FPS is not enough for them.

Guess how many FPS the human eye can perceive?

This race for fake performance is exactly why the industry needs sheep, and they feed them with DLSS to blur the games out for fake FPS and boosted performance, all while talking about a revolution (e.g. Monster Hunter Wilds).
But it's all a lie. It's a huge downgrade.
Andrius227 Mar 14 @ 11:04am 
VRAM doesn't matter much. I upgraded from 24 GB (3090) to 16 GB (4080) a couple of years ago and it made no difference.

Planning to upgrade to a 5080, which also has 16 GB.
Last edited by Andrius227; Mar 14 @ 11:05am
Originally posted by The Grin:
Originally posted by ΜΣ†ΛĿ:
The 5080 has only 16 GB of VRAM, right on the edge of not being enough for today's games, and plenty of games easily use more than 16 GB already. The 5070 straight up has too little VRAM and can't even take full advantage of DLSS 4. The only high-end GPU worth buying is the 5090. The RTX 4090 easily beats the 5080 in almost all games, with 10%-30% more performance. A 5080 playing demanding games with max settings & path tracing simply runs out of VRAM; DLSS 4 can't even save it. These are GPUs for 1080p & 1440p gaming, and even at 1440p they have too little VRAM to handle modern graphically demanding games at maxed-out settings.

I am sorry, but where exactly do you find your "not enough with 16 GB VRAM" games? Because I must have an incredible graphics card then.

My RTX 3080 with 10 GB must have a "hidden dormant additional 6 GB", and still I find it strange to be able to play most of today's heavy games, fully modded and at 60 FPS if not more, without any problems.

The major problem is not the "VRAM pool"; the problem nowadays is optimization.
I see so many people on the forums, with the same games I have, having so many more problems with 16 or 24 GB VRAM cards, while it works flawlessly on my side.

But yeah, most of them also want to play on theatre movie screens and expect a stable 1000 FPS
because 60 FPS is not enough for them.

Guess how many FPS the human eye can perceive?

This race for fake performance is exactly why the industry needs sheep, and they feed them with DLSS to blur the games out for fake FPS and boosted performance, all while talking about a revolution (e.g. Monster Hunter Wilds).
But it's all a lie. It's a huge downgrade.
CP 2077 maxed out uses around 20 GB, Indiana Jones with PT easily uses 20 GB, FF15 uses more than 16 GB, Monster Hunter Wilds uses more than 16 GB, Star Wars Jedi: Survivor uses more than 16 GB. Many other games with high-res texture packs use a lot of VRAM, and modded games use a ton of VRAM. If you read the forums on new game releases, people are always complaining about crashing, stuttering, memory leaks, etc., and everyone with the 24 GB 4090s says the games are working perfectly fine. Most of these issues are because people are trying to run these games maxed out at decent resolutions, but there is simply not enough VRAM to do so. Even consoles have 16 GB of shared memory, and they use it too. No one wants to spend $1000-$1500 on a 5080 to play at reduced settings and/or resolutions because it doesn't have enough VRAM.
60 FPS is fine, but you can definitely notice the difference between 60 & 120 FPS when looking at a monitor close up, and can definitely feel the input delay when using a mouse. Besides, higher FPS doesn't directly use more VRAM; frame generation, however, does use quite a bit of VRAM. This is why the 5070 with its lousy 12 GB of VRAM has major problems: it can't even run FG properly, and it needs to, because the GPU is too weak to hit high frame rates without it.

The biggest problem actually is the lack of VRAM, and it is the major reason for bad performance in PC gaming when trying to run games maxed out.
Dodece Mar 14 @ 11:21am 
Someone is off in their own little world, it appears. They are servicing the lower end of the market: consumers who have less disposable income, and for whom a lower price point is paramount. In other words, these cards are more accessible. They aren't meant for professional workloads, or for being on the cutting edge of gaming.
Originally posted by Dodece:
Someone is off in their own little world, it appears. They are servicing the lower end of the market: consumers who have less disposable income, and for whom a lower price point is paramount. In other words, these cards are more accessible. They aren't meant for professional workloads, or for being on the cutting edge of gaming.
A 5080 isn't exactly low end, and for $1500 it should be able to max out the graphics settings without running out of VRAM.
gwwak Mar 14 @ 11:25am 
Planned obsolescence. Nvidia doesn't want another 1080 Ti situation or people hanging onto their GPUs for 7+ years.
Originally posted by Andrius227:
VRAM doesn't matter much. I upgraded from 24 GB (3090) to 16 GB (4080) a couple of years ago and it made no difference.

Planning to upgrade to a 5080, which also has 16 GB.
Why would you sidegrade to a 5080 when the performance is almost the same, unless you can sell the 4080 for a decent price and only pay like 100 bucks for the 5080? Otherwise it makes no sense besides 4x FG, and you will still run out of VRAM in some games, forcing you to lower settings; then the FG won't even matter.
Originally posted by gwwak:
Planned obsolescence. Nvidia doesn't want another 1080 Ti situation or people hanging onto their GPUs for 7+ years.
This is pretty true. Like, I'm still on a 1070 and I'm just now going to upgrade it.
It's still OK for games, but it's showing its age hard in any AAA games released in 2024.