This topic has been locked
C1REX Nov 22, 2024 @ 11:44am
Nvidia VRAM limitation is damaging to PC gaming - rant
Rant post as Nvidia is really making me salty. They’ve done some weird voodoo magic to people who defend their ridiculously low amount of memory. 8GB is ridiculous. 12GB for mid-tier is absurd. 16GB for high-tier is insulting. And they allegedly plan to do the same for their next gen.

More VRAM is one of the biggest requests from devs to make games run and look better. It can help with stutters as well. Some argue that extra VRAM can make a much bigger difference to games than extra raw performance, while costing less.

The recent Final Fantasy 16 uses almost 11GB at 1080p. Warhammer 40K: Space Marine 2 recently got an update: a 90GB download with stunning 4K textures.

The game was beautiful at launch using 8GB of VRAM; now it's jaw-droppingly impressive, but needs over 20GB of VRAM. Luckily, there is no performance penalty.

https://youtu.be/S84bZaaSHTA?si=QhWsEgvvWD3NObVI

It will be depressing if Nvidia releases new GPUs with less memory than consoles. People will keep blaming devs for Nvidia's criminally low memory. :(
Last edited by C1REX; Nov 22, 2024 @ 11:49am
Showing 1-15 of 51 comments
_I_ Nov 22, 2024 @ 11:55am 
gpus with about half the system ram has been a normal build for a long time

it is kinda the devs' fault for not optimizing games for pc
not using system ram/vram as effectively as they could
but instead making the game run well on console and assuming an over-specced pc can do the same tricks
C1REX Nov 22, 2024 @ 12:16pm 
Originally posted by _I_:
gpus with about half the system ram has been a normal build for a long time

it is kinda the devs' fault for not optimizing games for pc
not using system ram/vram as effectively as they could
but instead making the game run well on console and assuming an over-specced pc can do the same tricks
System RAM on PC is mostly used to feed VRAM.
The PC moves data into system RAM first, decompresses it, and holds it there until it needs to be moved to VRAM.

System RAM is too slow for graphics, which is very noticeable when you run out of VRAM or on computers with shared memory.
When a PC has shared memory it still dedicates some of it to the GPU, like a sort of partition, and has to copy the same data from the CPU portion to the GPU-dedicated part of memory. It's extremely inefficient.

Consoles don't need to stage data through separate system RAM: it can be copied straight from the fast NVMe drive thanks to DirectStorage, unified memory and a dedicated hardware decompression block. So consoles can, in a way, use the NVMe as RAM.
PC has DirectStorage, but without a hardware decompression chip and without unified memory.
Unified means the CPU and GPU can access the same data in the same memory without dividing it into CPU and GPU parts. Very different from shared memory. Not to mention the speed difference between DDR5 and GDDR6.
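A toy sketch of the copy chain described above (purely illustrative, function names and numbers are made up): on a discrete-GPU PC an asset travels NVMe → system RAM → decompression rewrite → VRAM over PCIe, while a unified-memory console decompresses straight into the one shared pool the GPU already reads.

```python
# Toy model, illustrative only: total megabytes of bulk data movement
# before the GPU can sample one decompressed asset.

def pc_discrete_gpu_mb_moved(asset_mb: int) -> int:
    # Three bulk moves: NVMe -> system RAM (compressed), the CPU
    # decompression rewrite in system RAM, then system RAM -> VRAM
    # over PCIe.
    return asset_mb * 3

def console_unified_mb_moved(asset_mb: int) -> int:
    # One bulk move: NVMe -> unified pool via the hardware
    # decompression block; the GPU reads the same pool directly,
    # so there is no second copy.
    return asset_mb * 1

print(pc_discrete_gpu_mb_moved(512))   # 1536 MB moved
print(console_unified_mb_moved(512))   # 512 MB moved
```

The exact multipliers are hand-waved; the point is only the relative number of bulk copies each architecture performs.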



Anyway…
A 3080 10GB having less memory than a 3060 12GB is absurd.
A 4070 Ti having no more memory than the lower-tier, generation-older 3060 is unacceptable.
VRAM is not made of gold and is not that expensive.
Last edited by C1REX; Nov 22, 2024 @ 12:16pm
Electric Cupcake Nov 22, 2024 @ 12:46pm 
Nvidia is obsolete anyway.
Mad Scientist Nov 22, 2024 @ 12:55pm 
Originally posted by C1REX:
Rant post as Nvidia is really making me salty. They’ve done some weird voodoo magic to people who defend their ridiculously low amount of memory. 8GB is ridiculous. 12GB for mid-tier is absurd. 16GB for high-tier is insulting. And they allegedly plan to do the same for their next gen.
Depends on affordability, expected performance, etc.

Personally I only like GPUs with two digits for the amount of VRAM. I can easily use 10+GB of VRAM, so I'd prefer 16 or more, but GPUs don't need more-more-more the way system RAM does. It would be nice, though, if units above the 50 in a series, xx60 and higher, came with 12GB or more.

Cards with 8GB and below are usually mid-tier gaming or less; 1/2/4GB are usually for basic use, maybe light games.

So it's about the job/performance, price, etc. Though I'd prefer no less than 16GB for xx70 and higher lineups.
Midnight Aurais Nov 22, 2024 @ 2:09pm 
yeah, i was planning to go a bit high end with nvidia, then i saw the 12 gb, and if you follow the rabbit hole to get 16 it's hundreds on top of your initial price bracket. it is total bs

worse, a lot of them are 4k capable but get actively smothered, likely to protect the likes of the 4080

i just went with amd. experience is good, just a few games that would have worked better on nvidia if i wanted raytracing, but the 6950 xt does its job

now i'm building an 8700g system for a low-powered second pc. already saw someone play stalker 2 on one at 30-40 fps with fsr balanced, while the comparable gpu, the 1650, bites the dust, heavily vram limited, running under 10 fps at fsr performance

the 780m in the 8700g can allocate up to 16 gb for vram. it's a 32 gb build, so you can guess i'll fully utilize the 16 XD

the worst news though is that nvidia is artificially inflating the gpu market again with a so-called gpu shortage for the launch of the 50 series, and there will be a price hike again
Last edited by Midnight Aurais; Nov 22, 2024 @ 2:10pm
_I_ Nov 22, 2024 @ 4:56pm 
Originally posted by Electric Cupcake:
Nvidia is obsolete anyway.
not with amd no longer making high end gpus
r.linder Nov 22, 2024 @ 9:02pm 
AMD does the exact same thing... They hold back on VRAM too, but people pretend it's just an nVidia issue. They keep their low end at 8GB and their mid range is still 12GB.

8GB in the low end is fine; extra VRAM would just jack up the price even more, and the cards usually don't perform well enough to properly take advantage of the excess VRAM to begin with. The 7600-XT and 4060 Ti are both plagued with that.

The complaints are rooted in the fact that AMD offers 16GB in the same price range as the 4070, but that's because they don't have a choice. They're competing, and AMD doesn't have anything to lean on: they either deliver a good product or they crash. If nVidia had offered 16GB with the standard 4070, nobody would be complaining.
Last edited by r.linder; Nov 22, 2024 @ 9:09pm
The_Abortionator Nov 22, 2024 @ 9:21pm 
Originally posted by _I_:
gpus with about half the system ram has been a normal build for a long time

it is kinda the devs' fault for not optimizing games for pc
not using system ram/vram as effectively as they could
but instead making the game run well on console and assuming an over-specced pc can do the same tricks

Trying to equate VRAM to system RAM. There's no fixed ratio whatsoever; that's not how this works.
The_Abortionator Nov 22, 2024 @ 9:23pm 
Originally posted by _I_:
Originally posted by Electric Cupcake:
Nvidia is obsolete anyway.
not with amd no longer making high end gpus

Halo products never break 1-2% of the population. Not sure why people have brain worms and pretend everyone is out buying 4090 Tis.

Releasing *80-tier and below cards is more than enough, especially considering people mostly buy in the *60 tier.
The_Abortionator Nov 22, 2024 @ 9:38pm 
Originally posted by r.linder:
AMD does the exact same thing... They hold back on VRAM too, but people pretend it's just an nVidia issue. They keep their low end at 8GB and their mid range is still 12GB.

8GB in the low end is fine; extra VRAM would just jack up the price even more, and the cards usually don't perform well enough to properly take advantage of the excess VRAM to begin with. The 7600-XT and 4060 Ti are both plagued with that.

The complaints are rooted in the fact that AMD offers 16GB in the same price range as the 4070, but that's because they don't have a choice. They're competing, and AMD doesn't have anything to lean on: they either deliver a good product or they crash. If nVidia had offered 16GB with the standard 4070, nobody would be complaining.

Wow, just. WOW.

Yeah, no duh people wouldn't be complaining about VRAM on the 4070 if it had an appropriate amount. That's literally the issue.

Second, you claim AMD is doing the same thing, but no, they aren't. You have to go all the way down to the 7600 to find 8GB of VRAM, and that's a card nobody should buy regardless of VRAM.

Everything above it has 16GB or more of VRAM. And no, 8GB at the price points Nvidia sells at is not fine. Comparing the 8GB and 16GB models already shows you lose 34% performance at 1080p when you only have 8GB of VRAM.

Trying to say these cards can't benefit from more VRAM, or that it'll be pricier, or whatever nonsense, is full-blown COPE and nothing more.

It's just as dumb as the clown show that is recommending the 4070 over the 7800 XT when the 7800 XT is flat-out the better card.

The 7800 XT released for $100 cheaper, has 50% more VRAM, beats the 4070 handily in raster, and goes toe to toe in RT.

But I'm sure you'll find some illogical reason why the better card is actually worse.
The_Abortionator Nov 22, 2024 @ 9:42pm 
When a company stops giving you what you want, you leave.

I spent 20 years buying from Nvidia, and ironically they chose to offer their least competitive cards around the same time AMD got their ♥♥♥♥ together, so I dipped.

Intel didn't invest in R&D, sat on their laurels, and AMD took their sandwich, so I dipped.

If AMD does a ♥♥♥♥♥ wucky I'll drop them too, but as of right now Nvidia offers less VRAM, less raster, and worse Linux support for more money. Why would I buy from them?
r.linder Nov 22, 2024 @ 9:51pm 
Originally posted by The_Abortionator:
Second, you claim AMD is doing the same thing, but no, they aren't. You have to go all the way down to the 7600 to find 8GB of VRAM, and that's a card nobody should buy regardless of VRAM.

Originally posted by The_Abortionator:

Releasing *80-tier and below cards is more than enough, especially considering people mostly buy in the *60 tier.
> Nobody should be buying the 7600 regardless of VRAM

>> but people mostly buy x60 tier

Watching your mind at work is entertaining. Most people can only afford the lower-end tier, and somehow they're wrong for that.

And yes, AMD does do the same thing: the 7700-XT is a x70-tier card with 12GB, yet the 7600-XT has 16GB... because they mirrored nVidia to cut costs on the 7600 and 7700-XT and compete with the 4060 Ti 16GB.

Gotta love how people get mad at nVidia for things but then completely overlook that AMD does the same things.

Originally posted by The_Abortionator:
Trying to say these cards can't benefit from more VRAM, or that it'll be pricier, or whatever nonsense, is full-blown COPE and nothing more.
Wrong, GamersNexus' testing concluded that the 7600-XT and 4060 Ti 16GB were unable to deliver great performance even when pushing their VRAM capacity as hard as possible, because the core is too weak for that buffer.
MOAR VRAM!! does not mean MOAR BETTER!! when the core is too weak to keep up in the first place.
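Both sides of this argument fit one toy formula (a sketch with made-up numbers, not benchmark data): frame time is roughly compute time plus a penalty for whatever part of the working set spills out of VRAM into system RAM. Extra VRAM only helps a card that was actually spilling; a compute-bound card gains nothing from a bigger buffer.

```python
def frame_time_ms(compute_ms: float, working_set_gb: float,
                  vram_gb: float, spill_penalty_ms_per_gb: float = 8.0) -> float:
    # Toy model: any part of the working set that doesn't fit in VRAM
    # spills to system RAM and adds a flat per-GB penalty to the frame.
    spill_gb = max(0.0, working_set_gb - vram_gb)
    return compute_ms + spill_gb * spill_penalty_ms_per_gb

# Weak core, light scene: 8GB vs 16GB makes no difference (nothing spills).
print(frame_time_ms(25.0, 6.0, 8.0))    # 25.0
print(frame_time_ms(25.0, 6.0, 16.0))   # 25.0

# Same core, heavy textures: the 8GB card spills 2GB and tanks.
print(frame_time_ms(25.0, 10.0, 8.0))   # 41.0
print(frame_time_ms(25.0, 10.0, 16.0))  # 25.0
```

The penalty constant is invented; real spill behaviour depends on PCIe bandwidth and access patterns, but the shape of the result matches both claims in the thread.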
The_Abortionator Nov 23, 2024 @ 1:32am 
Originally posted by r.linder:
Originally posted by The_Abortionator:
Second, you claim AMD is doing the same thing, but no, they aren't. You have to go all the way down to the 7600 to find 8GB of VRAM, and that's a card nobody should buy regardless of VRAM.

Originally posted by The_Abortionator:

Releasing *80-tier and below cards is more than enough, especially considering people mostly buy in the *60 tier.
> Nobody should be buying the 7600 regardless of VRAM

>> but people mostly buy x60 tier

Watching your mind at work is entertaining. Most people can only afford the lower-end tier, and somehow they're wrong for that.

And yes, AMD does do the same thing: the 7700-XT is a x70-tier card with 12GB, yet the 7600-XT has 16GB... because they mirrored nVidia to cut costs on the 7600 and 7700-XT and compete with the 4060 Ti 16GB.

Gotta love how people get mad at nVidia for things but then completely overlook that AMD does the same things.

Originally posted by The_Abortionator:
Trying to say these cards can't benefit from more VRAM, or that it'll be pricier, or whatever nonsense, is full-blown COPE and nothing more.
Wrong, GamersNexus' testing concluded that the 7600-XT and 4060 Ti 16GB were unable to deliver great performance even when pushing their VRAM capacity as hard as possible, because the core is too weak for that buffer.
MOAR VRAM!! does not mean MOAR BETTER!! when the core is too weak to keep up in the first place.


So, math isn't your strong suit.

*60 != *600, but thanks for the clown show.

Second, the 7600 non-XT exists as a 4050-tier card, NOT a 4060-tier card.

I also love how your "AmD dOeS tHe SaMe ThInG!!!" is AMD starting their lower mid range with 16GB and 12GB while Nvidia's starts at 8GB.

Sorry, not the same thing.

Also, you kids have got to STOP with the bizarre belief that a card must have some magical speed property to not be held back by VRAM limits.

VRAM is a resource pool: if you don't have enough, you hit system RAM and your performance tanks.

https://www.youtube.com/watch?v=nrpzzMcaE5k

And this isn't even the only benchmark of this issue.

Thanks for fanboying SO HARD and not being bothered to even google.

For a lot of games, the VRAM difference is literally 4K vs 1080p.

Games like Ratchet & Clank lose 34% performance with 8GB cards even at 1080p.

It's literally a case of buy from AMD, or spend more to do less with Nvidia.
A&A Nov 23, 2024 @ 1:49am 
Originally posted by The_Abortionator:
Second, the 7600 non XT exists as a 4050 tier card NOT a 4060 tier card.
The RX 7600 and RX 7600 XT have the same stream processor count, you know?
Still, the additional 8GB of VRAM makes little difference: 10% max.
Last edited by A&A; Nov 23, 2024 @ 1:54am
C1REX Nov 23, 2024 @ 1:56am 
Originally posted by r.linder:
And yes AMD does do the same thing, the 7700-XT is a x70 tier card with 12GB yet the 7600-XT is 16GB... because they mirrored nVidia to cut costs on the 7600 and 7700-XT and compete with the 4060 Ti 16GB.
Let's be serious. By this logic the 7900 GRE, 7900 XT and 7900 XTX are all x90-series cards. It's just a name that means nothing, and AMD is a grand master at making up bad names. It's all about price.

The 3060 has 12GB, and it's a good amount. The 3070 has 8 and can't even run some games at 1080p, or runs them with missing textures.

Prices according to pc part picker US:

3060 12GB - $265
3070 8GB - $430
4060 8GB - $284
4060Ti 8GB - $360
4060Ti 16GB - $430
4070 12GB - $490
4070 S 12GB - $590
4070Ti 12GB - $690
4070Ti S 12GB - $740
4080 (super) 16GB - $950

6600 8GB- $189
7600 8GB - $230
7600XT 16GB - $280
6700XT 12GB - $270 - (nice)
6750XT 12GB $290 - (nice)
7700XT 12GB - $370
7800XT 16GB - $440
7900GRE 16GB - $530
7900XT 20GB - $620
7900XTX 24GB - $830
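Taking the PCPartPicker prices above at face value, a quick dollars-per-gigabyte-of-VRAM comparison makes the gap concrete (a snapshot; prices will have drifted since):

```python
# Price per GB of VRAM from the listed US prices (illustrative snapshot).
cards = {
    "RTX 3060 12GB":      (265, 12),
    "RTX 4060 Ti 8GB":    (360, 8),
    "RTX 4070 12GB":      (490, 12),
    "RTX 4080 Super 16GB": (950, 16),
    "RX 7600 XT 16GB":    (280, 16),
    "RX 7800 XT 16GB":    (440, 16),
    "RX 7900 XTX 24GB":   (830, 24),
}

# Sort from cheapest to most expensive per GB of VRAM.
for name, (usd, gb) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${usd / gb:.2f} per GB")
```

Dollars per GB is a crude metric (it ignores core performance entirely), but it shows why the VRAM complaints cluster around one vendor.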




Originally posted by r.linder:
Wrong, GamersNexus' testing easily concluded that the 7600-XT and 4060 Ti 16GB were unable to actually give great performance when running as much of their VRAM capacity as possible because they're too weak for that VRAM buffer.
MOAR VRAM!! does not mean MOAR BETTER!! when the core is too weak to keep up in the first place.
After the 8GB version it wasn't possible to offer 12GB, as there were no 3GB memory modules at the time, so the only option was to double up to 16GB.

Also, there are already games using 12GB at 1080p.
Many of them do the smart thing and hide the VRAM limitation by quietly not giving you good textures, ignoring your texture settings.

Here is a difference between 8 and 16GB.
https://www.youtube.com/watch?v=Rh7kFgHe21k

Why would anybody defend low VRAM when it's arguably the cheapest way to improve performance, visuals and stutters, and to lower development/optimisation time for devs? Games would feel so much better optimised if GPUs had more VRAM. One of the reasons games can stutter is that they keep decompressing, loading and unloading data within a small memory pool.
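The churn described above can be sketched as a cache problem (toy model, made-up texture counts): when the VRAM pool is smaller than the set of textures a scene keeps touching, every frame forces evictions and reloads; a slightly bigger pool and the churn disappears.

```python
from collections import OrderedDict

def count_reloads(accesses, pool_size):
    # Tiny LRU model of a VRAM texture pool: every miss stands in for
    # a decompress-and-upload, i.e. the work that causes stutter.
    pool, reloads = OrderedDict(), 0
    for tex in accesses:
        if tex in pool:
            pool.move_to_end(tex)      # recently used, keep it hot
        else:
            reloads += 1               # miss: decompress + upload
            if len(pool) >= pool_size:
                pool.popitem(last=False)  # evict least recently used
            pool[tex] = True
    return reloads

# A scene cycling through 10 textures over 100 frames:
frames = list(range(10)) * 100
print(count_reloads(frames, pool_size=8))   # 1000: every access misses
print(count_reloads(frames, pool_size=10))  # 10: each texture loads once
```

Cyclic access against an LRU pool that is just slightly too small is the worst case: everything misses every frame, which is exactly the constant decompress/load/unload loop the post blames for stutter.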
Last edited by C1REX; Nov 23, 2024 @ 1:59am

Date Posted: Nov 22, 2024 @ 11:44am
Posts: 51