C1REX Jan 9 at 13:56
Why is Nvidia not giving us enough VRAM? - rant.
New GPUs were announced.

16GB for $1000+ just to match PS5 VRAM.
12GB for $550+ just to match the upcoming Switch 2.

If the 5060 has 8GB, it will match the PS4.

I understand and agree that many devs have been doing a very bad job recently, but this tiny amount of memory for ridiculous money only makes it worse. Sure, games can be better optimized for VRAM, but that often comes at the cost of worse-looking textures, stutters, or increased CPU demand for data streaming and decompression.

Nvidia tries to condition us into believing that VRAM is made of gold and ultra expensive, same as what Apple does. Even Nintendo, which is super stingy, understands that VRAM gives you more than computing power does for less money. It would be cheaper for us if games were optimized for more memory and weaker GPUs. Nintendo pairs 12GB with a GPU similar to the RTX 2050.

BTW: Something odd is happening on the forum. My previous topic was closed for being too old even though it was made just before Christmas. Other topics commenting on Nvidia were also closed.
Last edited by C1REX; Jan 9 at 13:58
Heretic Jan 10 at 12:52
I'm using 12GB of VRAM, and I wouldn't recommend anything less today. I'm just glad I got the 12GB 3080 and not the 10GB version. The extra 2GB holds up in 4K + VR, but I'm nearing the limit. The 5060, which is two generations ahead, should have 12GB, the 5070 16GB, and the 5080+ 20/24GB.
Monk Jan 10 at 12:52
I'm saying at 4k.

Also, they're not the same memory chips, nor the same speed or performance.

That you cannot even read a spec sheet correctly really says you shouldn't be commenting.
r.linder Jan 10 at 12:57
VRAM capacity is not the issue, it's usage, and they're working on reducing it.

https://www.youtube.com/watch?v=dx4En-2PzOU

Considering that most people here don't even like using Frame Generation or RT, 12GB is enough for 1440p. I also personally went from 10GB to 16GB and didn't notice a difference at 1440p.
Last edited by r.linder; Jan 10 at 13:01
C1REX Jan 10 at 12:57
Originally posted by Monk:
I'm saying at 4k.
Only 2 games need more than 8GB at 4K in your opinion? Really?
Monk Jan 10 at 13:01
Originally posted by C1REX:
Originally posted by Monk:
I'm saying at 4k.
Only 2 games need more than 8GB at 4K in your opinion? Really?

Without optional extreme high-res texture packs, pretty much. I can only think of a couple of times when I noticed more being needed.

Not saying the games will run well at 4K, but that is far more down to the GPU performance not being up to the job than a VRAM limit.

As I said, I tend to try stuff on both my 8GB 3050 and my 4090.

Edit.

Check my games list or tell me a game you'd like me to test; if I have it, I'll install it on the HTPC and see whether the limit is the GPU or the VRAM.

Again, with standard texture packs and no mods, very few games I've seen NEED more. They may allocate more, but that doesn't mean it is needed.
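
If anyone wants to watch this on their own machine while testing, a rough way is to just log what the driver reports once a second. A quick Python sketch using the pynvml package (I'm assuming you have it installed; looping nvidia-smi does the same job). Keep in mind the number it reports is what's allocated on the card, which isn't necessarily what the game actually needs:

# Log reported VRAM usage once per second while a game is running.
# Note: this is memory *allocated* on the card, not proof it is all needed.
import time
from pynvml import nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    nvmlShutdown()

If usage sits right at the card's limit and frame times start spiking, that's a better sign of a real VRAM wall than the raw number on its own.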
Last edited by Monk; Jan 10 at 13:04
_I_ Jan 10 at 13:04
Originally posted by Monk:
I'm saying at 4k.

Also, it's not the same memory chips, speed or performance.

That you cannot even read a spec sheet correctly really says you shouldn't be commenting.
^ this
NVIDIA RTX 50xx GPUs use GDDR7, which is not the same price or performance as desktop DDR5.

Dedicated GPU RAM has always been faster than the desktop RAM of its time.

It's why consoles use GDDR-class memory for their CPUs as well.
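
Rough numbers just to show the gap (ballpark assumptions on my part: 28 Gbps GDDR7 on a 192-bit bus versus dual-channel DDR5-6000):

# back-of-envelope peak bandwidth = (bus width in bits / 8) * data rate per pin
gddr7_gbs = 192 / 8 * 28   # 192-bit bus at 28 Gbps per pin      -> 672 GB/s
ddr5_gbs = 128 / 8 * 6     # dual channel (128-bit) at 6000 MT/s ->  96 GB/s
print(gddr7_gbs, ddr5_gbs)  # roughly 672 vs 96 GB/s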
C1REX Jan 10 at 13:25
Originally posted by r.linder:
Considering that most people here don't even like using Frame Generation or RT, 12GB is enough for 1440p.

I agree with you. Currently 12GB feels just enough for almost any game at 1440p without RT and FG.
However, I have two questions:

1. Is 12GB enough now because there is no need for more, or because studios have no choice? Can we even ask studios to release optional, non-butchered, not stupidly compressed high-resolution textures?
Or will such full-quality textures only make people upset, like the completely optional quality textures for Space Marine 2?

2. Do you think the VRAM demand won’t change in the next 2, 4 or 6 years?
Last edited by C1REX; Jan 10 at 13:32
C1REX Jan 10 at 13:31
Originally posted by Monk:
As I said, I tend to try stuff on both my 8GB 3050 and my 4090.

Edit.

Check my games list or tell me a game you'd like me to test; if I have it, I'll install it on the HTPC and see whether the limit is the GPU or the VRAM.

Dragon's Dogma 2 is one of many games that use more than 8GB at native 4K.
However, Capcom games simply don't load high-quality textures if there is not enough VRAM, so it won't cause stutters or other performance problems on 8GB cards. The game will just look worse without telling the player.
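
I obviously don't know Capcom's actual code, but the general idea of quietly fitting textures into whatever VRAM is left looks roughly like this (purely illustrative sketch with made-up numbers):

# Illustrative only: pick lower mip levels until the texture set fits the VRAM budget.
# Each mip drop roughly quarters a texture's memory footprint.
def fit_textures(sizes_mb, budget_mb, max_drops=3):
    drops = [0] * len(sizes_mb)                       # 0 = full-quality texture
    current = lambda i: sizes_mb[i] / 4 ** drops[i]
    while sum(current(i) for i in range(len(sizes_mb))) > budget_mb:
        candidates = [i for i in range(len(sizes_mb)) if drops[i] < max_drops]
        if not candidates:
            break                                     # nothing left to shrink
        drops[max(candidates, key=current)] += 1      # degrade the biggest texture first
    return drops

print(fit_textures([512, 256, 256, 128], budget_mb=2000))  # plenty of VRAM: [0, 0, 0, 0]
print(fit_textures([512, 256, 256, 128], budget_mb=600))   # tight budget: quality silently drops

The point is that nothing crashes and nothing stutters; the player just gets worse textures and is never told.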

Here is one of many benchmarks showing it uses over 10GB at native 4K.

https://youtu.be/TWzqqMVqjKo?si=pHk7mtPIdfCrGfwG
r.linder Jan 10 at 13:33
Originally posted by C1REX:
Originally posted by r.linder:
Considering that most people here don't even like using Frame Generation or RT, 12GB is enough for 1440p.

I agree with you. Currently 12GB feels just enough for almost any game.
However, I have two questions:

1. Is 12GB enough now because there is no need for more, or because studios have no choice? Can we even ask studios to release optional, non-butchered, not stupidly compressed high-resolution textures?
Or will such full-quality textures only make people upset, like the completely optional quality textures for Space Marine 2?

2. Do you think the VRAM demand won’t change in the next 2, 4 or 6 years?
1. Game developers and studios can't just keep raising memory requirements, because you eventually alienate people with older video cards who can't afford to upgrade or don't want to for their own reasons. Their GPU might be fine in terms of performance but lacking in VRAM, like the RTX 3070. Developers have to take everyone into consideration, not just the people who buy new video cards every few years.

2. VRAM does go up at least somewhat every decade; look at the GTX 900 series versus the GTX 1000 series, where VRAM basically doubled, and the gap is even larger this decade. However, the lower-end tiers won't need 16GB for quite some time, and even the mid-range is fine with 12GB for the time being.

There is zero argument to be made for futureproofing, because it's an absolute myth outside of a vacuum where there is no competition. NVIDIA is competing with two brands that offer 12GB at a much lower price, yet they still continue to do what they do because they know what they're doing, and they're working to improve the situation in their own way; they have better uses for their resources than wasting money on capacity that people angrily want but don't actually need.
You can have all of this excessive VRAM, but if the core can't keep up, it's just a waste. By the time the 4090's 24GB actually matters for games, when they actually need that much VRAM, the 4090 is going to be considered slow anyway.

Really, if you want to complain about VRAM all the time, just buy one of AMD's cards and stop complaining. You're not helping yourself by constantly ranting about NVIDIA whenever they have a product release.
Last edited by r.linder; Jan 10 at 13:34
C1REX Jan 10 at 13:57
Originally posted by r.linder:

1. Game developers and studios can't just keep raising memory requirements, because you eventually alienate people with older video cards who can't afford to upgrade or don't want to for their own reasons. Their GPU might be fine in terms of performance but lacking in VRAM, like the RTX 3070. Developers have to take everyone into consideration, not just the people who buy new video cards every few years.

Really, if you want to complain about VRAM all the time, just buy one of AMD's cards and stop complaining. You're not helping yourself by constantly ranting about NVIDIA whenever they have a product release.


I'm not even talking about raising minimum requirements. Space Marine 2's optional textures look stunning, but they've got mixed reviews. A free, optional update made people upset.

https://store.steampowered.com/app/3202690/Warhammer_40000_Space_Marine_2__4K_Texture_Pack/

So devs can't even give us their original, uncompressed textures at the moment, because it would likely cause bad PR. The 32GB on the 5090 will be useless for gaming, not because devs can't fill it up and make games better looking for close to no effort and no performance cost, but because they will be forced to ship games with crappy textures that fit in 8GB. Likely at a performance cost too, due to data streaming and heavy CPU load.

Buying AMD won't change anything. Close to 10% market share is not enough to be relevant. Devs will aim at the 4060 or 5060, or the 3060 12GB if we are lucky.
Last edited by C1REX; Jan 10 at 14:02
r.linder Jan 10 at 14:04
Originally posted by C1REX:
Originally posted by r.linder:

1. Game developers and studios can't just keep raising memory requirements, because you eventually alienate people with older video cards who can't afford to upgrade or don't want to for their own reasons. Their GPU might be fine in terms of performance but lacking in VRAM, like the RTX 3070. Developers have to take everyone into consideration, not just the people who buy new video cards every few years.

Really, if you want to complain about VRAM all the time, just buy one of AMD's cards and stop complaining. You're not helping yourself by constantly ranting about NVIDIA whenever they have a product release.


I'm not even talking about raising minimum requirements. Space Marine 2's optional textures look stunning, but they've got mixed reviews. A free, optional update made people upset.

https://store.steampowered.com/app/3202690/Warhammer_40000_Space_Marine_2__4K_Texture_Pack/

So devs can't even give us their original, uncompressed textures at the moment, because it would likely cause bad PR. The 32GB on the 5090 will be useless for gaming, not because devs can't fill it up and make games better looking for close to no effort and no performance cost, but because they will be forced to ship games with crappy textures that fit in 8GB. Likely at a performance cost too, due to data streaming and heavy CPU load.

Buying AMD won't change anything. Close to 10% market share is not enough to be relevant. Devs will aim at the 4060 or 5060, or the 3080 12GB if we are lucky.
So you're basically saying that developers shouldn't give a crap about people with 8GB GPUs, which didn't necessarily release this generation, because you want higher-resolution textures right from launch.

I hope you realise that in some cases developers did the opposite and had to downgrade graphics; Cyberpunk 2077 was much more demanding at launch than it was by the time they released Phantom Liberty. They had to downgrade the graphics because most people couldn't even run the game, as they didn't have a fast enough GPU. Even now it's still difficult to run on low-end hardware.

Fallout 4 also had a 4K texture DLC, and it was absolute garbage; it wasn't optimised, so it just killed performance. Modders did a much better job making 4K textures than Bethesda did, which just adds to the argument that the problem is developers not doing their job well enough, not semiconductor manufacturers failing to double VRAM every few generations when most people aren't in the market to begin with. That would only aid forced obsolescence, and if NVIDIA is as bad as people claim, why aren't they doing that? It's not their goal.

Most users on Steam are still at around the GTX 16 series to RTX 3060 level. Most people can't afford expensive hardware to run the latest games at max settings, so why should the requirements be increased when it will only reduce the number of people who actually buy the game? How is that going to help the gaming industry if nobody buys games they can't run?
Last edited by r.linder; Jan 10 at 14:10
C1REX Jan 10 at 14:19
Originally posted by r.linder:
So you're basically saying that developers shouldn't give a crap about people with 8GB GPUs
Of course I don't want that. Games should scale as low as possible so more people can play them.
But I want more options.
I want optional, uncompressed textures.
And an optional, bigger VRAM buffer to reduce stuttering.

Again, just an option. Some games do scale very well: they can run on an old laptop or a Steam Deck but can also utilise stronger machines to look breathtaking.
r.linder Jan 10 at 14:23
Originally posted by C1REX:
Originally posted by r.linder:
So you're basically saying that developers shouldn't give a crap about people with 8GB GPUs
Of course I don't want that. Games should scale as low as possible so more people can play them.
But I want more options.
I want optional, uncompressed textures.
And an optional, bigger VRAM buffer to reduce stuttering.

Again, just an option. Some games do scale very well: they can run on an old laptop or a Steam Deck but can also utilise stronger machines to look breathtaking.
That's called optimisation. More VRAM isn't going to solve your problems; it's only going to create more problems and turn gaming into more of a luxury, making it harder for people who don't have a lot of money to partake and enjoy it.

NVIDIA is trying to get VRAM usage down on their side so they don't have to increase VRAM to compensate for garbage-quality development work, and it would keep their GPUs viable for longer, because these changes to DLSS, aside from multi-frame generation, affect ALL GPUs that support DLSS. So even older 8GB models like the 3070, 2080, etc. can benefit from them.

As DLSS and FSR support grows, people with older GPUs will be able to keep their hardware for longer, and that benefits the gaming industry far more than what you're proposing.
Last edited by r.linder; Jan 10 at 14:28
C1REX Jan 10 at 14:32
Originally posted by r.linder:
That's called optimisation. More VRAM isn't going to solve your problems; it's only going to create more problems and turn gaming into more of a luxury, making it harder for people who don't have a lot of money to partake and enjoy it.
So do you consider optional 4K textures released after launch a bad move that only alienates people? Same for optional settings that only high-end hardware can utilise (path tracing)?
r.linder Jan 10 at 14:38
Originally posted by C1REX:
Originally posted by r.linder:
That's called optimisation. More VRAM isn't going to solve your problems; it's only going to create more problems and turn gaming into more of a luxury, making it harder for people who don't have a lot of money to partake and enjoy it.
So do you consider optional 4K textures released after launch a bad move that only alienates people? Same for optional settings that only high-end hardware can utilise (path tracing)?
You're not being alienated just because you have to download a DLC that gives you higher-resolution textures, so I don't know where you're going with this.

They can push visuals as much as they want, but they need to optimise them properly so more people can enjoy them; otherwise they aren't going to sell as many units, which defeats the whole point of making and selling a game. But if they keep throwing more memory into the mix, it's inevitably going to push minimum requirements higher at a potentially faster rate than anticipated, which will leave society's poorest demographics in the dust for those games.

AI/machine learning can easily compensate for video memory; that's part of what DLSS 4.0 is doing, and if it works well for the 5070, then it'll work for the 4070, 3080, etc.
Last edited by r.linder; Jan 10 at 14:40
