xSOSxHawkens Apr 20, 2023 @ 10:45am
Time is not being kind to nVidia and its VRAM hoarding...
https://www.techspot.com/article/2661-vram-8gb-vs-16gb/


Originally posted by "At a 1080p Rez...":

As we've seen a number of times now, the issues for the RTX 3070 start when we enable ray tracing. You could claim that the ultra quality preset is too much, but you'd also have to ignore the fact that the Radeon 6800 is good for 73 fps, while the 3070 is unusable thanks to its 8GB VRAM buffer. Worse still, the 6800 is still able to deliver highly playable performance at 1440p.

Originally posted by "In Hogwarts Legacy":
With the RTX 3070 we could notice the occasional jarring frame stutter, but the bigger issue is the image quality. The 3070 is constantly running out of VRAM and when it does the game no longer runs into crippling stuttering that lasts multiple seconds, instead all the textures magically disappear and then reappear at random. What you're left with is a flat horrible looking image and this occurs every few seconds. Even when standing completely still, the RTX 3070 keeps cycling textures in and out of VRAM as it simply can't store everything required for a given scene.

But what about DLSS?

If we enable upscaling (DLSS and FSR quality modes) for the RTX 3070 and the Radeon 6800, frame rate performance certainly improves, but unfortunately the RTX 3070 just doesn't have enough VRAM, even when upscaling from a 720p image, and we see the constant popping in and out of textures. Performance is also much worse on the 3070 when compared to the 6800, we're still seeing stuttering; it's just a mess.

The next step is to dial down the quality preset to high, with high ray tracing; after all, we don't typically recommend the ultra settings in most games. But even here the RTX 3070 suffers from regular frame stutter, and in some instances severe frame stuttering.

Dropping down to the medium preset with ray tracing still set to high is a much improved experience with the RTX 3070, and texture quality is no longer an issue, though there is still the occasional frame stutter as VRAM usage is right on the edge and occasionally spills over.

__________________________________________________________________________

Personal take:

I have said for a long time that, relative performance being close, two GPUs with differing VRAM capacities will have their usable life determined by that capacity, and thus more VRAM is worth it within reason.

Since the RTX 3k/RX 6k release I have been critical of Nvidia and its choice to be stingy, hoard VRAM, and not put it on their cards. I regularly posited that the 16GB cards from AMD, starting at the 6800 non-XT, would offer better long-term value than the 8GB and 10GB cards from the NV lineup. I thought back then, and still think now (and rightfully so, it would seem), that spending money on something like a 3070 was an absolute waste. Seems it was indeed.

When the 6500 XT came out I pointed out that the card itself was OK if you stayed under its VRAM limit, but I was criticized on the grounds that no one, even on an *entry* level card, should have to manage VRAM usage like that and optimize settings just to get playable frames (again, even though it was entry level). I also pointed out that it would not be long before the same would have to be done on anything with less VRAM than its core power can make use of (looking at you, modern 8/10GB cards from NV).

Now upper-mid-range, $500-class GPUs are having to compromise hard on settings, or simply live with literal texture pop-in, due to a lack of VRAM.

2023 is here. And the 8GB 3070 can't even keep up with a 6800 non-XT in multiple ray-traced titles anymore due to VRAM limits. It can't run quality 1440p at reasonably high refresh anymore, and at 1080p it's so bad at times that the 6800's* 1% lows match the 3070's average.

And the smoking gun in all of this? The reason the 3070 is (by comparison) now* trash?

VRAM and nvidia being unwilling to put it on the card without pushing costs up.

To be clear, both companies have cards with 8GB of VRAM, but one has its 8GB cards positioned as high end or upper mid range, both in price and in the product stack, while the other lists those cards as mid range or lower and prices them accordingly.

At the end of the day, the only people who really lose out here are NV owners of 8GB (and soon enough 10GB) cards... while the only actual winner is NV.

Take a bow Jensen, you earned it.

* : Edits
Last edited by xSOSxHawkens; Apr 20, 2023 @ 11:04am
Showing comments 76-90 of 98
C1REX Apr 22, 2023 @ 2:51pm
Originally posted by emoticorpse:
Well, you are right in that at certain price ranges AMD does offer literally double the VRAM. I'm still interested in how cheap it actually is. Because looking at the whole picture, I would still wonder myself if the overall dynamics are really fairer than people here are making it seem, since Nvidia does offer other benefits over AMD. So if Nvidia offered the same amount of VRAM as AMD, ABSOLUTELY NOBODY would even buy AMD. AMD might actually be offsetting the cost by adding more VRAM than they should, because it's the only way to lure people to buy their stuff.
This is what I'm thinking right now.
Only 10% of people buy AMD, and until last month, nobody cared about VRAM at all. There is no way to be 100% sure how much VRAM costs exactly, as it varies based on contracts between companies. However, based on various comments, it seems that it costs around $20 to solve VRAM problems. I'm not sure what this means.

For me, the biggest arguments are that the 3060 came with 12GB, while the 3070 had only 8GB and the 3080's 20GB version was cancelled. Also, there was only a tiny price difference between the 7900XTX and the XT, even though there was a substantial performance difference and an additional 4GB of VRAM.


Think about this: If VRAM were expensive, AMD would use less VRAM and a larger die instead, or lower the prices even more with less VRAM to sell their GPUs at half the price of NVIDIA's.
Last edited by C1REX; Apr 22, 2023 @ 2:55pm
emoticorpse Apr 22, 2023 @ 2:56pm
Originally posted by C1REX-PL:
...There is no way to be 100% sure how much vram costs exactly as it's per contracts between companies. However from various comments it seems it's like $20 to solve VRAM problem. ..

I'm not trying to be a jerk, but it sounded like you were well educated on the cost of VRAM, when really you don't know, and what you've read is just picked up here and there from other random people on here?

I mean if you have a link to someone credible, I suppose I wouldn't be typing this. If it really did cost $20 to add maybe like 8 GB of vram then yeah I suppose that's messed up (kind of).

But I would still say, in Nvidia's defense they're a business. If they CAN give less and make more, it's what they're here for to make money. Every business does it. They're LITERALLY doing AMD a favor, because they would put them out of business if they offered the same amount of vram.
Illusion of Progress Apr 22, 2023 @ 3:10pm
Originally posted by emoticorpse:
Or, in my case, just Minecraft and not even at 1440p...

When you say "just Minecraft", do you mean that vanilla Minecraft alone won't run well at 1440p/1080p with a midrange GPU? IIRC, you liked to run mods on it? If that's the case, is it the mods that demand the VRAM, or does Minecraft without mods still require a lot of VRAM?
Sorry, yes, it's more than the vanilla game. But I don't really use "mods" outside of OptiFine (if you're not too familiar with it, it's a performance/features mod more than an actual gameplay/content mod). I use shaders, though, and that's mostly what does it. And resource packs (one in particular I think might be adding to the VRAM use). So maybe poor choice of words.

The use of the word "just" was to denote that it's a game that's already causing me VRAM issues at 1920 x 1200 (and at quite tame render distances), as opposed to being a recent triple A title at maximum settings at 4K or whatever.
Originally posted by Darkie:
I can run any game so far on 1440p without issue and with 8gb.
Maybe so. I was just listing one that I play that didn't fall under your criteria of being a triple A title at maximum settings at 4K. Pretty far from it, in fact.

And there's no end to the supply of reviews and articles finding places where it can't be enough. So as someone "in the buying market", 8 GB isn't even a consideration for me. I'm even questioning the longevity of 12 GB, but that's only because I want the option to NOT have it be a potential concern 3 or so years from now.

My current card had a generous amount of VRAM for the time, and I believe that's what helped it last as long for me. So I'm sticking to the experience that worked for me and looking for something not borderline on VRAM.
emoticorpse Apr 22, 2023 @ 3:22pm
...Sorry, yes, it's more than the vanilla game. But I don't really use "mods" outside of OptiFine (if you're not too familiar with it, it's a performance/features mod more than an actual gameplay/content mod). I use shaders, though, and that's mostly what does it. And resource packs (one in particular I think might be adding to the VRAM use). So maybe poor choice of words.

The use of the word "just" was to denote that it's a game that's already causing me VRAM issues at 1920 x 1200 (and at quite tame render distances), as opposed to being a recent triple A title at maximum settings at 4K or whatever...

One of the main things I was getting at with that question is: do you think it's fair to say that if "vanilla" Minecraft works fine at 1080p/1440p, then the developers of Minecraft as well as Nvidia provided exactly what was needed for that game?
r.linder Apr 22, 2023 @ 3:48pm
There is absolutely no good reason for NVIDIA doing what it's doing; it's purely greed, as people would assume. They've been raising prices with every generation of RTX since the first one launched, and effectively gaslighting people who complained about it, saying that they don't understand all of the hidden costs associated with these cards...

Yet, as per the stock market as of yesterday, NVIDIA's estimated net worth is 669 billion dollars. They've earned almost 27 billion in revenue so far this year, and we're not even halfway through the year yet, with a net income (income after all taxes, deductions, and costs are taken out of revenue) of 4.3 billion. So it's safe to say that they really could have kept their pricing more competitive, or made better cards that would actually be closer to their MSRP in price/performance value. Instead, they decide to be as lazy as freaking possible, making the tiniest of changes with each generation, and then wonder why their sales aren't meeting expectations, as if they thought we'd be that ♥♥♥♥♥♥♥♥ stupid. But we're not; hardly anyone that knows what they're looking at would want the 4070, which is basically just a 3070 Ti and is 40%+ slower than the 4080 on numerous occasions for half the price, when you can already get better for the same price or less.

It's a damn good thing that they weren't allowed to acquire ARM Holdings from SoftBank, one can only imagine the damage they'd have caused to the industry if they had the same practices there as they do here.
Last edited by r.linder; Apr 22, 2023 @ 3:51pm
AdahnGorion Apr 22, 2023 @ 3:50pm
Originally posted by emoticorpse:
Originally posted by CJM:

The cost of VRAM is the #1 factor. If it is "inexpensive", then I can see the animosity being towards NVIDIA.

I was under the impression that VRAM is expensive. Especially during the global supply chain shortage during the global pandemic.

As the supply chain catches up, I have been expecting more VRAM to become available. However, I have heard murmurings that VRAM is fixing to go up in cost along with other components used in manufacturing GPUs.

Someone will have to define what "cheap" is. Because if it really was that "cheap" then AMD should be offering a lot more of it. Sure they offer more, but not THAT MUCH MORE. So I have a feeling people are throwing around "cheap" quickly, but in reality it's not as cheap as they're making it out to be.

One thing people don't factor in is the fact that Nvidia uses GDDR6X, unlike AMD's GDDR6.
The cost is a tad higher due to that, ofc. But overall the AMD cards are cheaper ofc (they always have been), though overall performance and stability are just much better on the Nvidia cards, along with better software (in my opinion).

We could talk about the heat issues for AMD cards atm as well.

Prices will go higher still, due to inflation and current world conflicts. That is how it is.
Most people don't need a high-end GPU, however; the vast majority will be fine with an entry-level or mid-tier GPU.

It would be silly to relaunch a 30-series card with more VRAM; if anything, it should be a relaunch later on for the 40 series (but tbh, 12/16GB of VRAM is more than enough for the mid/high tier, and entry level with 8GB is just fine for 1080p).
Illusion of Progress Apr 22, 2023 @ 4:23pm
Originally posted by emoticorpse:
One of the main things I was getting at with that question is: do you think it's fair to say that if "vanilla" Minecraft works fine at 1080p/1440p, then the developers of Minecraft as well as Nvidia provided exactly what was needed for that game?
For what purpose are you asking that question? I blame neither Mojang nor nVidia for the fact that my use with Minecraft needs more VRAM than I have, if that's why you're asking.

But nVidia doesn't provide hardware for games on a per game basis. They just sell hardware to consumers (and right now, one of those demands of many consumers is "you're providing too little VRAM "). And stock games aren't the only things driving users to want more resources. So I don't think it's fair to say other uses "don't count" if that's something you're trying to suggest?

Because if so, that's not how it works. If, say, a current configuration I set up of Minecraft needs X amount of CPU, X amount of RAM, X amount of GPU processing, and X amount of VRAM, well... that's what I need then if I want to play it that way, right? Whose fault it is, is neither here nor there. It doesn't matter. The only thing that matters is that if I want to play it that way, then that is what I need. So it becomes a justifiable need for whatever it is I am seeking. So I either get it, or I don't play it that way. Same as with everything. There are no alternatives.
emoticorpse Apr 22, 2023 @ 4:51pm
Originally posted by emoticorpse:
One of the main things I was getting at with that question is: do you think it's fair to say that if "vanilla" Minecraft works fine at 1080p/1440p, then the developers of Minecraft as well as Nvidia provided exactly what was needed for that game?
For what purpose are you asking that question? I blame neither Mojang nor nVidia for the fact that my use with Minecraft needs more VRAM than I have, if that's why you're asking.

But nVidia doesn't provide hardware for games on a per game basis. They just sell hardware to consumers (and right now, one of those demands of many consumers is "you're providing too little VRAM "). And stock games aren't the only things driving users to want more resources. So I don't think it's fair to say other uses "don't count" if that's something you're trying to suggest?

Because if so, that's not how it works. If, say, a current configuration I set up of Minecraft needs X amount of CPU, X amount of RAM, X amount of GPU processing, and X amount of VRAM, well... that's what I need then if I want to play it that way, right? Whose fault it is, is neither here nor there. It doesn't matter. The only thing that matters is that if I want to play it that way, then that is what I need. So it becomes a justifiable need for whatever it is I am seeking. So I either get it, or I don't play it that way. Same as with everything. There are no alternatives.

You are right, you didn't pick a side to blame. You were discussing the hardware aspect of this. I messed up. :steamfacepalm:
Illusion of Progress Apr 22, 2023 @ 6:57pm
No worries, haha. I get it. It's easy to get things mixed up online over the course of a discussion.

Yeah, to me, there's little point in blaming things when it pertains to something I'm not completely sure of (which is why I posed the question I did to the people who were placing blame).

I just take a more practical approach. Is something I'm doing showing that it needs more of a certain something? Well, guess I need more of that certain something, then. It doesn't matter why. If I apparently need it, then I just need it.

In this particular case, I can not and will not go without my bushy Minecraft leaves! So if I have to feed the game VRAM for that, well, I guess I will, haha. Also, I'm really hoping I can turn the render distance up, turn a few more settings on or up higher, stay at 60 FPS a little more often, and maybe even do 1440p in case it comes to that later. And yes, I'm even wondering if a 6800 XT would be enough for that. I wasn't exaggerating when I said I don't think even an RTX 4090 could max this. And I'm not even looking to max it, nor am I using the most demanding shaders (or very high resolution resource packs). This game can apparently get silly with what it needs at times.
Last edited by Illusion of Progress; Apr 22, 2023 @ 6:59pm
C1REX Apr 23, 2023 @ 12:52am
Have you guys heard about the launch option -NOTEXTURESTREAMING?
Apparently it makes some games load more data into VRAM to improve performance and reduce stutters.
I personally can't test it out with my 8GB GPU, as Returnal, for example, takes 20GB of VRAM with that option on.
5 mainstream GPUs with that amount of VRAM: 3090, 3090 Ti, 4090, 7900XT, 7900XTX

EDIT: This option is for UE4 games and makes titles like Returnal, Lost Ark, Fortnite, and Borderlands run better. That includes reducing texture popping.
Last edited by C1REX; Apr 23, 2023 @ 3:18am
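For anyone curious how that switch is actually applied: -NOTEXTURESTREAMING is an Unreal Engine command-line argument, so it goes into the game's launch options. A minimal sketch of what that typically looks like (the install path below is purely illustrative, and whether a particular title honors the switch varies from game to game):

In Steam: right-click the game in your Library > Properties > General > Launch Options, then add:
    -NOTEXTURESTREAMING

Or, when starting the executable directly (example path only):
    "C:\Games\Returnal\Returnal.exe" -NOTEXTURESTREAMING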
AdahnGorion Apr 23, 2023 @ 4:02am
Originally posted by C1REX-PL:
Have you guys heard about the launch option -NOTEXTURESTREAMING?
Apparently it makes some games load more data into VRAM to improve performance and reduce stutters.
I personally can't test it out with my 8GB GPU, as Returnal, for example, takes 20GB of VRAM with that option on.
5 mainstream GPUs with that amount of VRAM: 3090, 3090 Ti, 4090, 7900XT, 7900XTX

EDIT: This option is for UE4 games and makes titles like Returnal, Lost Ark, Fortnite, and Borderlands run better. That includes reducing texture popping.

Those are not mainstream GPUs; those are niche GPUs, all of them. Under 5% of all gamers use one of the cards you listed.


The mainstream GPU atm is the 3060 (on Steam alone, around 10.5% of all users have one; that means one in every ten users on Steam has a 3060 specifically). This used to be the 1060, but after some generations the average shifts to a new entry card.

The 3060 has 8GB of VRAM, something that 32% of Steam users have (it is the biggest share of any VRAM amount, and it is above the average VRAM as well). 22% have 6GB, and then we have the 12GB users at around 16%.

Interestingly enough, 8% still use 4GB of VRAM. Only 1% have 24GB of VRAM, and only 0.6% have other amounts (in between).

The 2060 and 1060 each have around 8% of users.

The 3060 Ti and the 3070 have alright representation, at around 5%.


Let's look at the cards you call mainstream...


The 4090 has a representation of 0.25% of users.
The 3090 is at 0.43%.
The AMD cards you mention are under 0.10% here, with the older ones being a tad higher (AMD is not as popular as some people think, let alone for gaming).

E.g. the AMD Radeon RX 580 is at 0.74%.




High-end and "coup de grace" tier GPUs are a tiny, tiny minority; they are not mainstream. Entry-level and average GPUs are.

What do I mean by entry level?

xx60 cards <-- that is entry level; average cards are like the xx60 Ti and xx70.
We also have many people with budget cards, like the xx50 (Ti) series. These are almost not even gaming cards in my opinion, but they will play many things anyway (if your teenager just plays Roblox and Minecraft, they are perfectly fine).


My point here is: talking about the high-tier GPUs like they are mainstream is misleading, and talking about more than 8GB of VRAM like it's a must-have is silly in the vast majority of cases.
As I said earlier, I have yet to see a game that demands more than 8GB of VRAM to function and run at high/mid graphical settings (that is, with a now entry-level tier card), and at 1440p!!!


If you wanna play upcoming graphically heavy AAA releases in 4K and with (in my opinion silly) ultra settings, then you surely will benefit from having a tad more VRAM; 12GB would most likely be enough for that, maybe 16GB (a lot of the high-tier Nvidia GPUs in the current gen have at least 12GB, and some have 16 or 24).


But for the average user, playing Dota, Counter-Strike, Roblox, Fortnite, Lost Ark, GTA or Rust, I reckon that even an ancient machine, built around an ancient GPU (tech-wise), would run them perfectly fine. A 1060 will run all of these games just fine, and a 3060 would handle all of them on higher settings at 1080p without issue.


My point here is that there is no issue for the vast majority of people; this is a niche issue. Even someone like me, who sometimes buys some of the AAA titles, has no issue at 1440p with an entry-level card. Granted, I understand the argument about getting something better; I am building a new machine as well. I understand I am part of the minority and that I will get a high-mid or high-tier GPU to build around. But that has more to do with me being able to lavish a higher investment than with actual needs.

In the vast majority of games, I would be perfectly fine with what I have now, even upcoming AA and AAA titles.




At the end here, let me agree on one thing, however. The 40 series is not a good generation; I reckon it is average, maybe even below, in terms of performance per euro, but maybe we underestimate all the software advances we have seen as well.


The 7th and 10th generations (to name some modern ones and not go all the way back to the 2nd generation) have been awesome!! My old mid-tier card from the 7th generation lasted a long, long time; sometimes bigger changes happen, sometimes not.

If we look at the 20 series, it was below average, but still fine enough to merit a buy (I had to upgrade around that point, so I took a chance and was happy with my card). But let's be honest, it was lower performance because you paid for new tech (RT) capability. And none of those 20-series cards can pull off acceptable 1440p/4K RT compared to raster.

RT is the future ofc (even if it's been used for ages in, e.g., animation); it is so much easier, and once it catches up with raster it will become used much more.
The thing about raster (rasterization) is that it's been in the game (pun intended) for so long that developers have come up with so many ways to "cheat" with, e.g., lighting, shadows, etc. at a low performance cost, and most would not notice the difference yet unless the game is heavy in RT.



At the end of the day, the average gamer needs no more than 8GB of VRAM, and won't for many years. If you play many different games and do get AAA titles for 1440p or 4K, then obviously I am not gonna advise you to buy the cheapest; I will advise you to get whatever you can at a higher tier that still balances the machine... My point, however, is that we are a minority, and the VRAM talk I see has blown up.

I reckon that is because some games are awfully optimized. I've seen some people claim that the only issue comes with sloppy console ports (I don't play many of those), but in general that is not the GPU's fault then; it's the developers' fault. Don't support a practice like that.


Steam hardware survey
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam?utm_source=google&utm_campaign=20005915143&utm_medium=ad&utm_content=&utm_term=
C1REX Apr 23, 2023 @ 4:51am
That's a long post. What I meant by mainstream is that these are not professional cards with 48GB of memory costing a few thousand dollars.
Last edited by C1REX; Apr 23, 2023 @ 4:53am
emoticorpse Apr 23, 2023 @ 5:11am
No worries, haha. I get it. It's easy to get things mixed up online over the course of a discussion.

Yeah, to me, there's little point in blaming things when it pertains to something I'm not completely sure of (which is why I posed the question I did to the people who were placing blame).

I just take a more practical approach. Is something I'm doing showing that it needs more of a certain something? Well, guess I need more of that certain something, then. It doesn't matter why. If I apparently need it, then I just need it.

In this particular case, I can not and will not go without my bushy Minecraft leaves! So if I have to feed the game VRAM for that, well, I guess I will, haha. Also, I'm really hoping I can turn the render distance up, turn a few more settings on or up higher, stay at 60 FPS a little more often, and maybe even do 1440p in case it comes to that later. And yes, I'm even wondering if a 6800 XT would be enough for that. I wasn't exaggerating when I said I don't think even an RTX 4090 could max this. And I'm not even looking to max it, nor am I using the most demanding shaders (or very high resolution resource packs). This game can apparently get silly with what it needs at times.

I think what confused me was (if I'm not mistaken) that you were defending NOT blaming developers, but I don't remember you vocally defending NOT blaming Nvidia.

If you did say "don't blame Nvidia", then OK. But I just thought it was one-sided to say "not fair to blame developers" while not saying "don't blame Nvidia".

If I'm wrong on any of this, I do apologize.
AdahnGorion Apr 23, 2023 @ 7:14am
Originally posted by C1REX-PL:
That's a long post. What I meant by mainstream is that these are not professional cards with 48GB of memory costing a few thousand dollars.

Then you should not use the word "mainstream", as it kinda implies "commonware" and "what the majority wants and has".

It is true that these cards are available to the public and that they are priced within a range where somewhat-rich tech enthusiasts and gamers can obtain them.
Mainstream they are not. Mainstream cards currently have between 6 and 12GB of VRAM.
C1REX Apr 23, 2023 @ 7:35am
Originally posted by Darkie:
Originally posted by C1REX-PL:
That's a long post. What I meant by mainstream is that these are not professional cards with 48GB of memory costing a few thousand dollars.

Then you should not use the word "mainstream", as it kinda implies "commonware" and "what the majority wants and has".

It is true that these cards are available to the public and that they are priced within a range where somewhat-rich tech enthusiasts and gamers can obtain them.
Mainstream they are not. Mainstream cards currently have between 6 and 12GB of VRAM.
I'm sorry for the confusion.

Date Posted: Apr 20, 2023 @ 10:45am
Replies: 98