xSOSxHawkens 20 Apr, 2023 @ 10:45am
Time is not being kind to nVidia and its VRAM hoarding...
https://www.techspot.com/article/2661-vram-8gb-vs-16gb/


Originally posted by "At a 1080p Rez...":

As we've seen a number of times now, the issues for the RTX 3070 start when we enable ray tracing. You could claim that the ultra quality preset is too much, but you'd also have to ignore the fact that the Radeon 6800 is good for 73 fps, while the 3070 is unusable thanks to its 8GB VRAM buffer. Worse still, the 6800 is still able to deliver highly playable performance at 1440p.

Originally posted by "In Hogwarts Legacy":
With the RTX 3070 we could notice the occasional jarring frame stutter, but the bigger issue is the image quality. The 3070 is constantly running out of VRAM and when it does the game no longer runs into crippling stuttering that lasts multiple seconds, instead all the textures magically disappear and then reappear at random. What you're left with is a flat horrible looking image and this occurs every few seconds. Even when standing completely still, the RTX 3070 keeps cycling textures in and out of VRAM as it simply can't store everything required for a given scene.

But what about DLSS?

If we enable upscaling (DLSS and FSR quality modes) for the RTX 3070 and the Radeon 6800, frame rate performance certainly improves but unfortunately the RTX 3070 just doesn't have enough VRAM, even when upscaling from a 720p image and we see the constant popping in and out of textures. Performance is also much worse on the 3070 when compared to the 6800, we're also still seeing stuttering, it's just a mess.

The next step is to dial down the quality preset to high, with high ray tracing, after all we don't typically recommend the ultra settings in most games. But even here the RTX 3070 suffers from regular frame stutter and in some instances severe frame stuttering.

Dropping down to the medium preset with ray tracing still set to high is a much improved experience with the RTX 3070, and texture quality is no longer an issue, though there is still the occasional frame stutter as VRAM usage is right on the edge and occasionally spills over.

__________________________________________________________________________

Personal take:

I have said for a long time that, with relative performance being close, two GPUs with differing VRAM capacities will have their usable life determined by that capacity, and thus more VRAM is worth it within reason.

Since the RTX3k/RX6k release I was critical of Nvidia and its choice to be stingy and hoard VRAM rather than put it on their cards. I regularly posited that the 16GB cards from AMD, starting at the 6800 non-XT, would offer better long-term value than the 8 and 10GB cards from the NV lineup. I thought back then, and still think now (rightly, it would seem), that spending money on something like a 3070 was an absolute waste. Seems it was indeed.

I pointed out when the 6500 XT was out that the card itself was OK if you stayed under its VRAM limit, but was told in criticism that no one, even on an *entry*-level card, should have to manage VRAM usage like that and optimize settings just to get playable frames (again, even on entry-level hardware). I pointed out that it would not be long before the same had to be done with anything that has less VRAM than its core power can make use of (looking at you, 8/10GB modern cards from NV).

Now upper-mid-range $500-class GPUs are having to compromise hard on settings, or simply live with literal texture pop-in, due to lack of usable VRAM.

2023 is here. And the 8GB 3070 can't even keep up with a 6800 non-XT in multiple ray-traced titles anymore due to VRAM limits. It can't run quality 1440p at reasonably high refresh anymore, and at 1080p it's so bad at times that the 6800's* 1% lows match the 3070's average.

And the smoking gun in all of this? The reason the 3070 is (by comparison) now* trash?

VRAM and nvidia being unwilling to put it on the card without pushing costs up.

To be clear, both companies have cards with 8GB of VRAM, but one positions its 8GB cards as high end or upper mid-range, both in price and in the product stack, while the other lists its as mid-range or lower and prices them accordingly.

At the end of the day, the only people who really lose out here are NV owners of 8GB (and soon enough 10GB) cards... while the only actual winner is NV.

Take a bow Jensen, you earned it.

* : Edits
Last edited by xSOSxHawkens; 20 Apr, 2023 @ 11:04am
Showing 16-30 of 98 comments
76561198343548661 20 Apr, 2023 @ 10:07pm
As far as I remember, many of my games have presets. But let's get real, Nvidia should put more VRAM on some of their cards.
Last edited by smallcat; 20 Apr, 2023 @ 10:07pm
Holografix 20 Apr, 2023 @ 10:08pm
Originally posted by smallcat:
As far as I remember, many of my games have presets. But let's get real, Nvidia should put more VRAM on some of their cards.
Yah, true. But Nvidia controls the market in spite of consumer pressure. They do what they want.

Nvidia even sells their cards to undercut their 3rd-party board partners. Remember EVGA? lol
Last edited by Holografix; 20 Apr, 2023 @ 10:09pm
Komarimaru 20 Apr, 2023 @ 10:12pm
Originally posted by Holografix:
Originally posted by smallcat:
As far as I remember, many of my games have presets. But let's get real, Nvidia should put more VRAM on some of their cards.
Yah, true. But Nvidia controls the market in spite of consumer pressure. They do what they want.

Nvidia even sells their cards to undercut their 3rd-party board partners. Remember EVGA? lol
To be fair, few thought 8GB of VRAM would run out this soon for mid- and low-range cards.

EVGA though was never near MSRP, ever. All their cards were miles ahead in performance and options, with custom PCBs, BIOSes, etc.
Holografix 20 Apr, 2023 @ 10:17pm
Originally posted by Komarimaru:
Originally posted by Holografix:
Yah, true. But Nvidia controls the market in spite of consumer pressure. They do what they want.

Nvidia even sells their cards to undercut their 3rd-party board partners. Remember EVGA? lol
To be fair, few thought 8GB of VRAM would run out this soon for mid- and low-range cards.

EVGA though was never near MSRP, ever. All their cards were miles ahead in performance and options, with custom PCBs, BIOSes, etc.
And Nvidia ran EVGA out of the market. Quality doesn't matter if Nvidia can control the market.
Last edited by Holografix; 20 Apr, 2023 @ 10:18pm
Komarimaru 20 Apr, 2023 @ 11:34pm
Originally posted by Holografix:
Originally posted by Komarimaru:
To be fair, few thought 8GB of VRAM would run out this soon for mid- and low-range cards.

EVGA though was never near MSRP, ever. All their cards were miles ahead in performance and options, with custom PCBs, BIOSes, etc.
And Nvidia ran EVGA out of the market. Quality doesn't matter if Nvidia can control the market.
Ya, we don't know why EVGA fully left; we know some things, and I'll miss them greatly since they're who I've bought from for decades.

I still don't know which current brands are worthwhile, they all have hella flaws.
UserNotFound 21 Apr, 2023 @ 12:36am
Originally posted by Komarimaru:
To be fair, few thought 8GB of VRAM would run out this soon for mid- and low-range cards.

EVGA though was never near MSRP, ever. All their cards were miles ahead in performance and options, with custom PCBs, BIOSes, etc.
Actually, a few reviewers had said that while the RTX 3070 series were powerful cards, they were concerned that 8GB of VRAM might prove insufficient (in their reviews of the RTX 3070/3070 Ti). I remember Steve of HUB saying, at the time of his review of the then just-released RTX 3070 cards (circa 2021), that 8GB of VRAM would prove to be an issue within 2-3 years. I think it was him; I had read so many articles that I could have mistaken him for someone else, though.

And about eVGA, some of us AMD boys were hoping they'd jump ship and bring their eVGAness to AMD cards, but alas, that was not to be.
Illusion of Progress 21 Apr, 2023 @ 2:59am
Originally posted by ZeekAncient:
Unfortunately, during the GPU shortage, when I was in the process of building a new PC, VRAM amount was never really a concern. It felt at the time that hardly any games were really exceeding 8GB that much, and even at 4K it felt that 8GB wasn't going to be a limiting factor. In fact, it almost felt like AMD's GPUs had way more than they would ever need, and it was just going to go to waste, lol. Sure, lmao.
To be fair, that wouldn't have been a rare impression to have at the time.

VRAM needs tend not to go up gradually in a flat line; they tend to jump shortly after console releases. I presume you understand why. And that jump really didn't happen until after the launch of those products. Notice that the last time this happened to a severe extent (Kepler) was when new consoles were releasing too. nVidia would not have been ignorant that a big jump in VRAM requirements was soon to come, so this was not an accident or something overlooked.

There's also the fact that nVidia dominates in market share, and many of their products tend to be light on VRAM, so the lower amounts felt more normalized.

So having the opinion that "yeah, nVidia could be higher, but it's enough" and "AMD just has more than needed and it's a superfluous selling point" might have seemed rational at the time.

I seriously wonder how much less of an issue some of these modern games would be if the amounts of VRAM on the market weren't so low. I think these games are just exposing that factor. Obviously there are other issues with some of these games (though PC gamers might want to get used to dealing with either shader compilation at startup or stutter during gameplay, because that's not necessarily one of them; it's instead a result of PCs not being fixed platforms like the consoles). So I'm not saying the games didn't have their issues, but I do feel VRAM is one of them, making things worse.

The recent games we've been seeing have only been the first of these. Now obviously the RTX 3070 series is a bit older now, but it has only JUST been replaced, and there were instances of an RTX 3060 outperforming not just an RTX 3070 but an RTX 3080 (!) due to VRAM. For that to ever happen, at all, before a card is even formally replaced, is shocking. It's not going to get better moving forward, unfortunately. My heart really does go out to RTX 3070/Ti buyers (and even 3060 Ti and 3080 to a point). Those products warranted more and the buyers deserved more, even if they chose to buy them as they were.

It's one thing to say a card not maxing everything out five years later is expected, but to be severely compromised at times while it's still relevant in the current selling market, and has only recently been formally replaced (if you want to call it that, given the severe price hike the RTX 4070 Ti represented)? Different story altogether.
C1REX 21 Apr, 2023 @ 3:15am
A copy-paste from another topic but it fits better here.

I refuse to support NVIDIA's strategy of artificially limiting VRAM. I would rather skip this generation or buy AMD.

Modern GPUs should have several times more memory than consoles. It used to be like that in the past. NVIDIA is using the same strategy as Apple and charging exorbitant prices for memory. VRAM doesn't cost $100 per 1GB.

Stutters and poor performing ports are partly due to NVIDIA's low VRAM standards. They set the bar insanely low and deceived us into believing that memory is incredibly expensive.

The only reason the current amount of VRAM seems like enough is that Nvidia gave developers no choice, so it had to be enough. They might as well add "deal with it, sucker" and "blame devs for bad optimisation, not us".
76561198343548661 21 Apr, 2023 @ 3:32am
Guys, just read about Immortals of Aveum, min sys req:

1080p, 60 FPS, low to med settings:

OS: Windows 10 (64-bit);
CPU: Intel Core i7-9700 or AMD Ryzen 7 3700X;
GPU: NVIDIA GeForce RTX 2080 Super (8GB) or AMD Radeon RX 5800 XT (8GB);
16GB RAM;
DirectX: 12;
Storage: 110GB (SSD recommended).

What?
Glad I don't like it much.
Last edited by smallcat; 21 Apr, 2023 @ 3:34am
Illusion of Progress 21 Apr, 2023 @ 3:39am
Originally posted by smallcat:
Guys, just read about Immortals of Aveum, min sys req:

1080p, 60 FPS, low to med settings:

OS: Windows 10 (64-bit);
CPU: Intel Core i7-9700 or AMD Ryzen 7 3700X;
GPU: NVIDIA GeForce RTX 2080 Super (8GB) or AMD Radeon RX 5800 XT (8GB);
16GB RAM;
DirectX: 12;
Storage: 110GB (SSD recommended).

What?
Glad I don't like it much.
Look at this, not much better.

https://store.steampowered.com/app/2124490/SILENT_HILL_2/

Minimum:
Requires a 64-bit processor and operating system
OS: Windows 10 x64
Processor: Intel Core i5-8400 | AMD Ryzen 3 3300X
Memory: 12 GB RAM
Graphics: AMD Radeon™ RX 5700 / NVIDIA® GeForce® GTX 1080
DirectX: Version 12
Storage: 50 GB available space
Sound Card: Windows Compatible Audio Device.
Additional Notes: Playing on minimum requirements should enable to play on
Low/Medium quality settings in FullHD (1080p) in stable 30 FPS.

Recommended:
Requires a 64-bit processor and operating system
OS: Windows 11 x64
Processor: Intel Core i7-8700K | AMD Ryzen 5 3600X
Memory: 16 GB RAM
Graphics: NVIDIA® GeForce® 2080RTX or AMD Radeon™ 6800XT
DirectX: Version 12
Storage: 50 GB available space
Sound Card: Windows Compatible Audio Device.
Additional Notes: Playing on recommended requirements should enable to play on
Medium quality settings in 60 FPS or High quality settings in 30 FPS, in FullHD
(or 4k using DLSS or similar technology).
And I do want this one, so it matters to me.
C1REX 21 Apr, 2023 @ 3:49am
Those Silent Hill spec requirements are substantially higher than those for The Last of Us.
Komarimaru 21 Apr, 2023 @ 3:56am
Originally posted by C1REX-PL:
Those Silent Hill spec requirements are substantially higher than those for The Last of Us.
Ya, but... look at how TLoU1 runs atm... So rough on any mid-range or lower GPU.
C1REX 21 Apr, 2023 @ 4:02am
Originally posted by Komarimaru:
Originally posted by C1REX-PL:
Those Silent Hill spec requirements are substantially higher than those for The Last of Us.
Ya, but... look at how TLoU1 runs atm... So rough on any mid-range or lower GPU.
I know. I own the game and finished it on an 8GB GPU. I blame Nvidia, and us customers, for harming the PC market with this ridiculously low VRAM limit. Lesson learned.
AdahnGorion 21 Apr, 2023 @ 4:06am
Originally posted by xSOSxHawkens:
Originally posted by Komarimaru:
Why is this being posted, yet again though?
Because, while it has been a recent topic of general discussion in a few threads on here, all of that has been, for lack of a better term, 'he said, she said'. It was all banter between randos online.

This is a fresh, written-this-week, fairly unbiased look at the subject from a professional and respected tech publication, one that goes into depth on both past and present performance, the way those things have changed over time, and the direct and indisputable impact that the single difference of VRAM makes on otherwise comparable cards.

Or more simply, it was posted because it's legit, it's fresh, and it's on topic with a general trend in conversation at this time. And I thought it was a good read worth sharing ;)



Originally posted by Darkie:
...

The 8GB ones would last for a long time if you only use 1080p. The real issue is if you go 1440p or 4K (I would never get an 8GB VRAM GPU for a new machine when I play at 1440p)
...
A 4060 is not high end... it's an entry-level card.

I don't think you read the article... The 8GB cards are already faltering at 1080p at this point, at least for people wanting to play games at top settings. 8GB will keep working for some time, but there is a difference between working and working as they *should/could* have. I typically like to see AMD win, but in this case I actually wish NV had done right. The 3070 and 70 Ti are both cards that should reasonably trounce the 6800 with RT in play, even though they should be behind otherwise. The fact that even at 1080p the Radeons can out-ray-trace the RTX cards at this point, due to nothing but VRAM constraints, is frankly insane. The thought of VRAM causing texture pop-in at 1080p on a 500-buck GPU that's two years old?... What is being smoked by anyone who thinks that's OK?

It's one thing to accept those compromises in a budget offering; it's another to accept them from offerings that are squarely upper mid-range or lower high end (x70-class cards).

But I would be pissed if I had to do that with the mainline card, which at that time was the 6800 non-xt.

And to even begin to argue that an x70-class card should be so outdated in a single generation as to see texture pop-in is a bit silly. At worst, last gen's lower high-end x70 translates into this gen's upper mid-range, as shown via your own 4060 comparison.

And a 4060 is far from entry level, my friend. Again, you can think what you want, but I use standards and form opinions from the rest of the industry...

Originally posted by TomsHardware:
The RTX 4070 is either the bottom of the high-end GPU range, or the top of the mainstream segment, depending on how you want to classify things.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4070-review/4

(Note: the review isn't relevant to the topic at hand beyond being used as a point of reference for the positioning of x70-class GPU products in the relative, industry-accepted hierarchy.)

x70-class cards either represent the bottom of the top (making x60 the top of the mainstream/mid-range) or the top of the mid-range, making x60 the second-best in the mid-range. And thankfully they have learned and given the x70s a more reasonable VRAM allocation.

Entry is x30 or x50 from nvidia.






Originally posted by ZeekAncient:
snip

Great reply, and thank you for the detailed write-up. Yeah, the 70 Ti *really* got the short end of the stick... It has so much room to breathe. So core-heavy, so VRAM-limited. Thankfully, with some careful tweaking of in-game settings, you can keep it in check and keep the core fed well for a while ;)




Originally posted by Heretic:
AMD marketing. They supply cheap vram and fool people into believing their cards will last forever. I've had no problems with my 12gb card. It runs VR and my 3440x1440 144hz IPS display well enough. Of course, I don't expect it to last forever and TBH the recent wave of buggy console ports hasn't inspired me into investing into the future on some wild hype.

As others have noted... Do you not understand the difference between 12GB and 8GB of VRAM?

I mean, the article (quite correctly, I assume we will see) posits that the 12GB 3060 is going to reasonably outlast the 3070/70 Ti due to its usable VRAM alone. I am sure that will happen. It has happened many times in the past (3GB 680 vs 4GB 670, 2GB 960 vs 4GB 960). It will play out again.

No matter what 12GB card you have from NV or AMD (outside of odd super-low-end exceptions), it will probably outlast a 3070/70 Ti for usable gaming performance. Let that sink in, more so depending on the card you use...

Yes, eventually we will all see our cards get too old; that is not the point of the conversation. The point is that these cards, which were just last gen's upper mid-range or lower high end, are now in some cases borderline unusable 1-2 years later due to VRAM being too limited. That is very different from a card simply getting too old after many years of advancements in tech. Instead, this was a train wreck many saw coming two years off and wondered... why?...

The article specifically mentions that 1440p is the issue, and that unbalanced, bottlenecked PCs that want to run ultra on an entry-level or low-to-mid-range card are in trouble... well, of course!!! You are trying to gain 3% extra visual quality at the cost of 50% of your resources.

Anyone who plays at 1080p and mixes medium/high/very high settings will have no issue in 99% of all games released over the next 4 years with a current-gen entry-level card (e.g. the 4060).
C1REX 21 Apr, 2023 @ 4:16am
Originally posted by Darkie:

The article specifically mentions that 1440p is the issue, and that unbalanced, bottlenecked PCs that want to run ultra on an entry-level or low-to-mid-range card are in trouble... well, of course!!! You are trying to gain 3% extra visual quality at the cost of 50% of your resources.

Anyone who plays at 1080p and mixes medium/high/very high settings will have no issue in 99% of all games released over the next 4 years with a current-gen entry-level card (e.g. the 4060).
Why so emotional?
If the card is able to do 1440p and the only limiting factor is VRAM, then the argument about imbalance sounds valid.

Date Posted: 20 Apr, 2023 @ 10:45am
Replies: 98