New buyers saw no reason to pay extra for VRAM they didn't need at that moment.
Let’s see how many people will think the same about 12GB being enough for future games.
Just be glad you went with the RTX 3080 12GB or the RTX 3080 Ti 12GB instead of the RTX 3080 10GB.
I honestly feel bad for anyone who wasted 900+ on Ampere and Ada Lovelace GPUs. Unless you got a 4090 at 1600€, you have been ripped off by Nvidia.
If the recent releases are anything to go by, then even a beast like the 4080 is incapable of 4K because of its 16GB of VRAM.
Of course we keep focusing on Nvidia, despite the fact that they introduced RTX IO, which would alleviate a lot of asset streaming issues. But then again, it takes a long time until something like that is widely adopted.
If I had to guess, maybe 3-5 years from now. Until then, VRAM limitations are going to stay relevant.
What I was saying is, if you look at GPU history, 8GB became the new normal. But AMD was the biggest culprit of low VRAM for a while, specifically to cut costs and corners. RX 480? An 8GB version and... a 4GB version... wut? Gah! Same with the RX 580. And they were behind Nvidia in VRAM before then as well.
So, I'm not being a fanboy here. I've used both brands throughout the decades. I'm saying that the jumping-on-Nvidia thing seems silly, since no one expected games to miraculously not work on 8GB of VRAM anymore.
As for the second part, you misunderstood me. I'm not doing the "just don't be poor" thing, and I never do that, ever. What I am saying is: be more intelligent with your purchase. If you can only afford a 6600 XT right now, wait a bit, save, and get the much, much better quality 6700 XT.
See what I mean now? I'm not asking them to save up and get an RTX 4090 Suprim X AIO-cooled from MSI. lol I'm saying people are guilty of jumping on purchases without doing research first.
They get into this state of "I want it now" instead of just waiting a bit longer for something better.
Hopefully that clears up my stance.
Now AMD is following their example, i.e. releasing top-tier performance hardware first, then trickling out the lower, cheaper tiers later.
They're all scammers at this point, and for some crazy reason people keep giving them money!
I have pointed out for years that the kneecap that will nearly always take down mid-range or higher cards before anything else is a lack of VRAM.
I have pointed out specific examples multiple times over the years, in both NV vs NV and NV vs AMD comparisons, showing that time and again the lower-VRAM offerings from NV have been hobbled to the point of being eventually useless.
Examples include:
320 MB 8800 vs 512/640/1024 MB 8800s
4 GB GTX 670 vs 2 GB GTX 680
4 GB GTX 960 vs 2 GB GTX 960
3 GB GTX 1060 vs 6 GB GTX 1060
Historically, when comparing NV vs AMD, AMD has nearly always offered higher or equal VRAM, and in the instances where both companies split a card into two capacity options, AMD's were generally higher on both (when NV was doing 320/640 MB, AMD was doing 512/1024 MB; when NV was running 3/6 GB on their card, AMD offered 4/8 GB, etc).
In more modern comparisons that toss in AMD, we have things like Far Cry 6 (relevant at the time of release for the respective product lines). Not an amazing title by any means, but one which was ready out of the box to show that the 3080 10GB could indeed fall behind the lowly RX 6800 non-XT once the VRAM was capped out...
When I pointed this out back then, people bashed the game, as if the content mattered, and ignored the fact that out of the box the 10GB 3080 was already being presented with AAA titles that could kneecap it on VRAM alone. No one listened when I said that would become a bigger issue...
Now we are seeing the fruit of NV skimping on VRAM. Not that I think they (NV) see anything wrong with it. All it means is an upgrade sooner than the consumer wants, and NV banks on the average user replacing their card with another of the same brand when it's "needed". Simple.
Branding doesn't matter. If the card is mid-range or higher, the one with more VRAM will always last longer for basic 60fps gameplay.
To me, the discussion flow was: you started with "I'm not sure why this is only now an issue", my response was "it's not only now an issue; it was an issue before, but Pascal and the following generation muted people's attention to it", and now your response is "well, AMD did it before too"?
Get my confusion now? Yes, it's become a talking point recently, and for apparent reasons, but the underlying discussion of nVidia being short on VRAM isn't new. It just only comes to a head when it's really bad, such as 8 GB on the RTX 3060 Ti/3070/3070 Ti, or when the same happened with the GTX 680 or 780 (I forget which) with 3 GB of VRAM.
Fair point here. People can't see the future, which maybe makes recommending for it pointless. That's why the golden rule is "buy the best you can afford, and use it as long as it lasts", and... sometimes it works and other times it doesn't. So there were people who made suggestions based on how things were at the time, and they don't deserve fault for those suggestions, per se.
But at the same time, as said above, this has happened before. And in the last half year, the market below the high end has largely consisted of the RTX 3060/3060 Ti/3070/3070 Ti, since the RTX 40 series only exists at near or above four figures, so for those not looking to spend as much, these products are very relevant. And when the RX 6700 XT/6800/6800 XT exist, and the problem is coming to a head now, it's still a discussion worth having IMO.
That's fair then, but it was said rather broadly, and a lot of people do use it to say "step up to the high end only and never buy below that".
I do think sweet spots exist and it's often worth saving up to them, yes (and I'd recommend the 6700 XT as worth saving up to over the 6600 series, since the prices aren't super far apart and the former has 12 GB of VRAM too).
It does; I've known people like that.
It's like they will just spend everything from each payment and that is their spending limit. Money seemingly burns holes in their purses/wallets.
The 8800 with 320 MB and 640 MB were the GTS.
The 512 MB one (I wasn't aware of a 1024 MB one) was the GT, which was ironically the faster GPU outright, almost matching the GTX/Ultra.
And the GTX 1060s were just different GPUs too, even though they carried the same name.
That's actually a different discussion altogether, where nVidia names stuff the same that isn't the same. But the GPUs that were the same chip with different VRAM amounts were fine IMO; no foul play there. It gave you a choice. Today, though, there's no choice to say "I want this GPU with more VRAM".
And it shows how poor AMD's real situation in the market is.
Yesterday, Intel's Arc GPUs achieved the same sales numbers as AMD's GPUs.
Are you being obtuse or what? Come on, you're smarter than this so don't pretend otherwise. I know a lot of people give you a hard time for some points you make but I credit you a bit more. You're smart. Which makes the points you tend to make like this confusing to me.
Like I said, a lot of people know nVidia has a market share (and mind share) monopoly, and that AMD has the opposite issue in the GPU space, to the point that even newcomer Intel has market share competing with AMD's (though Intel arguably still has the same mindshare thing going on over AMD, but with CPUs).
You pointing this out for its own sake is what I called out.
I ask again, why?
You come into a broader discussion about nVidia offering low VRAM amounts, just went "AMD has low market share", and then you're going to sit there and pretend to be tone-deaf? Like, really?
You bring that particular point into unrelated discussions time, and time, and time again. And it's why I finally called it out.
In other words, time and place. You can make a correct point, which you did, and still be out of place. When you just bring it up for its own sake, you come off as having your feathers ruffled and needing to throw bias around in order to feel better. Normally I don't jump to that conclusion when people express a positive or negative opinion for or against a given brand, but I mean... when you do it all the time? Whether it's relevant or not? Yeah...
Btw, Nvidia released the 4070 today with 12 GB of VRAM; it has 3080-like performance at a $600 MSRP.
That means Nvidia identified their mistake with VRAM and headed in the right direction.
Due to them being cheaper, more people may have gotten them for that, though, yes. I think a pair of them may have been cheaper than the GTX or Ultra while being faster (at least in best-case scenarios).