Iggy Wolf Dec 10, 2022 @ 4:49pm
Nvidia 3060 8 GB scam
Unbelievable. As if it wasn't bad enough that they tried to pull this BS recently with the 4080 12 GB, now they're trying the same BS with the 3060. Kinda makes you wonder what crap they'll try to pull with the 4060. I still remember when the 970 fiasco resulted in a class action lawsuit. I guess they figure mainstream consumers who buy 3060s won't notice the difference as readily as the enthusiasts who buy xx80-series cards. This is a glorified 3050 Ti, nothing more (the bandwidth math behind that 17% gap is sketched after the links). https://videocardz.com/newz/nvidia-geforce-rtx-3060-with-8gb-memory-has-been-tested-17-performance-difference-vs-12gb-model

https://www.reddit.com/r/nvidia/comments/zahxs2/8gb_rtx_3060_same_name_same_price_less_performance/
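For what it's worth, the 17% gap the article's title cites isn't only about the missing 4 GB: the 8 GB card also drops from a 192-bit to a 128-bit memory bus. A back-of-the-envelope sketch, assuming the commonly reported 15 Gbps GDDR6 on both variants:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * data rate (Gbps).
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Assumed specs: both RTX 3060 variants run 15 Gbps GDDR6.
bw_12gb = bandwidth_gbs(192, 15.0)  # 192-bit bus -> 360 GB/s
bw_8gb = bandwidth_gbs(128, 15.0)   # 128-bit bus -> 240 GB/s

print(f"12GB model: {bw_12gb:.0f} GB/s")
print(f" 8GB model: {bw_8gb:.0f} GB/s")
print(f"Bandwidth lost: {1 - bw_8gb / bw_12gb:.0%}")  # 33%
```

A third less memory bandwidth on a card sold under the same name goes a long way toward explaining the benchmark gap.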
Showing 106-120 of 124 comments
C1REX Apr 11, 2023 @ 11:45pm
I don't think it was an evil plan. NVIDIA simply provided what customers wanted. Many forum experts were convincing new buyers that 8GB would be fine for years.
New buyers saw no reason to pay extra for VRAM they didn't need at that moment.

Let’s see how many people will think the same about 12GB being enough for future games.
UAB "Kriuk1s" 12/abr./2023 às 2:07 
true
UserNotFound Apr 12, 2023 @ 2:13am
Apparently, some think nVidia can do no wrong. They're even willing to lower in-game settings (including RT when necessary) to accommodate the 8GB of VRAM. It's perfectly fine, apparently... to these nVidia fanatics.
Heretic Apr 12, 2023 @ 2:25am
Originally posted by UserNotFound:
Apparently, some think nVidia can do no wrong. They're even willing to lower in-game settings (including RT when necessary) to accommodate the 8GB of VRAM. It's perfectly fine, apparently... to these nVidia fanatics.
My Nvidia card has 12GB of GDDR6X. The AMD card I was tempted by used puny GDDR6.
UserNotFound 12/abr./2023 às 2:43 
Originally posted by Heretic:
My Nvidia card has 12GB of GDDR6X. The AMD card I was tempted by used puny GDDR6.
Oh...kay. But when VRAM isn't enough and there is spillover to system memory, more VRAM, even "puny" GDDR6, is better than that, as exemplified by the VRAM debacle on the RTX 3060 Ti/3070/3070 Ti.

Just be glad you went with the RTX 3080 12GB or the RTX 3080 Ti 12GB instead of the RTX 3080 10GB. (A quick way to watch for that spillover on your own card is sketched below.)
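To make the spillover point concrete, here's a minimal monitoring sketch, assuming the pynvml NVML bindings (pip install pynvml) and a single NVIDIA GPU. Run it alongside a game; sustained readings pinned near the card's total are the telltale sign that assets are being evicted to much slower system RAM:

```python
# Minimal VRAM monitor via NVIDIA's NVML bindings (assumes: pip install pynvml).
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gib = mem.used / 1024**3   # NVML reports bytes
        total_gib = mem.total / 1024**3
        print(f"VRAM: {used_gib:.2f} / {total_gib:.2f} GiB "
              f"({mem.used / mem.total:.0%})")
        time.sleep(1.0)  # poll once per second
except KeyboardInterrupt:
    pass  # Ctrl+C to stop
finally:
    pynvml.nvmlShutdown()
```

This only shows total usage per card, not per process, but it's enough to spot a game bumping against an 8GB ceiling.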
Corona Scurrae Apr 12, 2023 @ 7:55am
Originally posted by smallcat:
What's so weird? Less money, less performance, but budget friendly. 8GB VRAM is not bad for 1080p.
Recent games beg to differ, and in Europe the pricing is practically the same. Why would anyone save 20-30€ when they could get a superior piece of hardware instead?

Originally posted by Iggy Wolf:
I honestly feel bad for anyone who wasted 900+ on Ampere and Ada Lovelace GPUs. Unless you got a 4090 at 1600€, you have been ripped off by Nvidia.

If the recent releases are anything to go by, then even a beast like the 4080 is incapable of 4K because of its 16GB of VRAM.

Of course we keep focusing on Nvidia, despite the fact that they introduced RTX IO, which would alleviate a lot of asset streaming issues. Then again, it takes a long time until it's widely adopted; if I had to guess, maybe 3-5 years from now. Until then, VRAM limitations are going to stay relevant.
Komarimaru Apr 12, 2023 @ 11:10am
Sorry for the delay, had to, uh... get my account unblocked from the forums first due to erroneous troll reports. Anywho!
Originally posted by Illusion of Progress:
Originally posted by Komarimaru:
And not just an Nvidia thing, either. AMD actually was worse for it in some cases, like the R9 Fury X (4GB) vs the 980 Ti (6GB). If you had kept reading the thread, I agreed with the person.
I'm not sure why the goalposts are being moved to "AMD does it too" now.

A claim was made as if this were a new thing with nVidia, and I was just saying it's not, because it's not. If you've rescinded the statement that it's a new thing, then never mind.
Originally posted by Komarimaru:
And yes, there will always be crappier cards with less VRAM each generation. Which is why you don't buy the crappy cards and just save more. But no, fiscal responsibility seems to be an unheard-of thing for people.
I'm sorry, but this reads like "just don't be poor" reasoning. It's like telling someone complaining about low wages in many jobs, "well, just avoid those jobs". I mean, sure... then maybe I in particular don't have those low-paying jobs, if I can manage to do better than them to begin with... but it's conveniently skirting the issue that they are there.

You (should) know very well that the majority of the market isn't going to be able to manage the high end. Even in this fairy-tale example where everyone could afford it, the more financially able would then afford even more, and prices would just adjust to cancel that out, so this is the biggest non-answer if there ever was one.

Point is, there's always going to be a disparity, and right now the majority of the market will be dealing with this if they go with nVidia, simply because nVidia's options are lacking in VRAM.

This is honestly close to making excuses for it at this point.

What I was saying is that, if you look at GPU history, 8GB became the new normal. But AMD was the biggest culprit of low VRAM for a while, specifically to cut costs and corners. RX 480? 8GB version and... 4GB version... wut? Gah! Same with the RX 580. And they were behind Nvidia in VRAM before then as well.

So, I'm not being a fanboy here. I've used both brands throughout the decades. I'm saying that the jumping-on-Nvidia thing seems silly, since no one expected games to suddenly stop working on 8GB of VRAM.

As for the second part, you misunderstood me. I'm not doing the "just don't be poor" thing, and I never do that, ever. What I am saying is: be more intelligent with your purchase. If you can only afford a 6600 XT right now, wait a bit, save, and get the much, much better quality 6700 XT.

See what I mean now? I'm not asking them to save up and get an RTX 4090 Suprim X AIO-cooled from MSI. lol. I'm saying people are guilty of jumping on purchases without doing research first.

They get in this state of "I want it now" instead of just waiting a bit longer for something better.

Hopefully that clears up my stance.
hypercybermegatron Apr 12, 2023 @ 12:09pm
NEWSFLASH
NVIDIA itself is the scam.

Now AMD is following their example, i.e., releasing top-tier hardware first, then trickling out the lower, cheaper tiers later.

They're all scammers at this point; for some crazy reason, people keep giving them money!
xSOSxHawkens Apr 12, 2023 @ 1:01pm
I have been pointing out for years that NV has historically been shortchanging people on VRAM.

I have pointed out for years that the kneecap that will nearly always take down mid-range or higher cards before anything else is a lack of VRAM.

I have pointed out specific examples over the years multiple times, both in NV vs NV comparisons and NV vs AMD comparisons, showing that time and again the "lower VRAM" offerings from NV have been hobbled to the point of being eventually useless.

Examples include:

320MB 8800 vs 512/640/1024MB 8800s
4GB GTX 670 vs 2GB GTX 680
4GB GTX 960 vs 2GB GTX 960
3GB 1060 vs 6GB 1060

Historically, when comparing NV vs AMD, AMD has nearly always offered higher or equal VRAM, and in the instances where the companies had splits with two options in capacity, it was generally higher on both (when NV was doing 320/640, AMD was 512/1024; when NV was running 3/6GB on their card, AMD offered 4/8GB, etc.).

In more modern comparisons, tossing in AMD, we have things like Far Cry 6 (relevant at the time of release for the respective product lines). Not an amazing title by any means, but one which was ready out of the box to show that the 3080 10GB could indeed fall to the lowly RX 6800 non-XT once the VRAM was capped out...

When this was pointed out back then, people bashed the game, as if the content mattered, and ignored the fact that out of the box the 10GB 3080 was already being presented with AAA titles that could kneecap it on VRAM alone. No one listened when I said that would become a bigger issue...

Now we are seeing the fruit of NV skimping on VRAM. Not that I think they (NV) see anything wrong with it. All it will mean is an upgrade sooner than the consumer wants, and NV banks on the average user replacing their card with another of the same brand when it's "needed". Simple.

Branding doesn't matter. If the card is mid-range or higher, the one with more VRAM will always last longer for basic 60fps gameplay.
Last edited by xSOSxHawkens; Apr 12, 2023 @ 1:01pm
Originally posted by Komarimaru:
What I was saying is that, if you look at GPU history, 8GB became the new normal. But AMD was the biggest culprit of low VRAM for a while, specifically to cut costs and corners. RX 480? 8GB version and... 4GB version... wut? Gah! Same with the RX 580. And they were behind Nvidia in VRAM before then as well.
I'm not sure how it's relevant to the discussion that was occurring though.

To me, the discussion flow was, as started by you, "I'm not sure why this is only now an issue", and my response was "it's not only now an issue; it was an issue before, but Pascal and the following generation muted people's attention to it", and now your response is "well, AMD did it before too"?

Get my confusion now? Yes, it's become a talking point recently, and for good reason, but the underlying discussion of nVidia being short on VRAM isn't new. It just only comes to a head when it's really bad, such as 8 GB on the RTX 3060 Ti/3070/3070 Ti, or when the same happened with the GTX 680 or 780 (I forget which) with 3 GB of VRAM.
Originally posted by Komarimaru:
So, I'm not being a fanboy here. I've used both brands throughout the decades. I'm saying that the jumping-on-Nvidia thing seems silly, since no one expected games to suddenly stop working on 8GB of VRAM.
Fair point here. People can't see the future, which makes recommending for it maybe pointless. That's why the golden rule is "buy the best you can afford, and use it as long as it lasts", and... sometimes it works and sometimes it doesn't. So there were people who made a suggestion based on things at the time, and they don't deserve fault for those suggestions, per se.

But at the same time, as said above, this has happened before, and the market in the last half a year has largely consisted of the RTX 3060/3060 Ti/3070/3070 Ti below the high end, since the RTX 40 series only exists at near and above four figures. So for those not looking to spend as much, these products are very relevant. And when the RX 6700 XT/6800/6800 XT exist, and the problem is coming to a head now, it's still a discussion worth having IMO.
Originally posted by Komarimaru:
As for the second part, you misunderstood me. I'm not doing the "just don't be poor" thing, and I never do that, ever. What I am saying is: be more intelligent with your purchase. If you can only afford a 6600 XT right now, wait a bit, save, and get the much, much better quality 6700 XT.

See what I mean now? I'm not asking them to save up and get an RTX 4090 Suprim X AIO-cooled from MSI. lol. I'm saying people are guilty of jumping on purchases without doing research first.
That's fair then, but it was said rather broadly, and a lot of people do use it to say "step up to the high end only and never buy below that".

I do think sweet spots exist and it's often worth saving up to them, yes (and I'd recommend the 6700 XT as worth saving up to over the 6600 series, since the prices aren't super far apart and the former has 12 GB of VRAM, too).
Originally posted by Komarimaru:
They get in this state of "I want it now" instead of just waiting a bit longer for something better.

Hopefully that clears up my stance.
It does; I've known people like that.

It's like they will just spend everything from each paycheck, and that is their spending limit. Money seemingly burns holes in their purses/wallets.
Originally posted by xSOSxHawkens:
320MB 8800 vs 512/640/1024MB 8800s
3GB 1060 vs 6GB 1060
Fair to point out that with these two examples, you're comparing different products.

The 8800s with 320 MB and 640 MB were the GTS.

The 512 MB one (I wasn't aware of a 1024 MB one) was the GT, which was ironically the faster GPU outright, almost matching the GTX/Ultra.

And the GTX 1060s were just different GPUs too, even though they carried the same name.

That's actually a different discussion altogether, where nVidia names things the same that aren't the same. But the GPUs that were the same with different VRAM amounts were fine IMO; no foul play there. It gave you choice. Today, though, there's no option to say "I want this GPU with more VRAM".
🦜Cloud Boy🦜 Apr 12, 2023 @ 2:36pm
Originally posted by Illusion of Progress:
Yes, and?

And it shows how poor AMD's real situation in the market is.
Yesterday, Intel Arc GPUs achieved the same sales numbers as AMD GPUs.
Originally posted by 🦜Cloud Boy🦜:
Originally posted by Illusion of Progress:
Yes, and?

And it shows how poor AMD's real situation in the market is.
Yesterday, Intel Arc GPUs achieved the same sales numbers as AMD GPUs.
Annnnnd?

Are you being obtuse or what? Come on, you're smarter than this, so don't pretend otherwise. I know a lot of people give you a hard time for some points you make, but I credit you a bit more. You're smart. Which makes points like this one confusing to me.

Like I said, a lot of people know nVidia has a market share (and mind share) monopoly, and that AMD has the opposite issue in the GPU space, to the tune that even newcomer Intel has a market share competing with AMD's (though Intel arguably still has the same mindshare thing going on over AMD, but with CPUs).

You pointing this out for its own sake is what I called out.

I ask again, why?

You came into a broader discussion about nVidia offering low VRAM amounts and just went "AMD has low market share", and now you're going to sit there and pretend to be tone deaf? Like, really?

You bring that particular point into unrelated discussions time, and time, and time again. And that's why I finally called it out.

In other words: time and place. You can make a correct point, which you did, and still be out of place. When you just bring it up for its own sake, you come off as having your feathers ruffled and needing to throw bias around in order to feel better. Normally I don't jump to that conclusion when people voice a positive or negative opinion for or against a given brand, but I mean... when you do it all the time? Whether it's relevant or not? Yeah...
🦜Cloud Boy🦜 Apr 12, 2023 @ 3:37pm
I always hate Nvidia fanboys, especially those who blindly support/defend Nvidia without much valid logic. Fanboyism is bad for the market.

Btw, Nvidia released the 4070 today with 12 GB of VRAM; it has 3080-like performance at a $600 MSRP.
That means Nvidia identified their mistake about VRAM and headed in the right direction.
Last edited by 🦜Cloud Boy🦜; Apr 12, 2023 @ 3:56pm
They were just the 8800 GTS 640 MB with half the VRAM, so nothing about them was specifically designed with that in mind.

Due to them being cheaper, more people may have gotten them for that, though, yes. I think a pair of them may have been cheaper than the GTX or Ultra while being faster (at least in best-case scenarios).
Last edited by Illusion of Progress; Apr 12, 2023 @ 7:49pm

Date Posted: Dec 10, 2022 @ 4:49pm
Posts: 124