Get a 6800 XT or 6950 XT; it's worth saving up that little bit more.
The RX 6800 XT was launched at 649USD against the RTX 3080 at 699USD. So, IF I were to acquiesce and accept your argument, that means the 6800 (579USD MSRP) would have to be compared against the RTX 3070 Ti (599USD MSRP), which would be even worse, don't you agree?
At that time, many went for the RTX 3070 Ti simply because of its RT prowess, and perhaps some bought it for productivity purposes as well. Although Hardware Unboxed did warn its viewers that the 8GB VRAM buffer on the RTX 3000 series cards would sooner or later become an issue.
Would you not agree that an RTX 3070 Ti (>600USD) against the RX 6800 (510USD) would skew the comparison even more?
Yeah, no reason to buy Nvidia below a 3080 at all.
AMD 6700xt and up for pricing. Avoid anything below that imho.
Only now?
People have been calling nVidia out for being short on VRAM for years. This is their thing. It might not be an issue for you if you live at the high end, but then you're out of touch with the situation for most people.
Pascal was a one-off in that it was VRAM blessed and made people forget for a while.
It doesn't matter what arbitrary terms you label the RTX 3070 as. It simply never should have had an amount of VRAM that should have ONLY been on the entry level RTX 3050. The RTX 3060 Ti having that much was already disappointing. It's planned obsolescence and nothing more.
Both before Pascal, and now again in recent years, the short VRAM offerings are coming to attention.
This is absolutely not new.
For many consumers, products tend to compete more in what they offer as a collective for the price, not performance in a vacuum. You don't have to like it but it's a far more practical metric for many people.
Also, in rasterized performance, the 6800 XT is the RTX 3080 competitor. The 6800 (non XT) is more the RTX 3070 Ti competitor. But then the nVidia option has ray tracing going for it, and AMD lowers prices to try and gain sales (shocker...). So, yeah, the two cards often get compared for a good reason. Trying to imply people can't do that because you want them to compare on performance in a vacuum, or some arbitrary and entirely meaningless thing like its tier or number label, is confusing to me.
Don't know what to tell you then.
Performance in a vacuum is a meaningless metric. Tier is an arbitrary and meaningless thing. We live in a world that revolves around money. So performance for price matters to some people.
And I have no idea why you brought the Pascal era into this; AMD and Nvidia both maxed out at 8GB then.
And during Navi and 20XX series cards, 8GB.
So noooo, people were not talking about it then... Literally a trend that just started...
Pascal was mentioned because you stated "this is only being brought up now," yet it's not only being brought up now unless your focus is only on the last few generations. Pascal was mentioned because it was a VRAM blessed generation for the most part, so during and right after it, this wasn't so much a thing. But it is now, and similarly, go back before Pascal (especially to Kepler, but to some degree Fermi and Maxwell), and it is a thing.
And yes, there will always be crappier cards with less VRAM in each generation. Which is why you don't buy the crappy cards and just save up more, but no, fiscal responsibility is an unheard-of thing among people, it seems.
If AMD and Nvidia cards were priced the same, nobody would buy AMD. For example: if the RTX 3080 and RX 6800 XT were both selling for $700, nobody would buy the 6800 XT.
AMD cuts their GPU prices because it's the ONLY viable option they have.
A claim was made like this was a new thing with nVidia and I was just saying it's not, because it's not. If you've rescinded the statement that it's a new thing, then never mind.
I'm sorry but this reads similarly to a "just don't be poor" reasoning. It's like telling someone complaining about low wages in many jobs "well just avoid those jobs". I mean, sure... then maybe I in particular don't have those low paying jobs, if I can manage to do better than them to begin with... but it's conveniently skirting the issue that they are there.
You (should) know very well that the majority of the market isn't going to be able to manage the high end, because even in this fairy tale example where everyone could afford them, the more financially able ones could then afford even more and prices would just adjust to cancel that out, so this is the biggest non-answer if there ever was one.
Point is there's always going to be a disparity, and right now, the majority of the market will be dealing with this if they go with nVidia, simply because nVidia's options are VRAM lacking.
This is honestly close to making excuses for it at this point.
Yes, yes, we all know because you're always jumping at every opportunity to point out how AMD is the more unpopular brand even when they have better price to performance. We know.
The sad thing is, it also reflects a poor market state full of nVidia mind share. Appealing to the masses isn't a very good argument.
I like how your comparison is the RTX 3080 and RX 6800 XT when they are somewhat equal only in rasterized performance but not ray tracing (which favors the RTX 3080 obviously) so I'd say that puts the RTX 3080 ahead in average total performance, no? So it would be expected that people would choose it if they cost the same (and this is even if we ignore the whole brand mind share nVidia has going on too).
But to answer your question, personally, if both cost $550 (I'd avoid either at $700), I'm going with the RX 6800 XT every single time, because the RTX 3080 has what I feel is a compromising amount of VRAM going forward, and ray tracing isn't super important to me right now. But I admit I'm likely in the minority and that most people would eat the RTX 3080 up.
If they were priced the same and the RTX 3080 had 16 GB instead, even if it lost its ray tracing advantage in exchange, then I'd probably choose it instead. Because despite everything I've said in this post thus far, in a 1:1 comparison I'd probably prefer nVidia just a hair if we're only looking at hardware and not the company. That's precisely why I'm critical about what I feel is a major issue with them in recent times (poor value and lacking VRAM unless you spend at the top). If you want to see something be better, you can't blindly look at its good sides; you have to call attention to weak spots that could be improved.
Even the Intel Arc GPUs are Catching Up with AMD's Discrete GPU Market Share.
Intel brand name speaks.
https://www.tomshardware.com/news/intel-looks-to-be-catching-up-with-amd-discrete-gpu-matket-share