I get why you did it. The "base" x60 feels more like an "LE" (or lesser) model, and the "Ti" feels more like the "real" base x60 model now, especially with the RTX 30 and 40 series. But I still would have tried to do a like-for-like comparison, to keep people from waving your argument away on the grounds that you're using indirect comparisons to exaggerate the point.
Nitpicking would be possible no matter what kind of comparison I made. Even if I gave the best possible example, someone would probably reduce it to "but the newer one weighs less, so it won't bend the connector".
^ this shows you don't understand how GPU architecture works. Memory bandwidth is a function of the memory speed and bus width, and the memory bus width is a function of the GPCs (speaking of NVIDIA's GPUs). You are the one looking at a specific number/metric in isolation rather than at how the architecture works as a whole, just as you are looking at "number of cores" as a metric without considering what those cores are and how they differ. Unlike on the 20 series, the SMs on the 30 series have half of their shader cores able to execute either INT32 or FP32; when needed, a 30-series card can deliver twice the FP32 performance of a 20-series card with the exact same number of SMs.
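To make the bandwidth point concrete, here's a minimal sketch (Python; the data rate and bus widths are illustrative assumptions, not any specific card's verified specs) of how bandwidth falls out of memory speed and bus width:

    # Memory bandwidth = effective memory data rate * bus width in bytes.
    # The 14 GT/s rate and the bus widths below are illustrative assumptions.
    def bandwidth_gb_s(data_rate_gtps: float, bus_width_bits: int) -> float:
        return data_rate_gtps * (bus_width_bits / 8)

    print(bandwidth_gb_s(14, 192))  # 336.0 GB/s on a 192-bit bus
    print(bandwidth_gb_s(14, 256))  # 448.0 GB/s: same memory chips, wider bus (more GPCs)

Same memory, very different bandwidth, depending entirely on how wide a bus the GPC configuration exposes.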
Your logic is the equivalent of comparing a Core 2 Quad Q6600 to a current-generation Pentium Gold G7400T and saying the Core 2 Quad is better because "it has 4 cores, and the Pentium Gold only has 2 cores", while in reality the Pentium Gold G7400T is about 230% faster in threaded performance and more than 400% faster in single-threaded performance.
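For the percentage-minded, a quick sanity check on what those figures mean ("X% faster" is a 1 + X/100 multiplier, so core count alone tells you almost nothing):

    # "230% faster" means 3.3x the throughput; "400% faster" means 5x.
    def speedup(percent_faster: float) -> float:
        return 1 + percent_faster / 100

    print(speedup(230))  # 3.3 -> threaded, despite having half the cores
    print(speedup(400))  # 5.0 -> single-threaded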
As nullable said, you are complaining about them doing more with less.
Again, I could be wrong, I'm no expert, but the benchmarks show them doing equal-or-less with less rather than more with less. Adding insult to injury, the 30 and 40 series draw 20-40 watts more to achieve around a 5% (Technical City's number) overall performance increase? Even then, some heavier benchmarks favor the 20-series cards.
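Here's a quick perf-per-watt check using the numbers claimed above (the 160 W baseline is hypothetical; the rest just follows the claim):

    # Claim above: ~5% more performance for 20-40 W more draw.
    old_perf, old_watts = 1.00, 160       # hypothetical 20-series baseline
    new_perf, new_watts = 1.05, 160 + 30  # midpoint of the claimed 20-40 W

    print(old_perf / old_watts)  # 0.00625 perf/W
    print(new_perf / new_watts)  # ~0.00553 perf/W: worse, if the claim holds

If those numbers are right, efficiency actually regressed by about 12%.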
Which benchmarks show 2000 series doing better than 4000 series?
And you mean like a 2080 vs a 4060 or something? Or like a 2080 vs a 4080?
^ this
and
^ this, when you make claims like these in the OP and above without substantiating them with anything valid, and then chalk it up to an excuse of "people are going to nitpick whatever, so...", your argument just comes off as being made in bad faith.
If you were specifically referring to rasterization performance and making this claim/complaint about the move from the 10 series to the 20 series, you'd have some level of rationale, because that move shifted die area toward additional fixed-function units for RT and Tensor acceleration. An RTX 2080 had roughly the same rasterization performance as a 1080 Ti. But making these claims about the 20 series versus the 30 series, or even the 40 series, is just flat-out ignorant. The rasterization jump from the 20 series to the 30 series was fairly substantial, with most cards gaining nearly a model class.
Whether you like it or not, the near-ish-term future of graphics rendering is ray tracing / full path tracing, and leveraging AI-based features to increase perceived resolution, image quality, and frame rate is part of that future. Specifically in regards to NVIDIA, those aren't just "software features"; they rely on the additional fixed-function hardware introduced in the 20 series, which has been improved gen-on-gen with each subsequent GPU generation. This is one of the primary reasons why DLSS 2 looks and performs far and away better than FSR 2, and why, from what we've seen so far, DLSS 3 looks substantially better than FSR 3. Not to mention the DLSS 3.5 improvement that unifies the denoisers and makes RT look substantially better, with "more correct" lighting and reflections, while also slightly increasing performance.
This really is no different from when rendering transitioned to hardware T&L and then, as new hardware advancements arrived, to modern rasterization. The current ray tracing and AI features are an enablement for transitioning to fully path-traced renderers. The AI features will continue to improve their ability to deliver image quality beyond what is achieved by brute-forcing more hardware at the current rendering concepts.
Bought a Nord Lead 4 instead 😐
Important to note: an FPS counter is not an absolute number.
The most palpable and doable thing for us gamers is to stubbornly refuse to purchase overpriced items from those manufacturers. Trust me, we can afford to game on older hardware, but they can't afford plummeting sales as manufacturers.
At worst, we would still have numerous pretty games for entertainment (games like RDR2, D2R, DR2.0, FH4, or WRC Generations can easily be played on cards even from the Maxwell architecture), but the manufacturers can't afford it if the GPUs they produce for the gaming market don't get sold.
I have played NFS: The Run, a 2011 game, and then NFS: Rivals, from 2013. The jump in quality from The Run to Rivals was bigger than the jump from Rivals to Unbound (if there is any). I am not joking. At the best quality settings, that 2013 EA game NFS: Rivals looks awesome. And those graphics can easily be achieved with a card in the 970/1060 range. That's why I don't want to pay ridiculous prices for 8GB cards now.
To a point, the names and numbers are arbitrary, so it wouldn't be so bad if the price-for-performance uplifts were bigger, since those matter more, but even those feel pretty bad.
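A rough way to quantify that feeling (prices and performance figures here are hypothetical, just to show the math):

    # Hypothetical: the new card is 15% faster but costs 20% more.
    old_price, old_perf = 300, 1.00
    new_price, new_perf = 360, 1.15

    uplift = (new_perf / new_price) / (old_perf / old_price) - 1
    print(f"{uplift:+.1%}")  # -4.2%: perf per dollar regressed despite being "faster"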
Still hoping the next generation shakes out a bit better and that this "complementary instead of replacement" generation is partially just the result of the oversupply from the mining boom, even if the next generation is likely a ways off. If that one is also a bad uplift, people will basically just stick with what they have until it actually fails, since upgrading will almost never be worth it anymore, and games will be forced to stagnate as the average level of performance does as well.
That's diminishing returns for you. Makes the stagnation possibility that much worse if it happens.
For me, it is not about performance anymore.
It is all about efficiency.
Performance per Watt is where it is at.
This is not a race.
It is a marathon.