I also don’t understand your argument regarding AMD's involvement in consoles. What does that change?
Why is developers' greed considered bad, but Nvidia’s greed justified? Do you think Nvidia charges a fair price for VRAM, and is there no benefit to games being able to use more? How is AMD not any better regarding VRAM? It’s their main selling point, isn’t it?
What does NVIDIA have to do with consoles? You're the one who brought consoles into it with the PS5 argument, and you kept mentioning NVIDIA by name even though that isn't their market.
It's not about developers' greed; it's about corporate greed and the unwillingness to pay for actually good development work. Games shouldn't use more VRAM if they don't NEED more, how difficult is that to grasp?
AMD keeps their low end at 8GB for the most part while also chiding NVIDIA for doing exactly that. They threw 16GB at the 7600 XT merely to compete with the 4060 Ti because, on paper, 16GB looks good to the inexperienced eye, but neither card can even make proper use of such a massive VRAM buffer. With the previous generation, they used more VRAM as a way to entice people because higher numbers look better to buyers, and they didn't have much else to offer initially because the performance was worse than it is now, thanks to the magic of AMD's fine wine drivers. AMD panders when they can't win just so they can make sales, but there's already evidence of that "good will" starting to dissipate because they're just reciprocating what NVIDIA's doing. Someone from AMD also claimed previously that 8GB wasn't enough for modern games and basically chided NVIDIA for still making 8GB cards, but here we are, they're still making 8GB cards themselves, so they rescinded that comment and probably punished the employee who was being stupid.
Now, I see your point. I mentioned consoles because the vast majority of games are optimized for consoles first and then ported to PC. Having less VRAM than consoles worsens the situation.
Blaming developers for optimization won't make the 3070 Ti 8GB any better. I don't need to mention how poorly the 1060 3GB has aged.
AMD is charging $329 for their 16GB card and will likely reduce it to $299 soon. The 4060 Ti 16GB MSRP was $499, the same price as the 7800 XT 16GB. NVIDIA charged an extra $100 for the additional 8GB when it should have been $60. Better yet, it should be $30 for an extra 4GB. That is really my point: NVIDIA should add 4GB to most of their GPUs for an extra $30, or at least offer such an option. Or do it as a nice gesture when buying a $600-$1200 GPU.
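For what it's worth, here's the arithmetic behind that as a quick Python sketch; the $399 MSRP for the 4060 Ti 8GB is the one figure not quoted above, everything else comes from the prices in this post:

```python
# Back-of-the-envelope: what NVIDIA effectively charges per extra GB of
# VRAM versus the ~$7.50/GB ($60 for 8GB) this post suggests is fair.
# MSRPs: RTX 4060 Ti 8GB = $399 (assumed), RTX 4060 Ti 16GB = $499.

msrp_8gb = 399
msrp_16gb = 499
extra_gb = 16 - 8

charged_per_gb = (msrp_16gb - msrp_8gb) / extra_gb  # $12.50/GB
suggested_per_gb = 60 / 8                           # $7.50/GB

print(f"NVIDIA's effective price: ${charged_per_gb:.2f}/GB")
print(f"Suggested fair price:     ${suggested_per_gb:.2f}/GB")
print(f"Markup over suggestion:   {charged_per_gb / suggested_per_gb:.1f}x")
```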
More VRAM could easily be used for better textures. Textures arguably matter more than any other graphical setting, and they cost VRAM rather than more expensive compute. That's why consoles have more memory instead of more teraflops: it's more cost efficient.
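To put rough numbers on the "textures cost VRAM, not compute" point, here's a small sketch of the arithmetic; the 1 byte/texel rate is the BC7 block-compression rate, while the three-maps-per-material and 100-material scene are made-up illustrative values, not figures from any real game:

```python
# Rough VRAM cost of textures: BC7 compression stores 1 byte per texel,
# and a full mip chain adds about 1/3 on top of the base level.

def texture_mib(width, height, bytes_per_texel=1.0, mip_chain=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

# One 4K material (albedo + normal + roughness/metalness maps):
per_map = texture_mib(4096, 4096)   # ~21.3 MiB per map
per_material = 3 * per_map          # ~64 MiB per material

print(f"{per_map:.1f} MiB per 4K map, {per_material:.0f} MiB per material")
print(f"~{100 * per_material / 1024:.2f} GiB for 100 materials resident")
```

Rendering those higher-resolution textures costs essentially no extra GPU compute, which is why a bigger VRAM buffer is such a cheap visual upgrade.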
Doubling the VRAM on the 3070 Ti wouldn't really make it any better either; it's still the same core performance. Yeah, you can run some more settings, but that's not what games should be about, period. It won't change the core experience just because you turned up the effects for a small difference, and the novelty wears off eventually, especially if the performance is still too low to run those settings at a framerate the end user finds acceptable.
There's a reason people like indie devs more these days: they're actually trying to make good games. Many AAA studios are more concerned with how a game looks than with its actual entertainment value, or they just make the same game over and over and over again. Boring.
AMD does things a bit differently because they're still clawing for market share; they have to grab buyers' attention just the same as indie devs trying to make sales on their games. They'll become more like NVIDIA once their situation changes and they compete with them more evenly on the market.
But the only way AMD can gain that share is through pandering, since they can't win on what they offer technologically: for everything they have, NVIDIA usually already has something functionally better. All they have to work with is price/performance value, that's it. Once that changes, watch how similar to NVIDIA and Intel they become. Corporations are all the same.
I miss the days when people actually gave a crap about the gameplay itself; nowadays it seems like everyone's just chasing the next game for how it looks rather than how much entertainment value it brings. There aren't many games that make you want to play them endlessly anymore. People still play the classics, but very few games from the last few years will be revisited later down the line, because they didn't have much to offer to begin with.
Look, if you think that the 4070 Ti or 3070 Ti have an adequate amount of VRAM for their performance, then we simply disagree. I get your point that many games focus too much on graphics and not enough on gameplay, but I don't see why both can't go together. I like Capcom games that look great, Tekken at 4K, or The Last Of Us. I also want to play the new Final Fantasy. Not every AAA game is bad.
The industry has to be able to work with everyone if they want to maximise profit; that's why requirements for most games aren't completely ridiculous already.
The PS4 had 8GB shared between the CPU and GPU and did impressive things with it. It literally doesn't matter what the hardware has as long as developers properly optimise for it; you can have good graphics with 8GB dedicated to the GPU alone.
The point of the PS4 being 'impressive' for its time, despite its low shared memory capacity, was in terms of graphics anyway, not performance. :l
The PS5 offers 12GB for games, and yet we find ourselves having similar discussions about why less should be sufficient on PC.
I am personally surprised at how strongly we defend the idea that low VRAM is enough, when increasing VRAM is arguably one of the most cost effective ways to improve graphics.
Developers can just do their freaking job and produce a better game; that's what they're supposed to be paid to do, not churn out a bunch of boring games that have slightly better graphics than the last generation.
I've used GPUs with less than 12GB of VRAM for years and only hit a capacity issue once, and that was in Cyberpunk 2077 when I was experimenting with max settings at 1440p to see how well the RTX 3080 10G would fare. For years I never saw more than 8GB of usage, and I had my hands on an RX 580, 1070 Ti, 5700 XT, 2080, and a 2080 Ti; the average usage I was seeing was more in the 4~6GB range. 8GB being supposedly not enough is a fairly new concept that applies only to a few games with a higher emphasis on graphics quality, which is not the point of gaming.

Gripes about graphical quality are a rather new thing as well; youths have completely lost sight of what's actually important in the gaming scene. Graphics are temporary, good gameplay lasts forever. People are still playing older games that looked like ass, but in a few years most people will completely forget about these games that emphasized graphics so much that they lacked in actual gameplay and had low entertainment value.