8 GB of VRAM is sufficient for 1080p high-settings gaming, which is exactly what these GPUs are aimed at.
You will probably NEVER use the extra VRAM of the 3060 unless you play some specific game at max settings with ultra textures, which contribute very little to actual visual quality anyway. But you will definitely be using the extra 15-20% performance of the 4060 every day, and that matters most in this low-end class of GPU.
You also need to assess your power supply and monitor connector type against the specs of any intended purchase.
By the time every new release needs more than 8 GB, basically every GPU out right now will be too weak to get good framerates regardless.
That’s a very strong opinion.
I personally can't recommend an 8 GB GPU to people who want to use the card for the next 4-6 years.
Having a brand new, expensive PC that can’t run console games just doesn’t feel right.
There is no long-term value in any lower-end GPU: when the core can't keep up, VRAM capacity won't save it.
And there are other factors at play:
1. The RTX 4060 only draws around 115 W and doesn't get very hot, making it very viable for systems with older/weaker power supplies, as well as small-form-factor builds where heat is harder to deal with.
Compared to the RTX 3060, that's around a 30% reduction in power consumption with a 15-20% increase in performance (see the sketch after this list).
2. Compared to the 6700 XT, which is around 10% faster in rasterisation, the 4060 is similarly faster in ray tracing and carries full support for DLSS 3. Because NVIDIA already had Reflex, DLSS frame generation is better than the AMD equivalent. And since FSR is vendor-agnostic, NVIDIA GPUs can use either DLSS or FSR at the user's leisure, whereas on AMD you're stuck with FSR, or with nothing in a game that ships DLSS but not FSR. Both situations are common now.
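To put rough numbers on the efficiency point in item 1, here's a back-of-the-envelope sketch. It assumes the ~115 W figure quoted above plus the RTX 3060's ~170 W rated board power; illustrative arithmetic only, not measured data:

```python
# Rough perf-per-watt comparison using the figures quoted above:
# RTX 4060 at ~115 W vs the RTX 3060's ~170 W rated board power,
# with a 15-20% performance uplift. Illustrative arithmetic only.

power_3060, power_4060 = 170.0, 115.0  # watts
for perf_gain in (1.15, 1.20):
    ppw = perf_gain * (power_3060 / power_4060)
    print(f"+{perf_gain - 1:.0%} performance at {power_4060:.0f} W "
          f"-> ~{ppw:.2f}x perf/watt vs the 3060")
```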
If you want a GPU that'll actually last 5+ years and still perform very well, you're basically looking at a 7900 XT, 7900 XTX, 4080, or 4090. Even the 4070 and 7800 XT aren't going to hold up that well 5 years from now; that's 3080-level performance, which I'm quite familiar with because I use one. The industry doesn't want you to keep your GPU for as long as possible, they want you to upgrade, because it's a business. The further they push graphics quality forward, the bigger an issue that becomes for people on the low end who can't afford to pay $500+ for a GPU every few years, and it'll happen regardless of how much VRAM is supplied. But forced obsolescence is obviously okay with gamers because muh graficks.
https://youtu.be/Lcb1dpe2IhE?si=4Nw1AHckR655X4-L
Everybody will make their own decision, but recommending an 8 GB GPU today for playing new AAA games, even at 1080p, is too risky in my opinion. Textures alone make as much of a difference as all other settings combined, yet they require almost no computing power, just memory. Reducing texture quality is the worst compromise to make.
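On the "memory, not compute" point, here's a minimal sketch of why texture quality scales VRAM use so hard. It assumes uncompressed RGBA8 textures with a full mip chain; real games use block compression, which shrinks everything by a constant factor but scales the same way:

```python
# Rough VRAM footprint of one uncompressed RGBA8 texture with a full
# mip chain (~4/3 size overhead). Real games use block compression
# (BC1/BC7 etc.), which cuts this 4-8x, but the scaling is the same.

def texture_mib(side_px, bytes_per_px=4, mip_overhead=4 / 3):
    return side_px * side_px * bytes_per_px * mip_overhead / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB")

# Each step up quadruples the memory while costing almost no extra GPU
# compute, which is why texture quality is mostly a VRAM budget setting.
```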
The low end is full of compromise and both AMD and NVIDIA are intentionally holding it down.
You can easily avoid that just by not cranking every little setting and trying to run 8K textures on everything.
There's a big performance difference between the 4060 Ti 8 GB and 16 GB at 1080p in "The Last of Us," and the 8 GB card shows horrible textures in "Halo Infinite" even at 1080p. Some other games, like "Resident Evil" or "Forspoken" (not shown in this video), do the same thing and load worse textures to avoid stutters. In other games there's up to a 10% difference in 1% lows, suggesting more stutters with less VRAM, and 1% lows matter more to many people than the average number.
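For anyone unfamiliar with the metric, here's a minimal sketch of one common way "1% lows" are computed: averaging the slowest 1% of frames (benchmark tools vary in the exact definition; some use the 99th-percentile frame time instead). The frame times here are made up to mimic VRAM-induced hitching:

```python
# One common way to compute "1% lows": average FPS over the slowest 1%
# of frames. Benchmark tools vary in the exact definition. Frame times
# below are invented for illustration.

def fps_stats(frame_times_ms):
    slowest_first = sorted(frame_times_ms, reverse=True)
    worst = slowest_first[: max(1, len(slowest_first) // 100)]
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    low_1pct = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct

# Mostly smooth 10 ms frames (100 FPS) with occasional 40 ms hitches,
# the pattern you get when textures spill from VRAM into system RAM.
times = [10.0] * 990 + [40.0] * 10
avg, low = fps_stats(times)
print(f"average: {avg:.0f} FPS, 1% lows: {low:.0f} FPS")  # ~97 vs 25
```

A run like that averages near 97 FPS yet feels awful, which is why the 1% low number tells you more about stutter than the average does.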
The massive difference is in "A Plague Tale: Requiem" with RT on. The game runs at 60 FPS with 16 GB, but at 40 FPS with horrible stutters, dropping to 18 FPS, on 8 GB.
Fun fact: 1440p with DLSS Quality is usually easier to run than native 1080p, but it needs a little more VRAM. Ray tracing and frame generation also need more VRAM, and some mods do as well. Playing for longer than 10 minutes can consume more VRAM too, so not every benchmark will show the problem.
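A quick sanity check on that fun fact, assuming DLSS Quality's commonly cited ~1/1.5-per-axis internal render scale: the GPU shades fewer pixels than at native 1080p, while the output-resolution buffers and upscaler overhead are what nudge VRAM use up:

```python
# Pixels actually shaded at 1440p with DLSS Quality (internal render
# scale ~1/1.5 per axis, the commonly cited figure) vs native 1080p.

def render_pixels(width, height, scale=1.0):
    return int(width * scale) * int(height * scale)

native_1080p = render_pixels(1920, 1080)           # 2,073,600 px
dlss_q_1440p = render_pixels(2560, 1440, 1 / 1.5)  # 1,637,760 px

print(f"native 1080p : {native_1080p:,} px")
print(f"1440p DLSS Q : {dlss_q_1440p:,} px "
      f"({dlss_q_1440p / native_1080p:.0%} of the native-1080p work)")
```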
All of this is today. What about future games? Devs have only just stopped making cross-gen games for the PS4 and its 8 GB of shared memory. If you think 8 GB is enough, then that's absolutely fine; it is fine for most people and most games.
The question is: Is 8GB a good recommendation for people who want to play popular new games for the next 4-6 years?
If the card doesn't meet the user's needs, then they shouldn't buy it. That doesn't mean the card is bad just because it doesn't have all the VRAM.