If you use a 1080p panel, I wouldn't worry about VRAM that much.
At 1440p I would make sure to have at least 8GB.
Nowadays at 4K I would probably not recommend anything under 12GB, but 16GB is preferable.
Make sure you take into account any monitor upgrades you might make when considering a GPU.
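For a sense of scale on those numbers: the framebuffers themselves are tiny compared to total VRAM; it's textures and other assets that eat the bulk of it. A quick back-of-the-envelope sketch (assuming 32-bit color and triple buffering, which are illustrative assumptions, not measurements from any specific game):

```python
# Rough color-framebuffer memory per resolution (RGBA8 = 4 bytes/pixel).
# Real games also keep depth buffers, intermediate render targets,
# and above all textures in VRAM, which is where the gigabytes go.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB for triple-buffered color")
```

Even at 4K that's under 100 MB, which is why the per-resolution VRAM guidelines above are really about the texture/asset budgets games target at each resolution, not the framebuffer itself.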
I would get the best card that fits in your budget (within reason). 6GB of VRAM is a good starting point for a card aimed at 1080p.
Also, I'd say 8GB is a good starting point for 1080p. The reason is that game requirements slowly creep upward; once 4GB was enough, and before that 2GB, and so on.
You have got to be kidding me! Have you ever even played games in 4K? There are no games currently that require that much VRAM. Even at 4K.
8 to 10GB of VRAM is enough currently. Even at 4K.
I think you people see how much VRAM these new GPUs have and assume that is how much you need in games.
8GB of VRAM is still the sweet spot. And even at 4K, I haven't really seen any games that require much more than 10GB.
But here you go:
MSI GeForce RTX 3060 VENTUS 2X OC
P.S.
All open-world games need a lot of VRAM.
If you're just playing a Heroes of Might and Magic type of game... then don't worry about it.
GL
I haven't actually played games at 4K and was going by what I've heard, plus my own experience. I use a 1440p panel and have an RTX 3080 with 10GB of VRAM, and I would not want to run a card with anything lower at that resolution. For example, in Resident Evil 7 and the Resident Evil 2 remake, if you max out the texture quality, you will notice microstutter with an 8GB card. I used to run a 2070 Super at 1440p, and it was not a pleasant experience in this scenario.
Also, could you please recommend a 4K card that has 8GB of VRAM? If OP were to buy a card with 8GB, that would be a mistake if he were using a 4K panel (assuming he wants to play the latest games).
Maybe I'm also a little biased by my own use cases, such as the fact that I often mod games like Skyrim; even 10GB isn't enough at 1440p in some situations.
I would look at what you want to spend and what level of performance you want, and then from there consider the other stuff like VRAM. VRAM does matter, mind you; I'm not saying it doesn't. NVIDIA is being stingy with it lately, so I can see a case for choosing an AMD alternative in some scenarios based on that. But you still don't want to buy a GPU primarily based on VRAM first, and system RAM certainly has nothing to do with how much VRAM you should get.
If that's the minimum/normal amount of VRAM from the GPU makers these days, game developers will surely take this into consideration when developing their games.
I have actually been using a 3070 Ti with 8GB of VRAM and haven't had an issue. My target is 60FPS, and I am able to maintain that with no problem.
Even in Resident Evil 7 and 2, which I both own, I am able to max out everything, even texture quality, and there are no issues. Not even microstutter. The game will give me a warning that I do not have enough VRAM, but I am still able to play without issues. It is, after all, just a warning; it doesn't affect gameplay.
And even if VRAM usage goes over 8GB in some games, I have not seen any games where it goes over 10GB, not even at 4K. And even when it goes over 8GB in those games, I do not notice any performance issues. It just dips into system RAM after that, but like I said, that has not affected performance.
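For a sense of why spilling into system RAM can cause stutter in other setups, compare rough bandwidth figures. These are approximate spec-sheet numbers I'm assuming for illustration (~608 GB/s for a 3070 Ti's GDDR6X, ~32 GB/s for a PCIe 4.0 x16 link one way), not measurements:

```python
# Rough bandwidth comparison: on-card VRAM vs. the PCIe link the GPU
# falls back on when textures spill into system RAM.
# Figures are approximate theoretical values, not benchmarks.
vram_bandwidth_gbs = 608   # assumed: RTX 3070 Ti GDDR6X, 256-bit bus
pcie4_x16_gbs = 32         # assumed: PCIe 4.0 x16, one direction

ratio = vram_bandwidth_gbs / pcie4_x16_gbs
print(f"VRAM is roughly {ratio:.0f}x faster than the PCIe link")  # ~19x
```

Whether that gap shows up as visible stutter depends on how much data spills and how often the game touches it, which may be why some people notice it and others don't.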
So there is currently no need for 16GB or more of VRAM, even at 4K. Now, I would much rather have a 3080 or better for 4K, but when I was building a PC in 2021, the 3070 Ti was all I could get near MSRP. The reason I would want a 3080 or better for 4K is not the added VRAM, but the overall performance of the GPU.
But like I said, even at 4K with my 3070 Ti, I have been able to pretty much max out every game I own and maintain 60FPS. I also monitor everything, and I have not seen many games that use much over 8GB of VRAM, let alone 16.
Sure, I'm glad it worked out for you. However, in my experience with the 2070 Super, I was getting microstutter when maxing out the texture quality in Resident Evil 7. Maybe it's something to do with the fact that the 3070 Ti has GDDR6X memory while the 2070 Super uses GDDR6, or maybe the stutter is just not as noticeable at 60FPS; I think I was running at higher framerates than that.
For me personally, I would not purchase a card that I intend to use at 4K with anything under 12GB of VRAM.
Just try setting your graphics higher in an open-world game.
And 4K gaming is bull... There is no graphics card that can manage normal FPS without DLSS (upscaling).
Simple games like Rocket League, sure, but a AAA game? Forget it...
For example:
My previous EVGA GTX 1060 SSC 6GB was limited in Middle-earth: Shadow of War, but a 1070 8GB was able to max it out because of the extra 2GB of VRAM. So having extra VRAM available can make a difference in certain, if not most, situations.