It is kinda the devs' fault for not optimizing the games for PC,
not using the system RAM/VRAM as effectively as they could,
but instead making the game run well on console and assuming an over-specced PC can do the same tricks.
A PC moves data to system RAM first, decompresses it, and then holds it there until it needs to be copied to VRAM.
System RAM is too slow for graphics, which is very noticeable when you run out of VRAM or on computers with shared memory.
When a PC has shared memory it still dedicates some of it to the GPU, like some sort of partition, and needs to copy the same data from the CPU portion to the GPU-dedicated part of memory. It's extremely inefficient.
Consoles don't need system RAM in the same way, as data can be copied directly from a fast NVMe drive thanks to DirectStorage-style streaming, unified memory and a dedicated hardware decompression chip. So consoles can use the NVMe as RAM, in a way.
PC has DirectStorage, but without a hardware decompression chip and without unified memory.
Unified means that the CPU and GPU can access the same data in the same memory without dividing it into a CPU part and a GPU part. Very different from shared memory. Not to mention the speed difference between DDR5 and GDDR6.
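To make the extra-copy point concrete, here's a toy back-of-the-envelope sketch. All numbers, function names and the 2:1 compression ratio are made up for illustration, not taken from any real API or benchmark:

```python
# Toy model of the two copy paths described above.
# Sizes and compression ratio are arbitrary illustrative values.

def pc_path_bytes_moved(compressed_mb: float, ratio: float = 2.0) -> float:
    """Classic PC path: NVMe -> system RAM -> CPU decompress -> PCIe -> VRAM."""
    decompressed = compressed_mb * ratio
    nvme_to_ram = compressed_mb     # read compressed asset into system RAM
    decompress_copy = decompressed  # CPU writes decompressed data back to RAM
    ram_to_vram = decompressed      # copy over PCIe into VRAM
    return nvme_to_ram + decompress_copy + ram_to_vram

def unified_path_bytes_moved(compressed_mb: float, ratio: float = 2.0) -> float:
    """Console-style path: NVMe -> hardware decompressor -> unified memory, once."""
    return compressed_mb + compressed_mb * ratio

asset = 100.0  # MB of compressed texture data
print(pc_path_bytes_moved(asset))       # 500.0 MB shuffled around in total
print(unified_path_bytes_moved(asset))  # 300.0 MB, no duplicate copies
```

Even in this crude model the PC path moves the same decompressed data twice, which is the inefficiency the post is describing.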
Anyway…
A 3080 10GB having less memory than a 3060 12GB is absurd.
A 4070 Ti having as much memory as the lower-tier, generation-older 3060 is unacceptable.
VRAM is not made of gold and is not that expensive.
Personally I only like GPUs with two digits for the amount of VRAM. I can easily use 10+ GB of VRAM, so I'd prefer 16 or more. But you don't really need more and more for GPUs the way you do for system RAM, though it's nice if the units above the 50 for a series, xx60 and higher, come with 12 GB or more.
Units with 8GB and below are usually mid-range gaming or less; 1/2/4 GB are usually for basic use, maybe light games.
So it's about the job/performance, price etc. Though I would prefer no less than 16GB for the xx70 and higher lineups.
Worst thing is a lot of them are 4K capable, but they actively get smothered, likely to protect cards like the 4080.
I just went with AMD. Experience has been good, just a few games that would have worked better on Nvidia if I wanted ray tracing, but the 6950 XT does its job.
Now I am building an 8700G system for a low-powered second PC. I already saw someone play Stalker 2 at 30-40 fps with FSR balanced, while the comparable GPU, the 1650, bites the dust being heavily VRAM limited, running under 10 fps at FSR performance.
The 780M in the 8700G can allocate up to 16 GB for VRAM. It's a 32 GB build, so you can guess that I'll fully utilize the 16 XD
The worst news though is that Nvidia is artificially inflating the GPU market again with a so-called GPU shortage for the launch of the 50 series, and there will be a price hike again.
8GB in the low end is fine; extra VRAM will just jack up the price even more, and the cards usually don't perform well enough to properly take advantage of the excess VRAM to begin with. The 7600-XT and 4060 Ti 16GB are both plagued by that.
The complaints are rooted in the fact that AMD offers 16GB in the same price area as the 4070, but that's because they don't have a choice. They're competing, and AMD doesn't have anything to lean on, they either deliver a good product or they crash. If nVidia had offered 16GB with the standard 4070 then nobody would be complaining.
Trying to equate VRAM to system RAM? There's no fixed ratio whatsoever; that's not how this works.
Halo products never break 1~2% of the population. Not sure why people have brain worms and pretend everyone is out buying 4090s.
Releasing *80 tier and below cards is more than enough, especially considering people mostly buy in the *60 tier.
Wow, just. WOW.
Yeah, no duh people wouldn't be complaining about VRAM on the 4070 if it had an appropriate amount. That's literally the issue.
Second, you claim AMD is doing the same thing, but no, they aren't. You have to go all the way down to the 7600 to find 8GB of VRAM, and that's actually a card nobody should buy regardless of VRAM.
Everything above it has 16GB or more of VRAM. No, 8GB at the price points Nvidia sells at is not fine. Comparing the 8GB vs 16GB models already shows you lose 34% performance when you only have 8GB of VRAM at 1080p.
Trying to say these cards can't benefit from more VRAM, or that it'll be pricier, or whatever nonsense is full blown COPE and nothing more.
It's just as dumb as the clown show that is recommending the 4070 over the 7800xt when the 7800xt is flat-out a better card.
The 7800xt released for $100 less, has 50% more VRAM, beats the 4070 handily in raster, and goes toe to toe in RT.
But I'm sure you'll find some illogical reason as to why the better card is actually worse.
I spent 20 years buying from Nvidia and ironically they chose to offer the worst competition in cards around the same time AMD got their ♥♥♥♥ together so I dipped.
Intel didn't invest in R&D and sat on their laurels, and AMD took their sandwich, so I dipped.
If AMD does a ♥♥♥♥♥ wucky I'll drop them too but as of right now Nvidia offers less VRAM, less raster, and worse Linux support for more money. Why would I buy from them?
> Nobody should be buying the 7600 regardless of VRAM
>> but people mostly buy x60 tier
Watching your mind at work is entertaining. Most people can only afford to buy the lower end tier and somehow they're wrong for that.
And yes AMD does do the same thing, the 7700-XT is a x70 tier card with 12GB yet the 7600-XT is 16GB... because they mirrored nVidia to cut costs on the 7600 and 7700-XT and compete with the 4060 Ti 16GB.
Gotta love how people get mad at nVidia for things but then completely overlook that AMD does the same things.
Wrong, GamersNexus' testing easily concluded that the 7600-XT and 4060 Ti 16GB were unable to actually give great performance when running as much of their VRAM capacity as possible because they're too weak for that VRAM buffer.
MOAR VRAM!! does not mean MOAR BETTER!! when the core is too weak to keep up in the first place.
So, math isn't your strong suit.
*60 != *600 but thanks for the clown show.
Second, the 7600 non XT exists as a 4050 tier card NOT a 4060 tier card.
I also love how your "AmD dOeS tHe SaMe ThInG!!!" is AMD starting their lower mid end with 16GB and 12GB while Nvidias is 8GB.
Sorry, not the same thing.
Also, you kids have got to STOP with the bizarre belief that a card somehow must have some magical speed property to not be held back by VRAM limits.
VRAM is a resource pool, if you don't have enough you hit system RAM and your performance tanks.
https://www.youtube.com/watch?v=nrpzzMcaE5k
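The mechanics behind "performance tanks" can be sketched with a simple harmonic-mean bandwidth model: any part of the working set that spills out of VRAM gets served over PCIe instead. The bandwidth figures below are assumptions I picked for illustration, not measurements of any particular card:

```python
# Toy model: average read bandwidth when part of a GPU's working set
# spills out of VRAM into system RAM over PCIe. Numbers are illustrative.

VRAM_BW = 400.0  # GB/s, roughly midrange-card territory (assumed)
PCIE_BW = 16.0   # GB/s, PCIe-link-ish speed (assumed)

def effective_bandwidth(working_set_gb: float, vram_gb: float) -> float:
    """Harmonic mean: the spilled fraction is served at PCIe speed."""
    in_vram = min(working_set_gb, vram_gb) / working_set_gb
    spilled = 1.0 - in_vram
    return 1.0 / (in_vram / VRAM_BW + spilled / PCIE_BW)

print(round(effective_bandwidth(8.0, 8.0), 1))   # fits in VRAM: 400.0 GB/s
print(round(effective_bandwidth(10.0, 8.0), 1))  # 20% spilled: 69.0 GB/s
```

Spilling just 20% of the working set cuts average bandwidth by over 80% in this model, which is why frame rates fall off a cliff rather than degrade gently.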
And this isn't even the only benchmark of this issue, either.
Thanks for fanboying SO HARD and not being bothered to even google.
The VRAM difference is literally the difference between 4K and 1080p in a lot of games.
Games like Ratchet & Clank lose 34% performance with 8GB cards even at 1080p.
It's literally a case of buy from AMD, or spend more and do less with Nvidia.
Still, the additional 8GB of VRAM makes little difference. 10% max
The 3060 has 12GB and it's a good amount. The 3070 has 8 and can't even run some games at 1080p, or games end up with missing textures.
Prices according to pc part picker US:
3060 12GB - $265
3070 8GB - $430
4060 8GB - $284
4060Ti 8GB - $360
4060Ti 16GB - $430
4070 12GB - $490
4070 S 12GB - $590
4070Ti 12GB - $690
4070Ti S 12GB - $740
4080 (super) 16GB - $950
6600 8GB - $189
7600 8GB - $230
7600XT 16GB - $280
6700XT 12GB - $270 (nice)
6750XT 12GB - $290 (nice)
7700XT 12GB - $370
7800XT 16GB - $440
7900GRE 16GB - $530
7900XT 20GB - $620
7900XTX 24GB - $830
After the 8GB version it was not possible to offer 12GB, as there were no 3GB memory modules at the time, so you had to double up to 16GB.
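The constraint comes from memory-channel arithmetic: capacity is the number of 32-bit GDDR6 channels on the bus times the capacity per chip. A sketch, assuming standard 32-bit channels (the example card mappings in the comments are my own, not from the thread):

```python
# VRAM capacity = (bus width / 32 bits per GDDR6 chip) * capacity per chip.
# With only 1GB and 2GB chips available, a 128-bit bus can't land on 12GB:
# it's 8GB with one 2GB chip per channel, or 16GB in clamshell mode
# (two chips sharing each channel). 3GB chips would have allowed 12GB.

def vram_gb(bus_width_bits: int, gb_per_chip: int, chips_per_channel: int = 1) -> int:
    channels = bus_width_bits // 32
    return channels * gb_per_chip * chips_per_channel

print(vram_gb(128, 2))     # 8  -> the 8GB configuration
print(vram_gb(128, 2, 2))  # 16 -> the doubled (clamshell) configuration
print(vram_gb(128, 3))     # 12 -> only possible once 3GB chips exist
print(vram_gb(192, 2))     # 12 -> a wider 192-bit bus gets there with 2GB chips
```

So on a fixed 128-bit bus the only step between 8GB and 16GB would have required chip densities that didn't ship at the time.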
Also there are already games using 12GB at 1080p.
Many of them did the "smart" thing and hid the VRAM limitation by secretly not giving you good textures, ignoring your texture settings.
Here is a difference between 8 and 16GB.
https://www.youtube.com/watch?v=Rh7kFgHe21k
Why would anybody defend low VRAM when it's arguably the cheapest way to improve performance, visuals and stutters, and to lower development/optimisation time for devs? Games would feel so much better optimised if GPUs had more VRAM. One of the reasons games can stutter is when they keep decompressing, loading and unloading data within a small memory pool.
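That churn is easy to see with a toy LRU texture pool. The sizes, texture counts and 100-frame loop below are arbitrary illustration values, not from any engine:

```python
# Toy LRU texture cache: when the pool is too small for the per-frame
# working set, every frame evicts and reloads textures (churn), which is
# one way the decompress/load/unload stutter described above shows up.

from collections import OrderedDict

class TextureCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()
        self.loads = 0  # counts expensive decompress-and-upload events

    def use(self, tex_id: int) -> None:
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)  # cheap: already resident
            return
        self.loads += 1                     # miss: decompress + upload
        self.cache[tex_id] = True
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

frame = list(range(10))           # 10 textures touched every frame

big = TextureCache(capacity=10)   # pool fits the whole working set
small = TextureCache(capacity=8)  # pool is just 2 textures too small

for _ in range(100):              # simulate 100 frames
    for t in frame:
        big.use(t)
        small.use(t)

print(big.loads)    # 10: each texture loaded once, then stays resident
print(small.loads)  # 1000: constant churn, every access reloads
```

A pool only slightly too small doesn't cause slightly more loads; with LRU and a cyclic access pattern it degrades to reloading everything every frame, which is exactly the pathological case the post describes.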