Multiplatform games are the wrong approach; we should have PC-only games.
But if you do make them multiplatform, then when porting them to PC, alter how they use memory: don't treat VRAM like a console's shared RAM pool, use that system DDR too!
And the amount of RAM on cards is just fine:
900 series in 2015
low end : 960 : 2gb
mid end : 970 & 980 : 4gb
high end : 980ti : 6gb
extreme : titan x : 12gb
if we presume upgrading every 2 generations:
2000 series in 2019
low end 2060 : 6gb
mid end 2060s, 2070, 2080 : 8gb
high end 2080ti : 11gb
ultra : titan rtx : 24gb
a nice doubling of VRAM.
4000 series in 2023
low end : 4060 : 8gb
mid end : 4060ti to 4080s : 12 or 16gb
ultra : 4090 : 24gb
still pretty much double.. but the 4080s at 16gb should have gotten 24gb.. and the 4090.... 48gb..
all other models have a perfectly fine amount for their class.
So it's really only at the high end that Nvidia has been ripping people off with too little RAM.
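The doubling claim above can be sanity-checked with a quick sketch. The tier labels and VRAM figures are taken straight from the lists in this post; the high-end row uses the 980ti, 2080ti, and 4090:

```python
# VRAM (GB) per tier across the three generations discussed above
vram_by_tier = {
    "low":  [2, 6, 8],    # 960 -> 2060 -> 4060
    "mid":  [4, 8, 16],   # 970/980 -> 2070/2080 -> 4080
    "high": [6, 11, 24],  # 980ti -> 2080ti -> 4090
}

for tier, gbs in vram_by_tier.items():
    # multiplier from each listed generation (2015 -> 2019 -> 2023) to the next
    ratios = [round(b / a, 2) for a, b in zip(gbs, gbs[1:])]
    print(f"{tier:>4}: {gbs} step-ups: {ratios}")
```

Only the mid tier actually doubles both times ([2.0, 2.0]); the low end went [3.0, 1.33] and the high end [1.83, 2.18], which is roughly what the post complains about.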
A GPU with about half as much memory as the system RAM has always been kind of a benchmark:
an 8GB GPU with 16GB of system RAM was a good mix;
now a 12-16GB GPU with 24-32GB of system RAM is still a good mix.
Why would you prefer lower quality textures and more stutters in an already overpriced GPU?
I think gamers are starting to forget what games are actually about... you know, the gameplay?
I've been using a 3080 10GB for the last several years, and you don't hear me complaining about the 10GB buffer because I haven't had any issues with it. For most of that time I watched VRAM usage and almost never saw it actually go above 8GB. The only game where I noticed it did, and where it was also able to use 100% of the 10GB, was Cyberpunk at 1440p without DLSS and settings cranked, because I was comparing DLSS on vs off.
And what you fail to understand is that even if the 3070 had 12+ GB, it would still be too weak to handle those games at the same level of detail as the 6800, because it's up to almost 30% slower regardless; that's why the 6800 performs so much better. The extra VRAM would've been wasted on it because it's just a weak card. People are still mad about the 3070, but more memory wouldn't have made a positive difference to actual performance in frames per second, it just would've meant less texture compression.
When game developers focus too much on higher-end hardware every time something new comes out, they end up leaving out the poorest demographics in gaming, when they could just do their job and optimise their games properly so everyone can enjoy them, not just the people buying GPUs with 16GB+ VRAM and overkill GPU cores. Video games are about the GAME, not flashier graphics every 2 years.
And it's not like they're mustache-twirling cartoon villains trying to defraud us, it's just that there's such an absurd amount of money sloshing around in the AI space right now that they'd be stupid to not squeeze that bubble for all it's worth.
Sadly no video output, only for AI Data Centers. It's $37,000. lol
They paid as much as they did with the expectation and guarantee that it WILL run every game at the highest resolution. And WITHOUT needing to rely on crutch tech like FG just to get there. I don't remember anyone mocking Titan, 1080 Ti, or 2080 Ti users for expecting the best performance for having the best cards. That's why they ♥♥♥♥♥♥♥ paid $1K-$2K for. It wasn't just so they can jerk off to having the best GPU. It's no surprise then that even mid-range cards aren't delivering on what they cost these days in relation to their performance.
But yeah, many newer games use more than 8GB of VRAM even at 1080p ultra settings.
this
especially laptops
Paying $2,000-$3,000 for a laptop with an RTX 4070 that only has 8GB of VRAM is laughable,
while the 4050 only has 6GB.
But I guess they're still better than Apple...
Again, no, not at all. That's a concept you made up.
This is one of those moments where you see what you think is a pattern and invent a reason for it in your head.
32MB of VRAM was common for people who had 384MB of system RAM. Even 64MB of VRAM.
128MB of VRAM was also common for people with 384/512MB of RAM. (I even ran such a setup for years).
People were running 256MB VRAM cards with 1GB~2GB of RAM.
By the time people were running 1GB+ VRAM cards they were already running 4GB~8GB of RAM.
And this isn't even counting the 320MB, 640MB, 768MB, 896MB, 1280MB, 1536MB, VRAM GPUs. At no point in time were people ever running double those amounts for system RAM.
There is no correlation or optimal ratio between the two, period. You can even run with MORE VRAM than system RAM (which people have already done for quite some time) and still get the full benefit, because they aren't related.
And as for consoles, again no. No such ratio between the two.
For most consoles the trend has been one shared pool of memory, and if not it was never a 2:1.
The PS1 had 1MB RAM and 1MB VRAM. That's a 1:1 ratio.
The PS2 had 4MB of main VRAM and 32MB of RAM. That's 8:1.
The PS3 had 256MB and 256MB. That's 1:1 again.
The PS4/4 pro/5/5 pro all use a single RAM pool.
The Xbox had 1 RAM pool as did the 360 and the Xbone.
The XBSX and XBSS are the only asymmetrically configured Xboxes to exist, with a BIGGER pool of faster RAM used as VRAM than the slower RAM used as system RAM.
The N64 had a single pool of memory.
The GameCube had 3 pools of memory, but to sum it up, the faster pools made up most of the memory; again, the opposite of your claimed ratio rule.
The Wii, again with a weird configuration, was about 3:1.
The WiiU had two pools 1GB for system junk and 1GB for games in a unified manner.
The Switch has a 4GB pool of memory.
This trend pretty much continues.
VRAM amounts have NOTHING to do with system RAM amounts. There is no ratio. Period. It's not a thing.
The only things that dictate how much VRAM a card should have are what modern games need (now and for the foreseeable future) and what resolutions people play at. It has NOTHING to do with a ratio.
Even 1080p needs 12GB+, and 1440p needs 12GB~16GB. None of this increases the need for system RAM; you can literally have a 20GB card playing at max settings, 4K, 240hz and not be held back AT ALL by 16GB of system RAM.
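The split-pool consoles listed above can be tallied the same way. A quick sketch using the figures from this post (shared-pool machines are skipped, since a unified pool has no RAM:VRAM ratio at all):

```python
# system RAM vs VRAM (MB) for the split-pool consoles listed above
split_pool = {
    "PS1": (1, 1),      # 1MB RAM, 1MB VRAM        -> 1:1
    "PS2": (32, 4),     # 32MB RAM, 4MB GS eDRAM   -> 8:1
    "PS3": (256, 256),  # 256MB XDR, 256MB GDDR3   -> 1:1
}

for name, (ram, vram) in split_pool.items():
    print(f"{name}: {ram / vram:g}:1")
# no two machines agree with each other, let alone with a 2:1 rule
```

Which is the whole point: the ratios bounce between 1:1 and 8:1 with no pattern, so there's nothing to extrapolate a PC "rule of thumb" from.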