I wonder why my 6800 handles games with ease while the 3070 just crashes (RE4, The Last of Us) or won't load textures properly (Hogwarts Legacy, Forspoken).
Enjoy your butchered products, everyone!
The 4070 also sells well, last I heard :D
Also, VRAM doesn't matter as much as you think. Maybe it matters a lot when a developer can't manage VRAM properly. But those are bad developers.
Last I checked, software may drive hardware, but it's the job of the ENGINEER to make the most of what they have. Games likely come out unoptimized because of developers who lean on lazy, automated tools instead of actual coding.
If that's the type of game and developer, I won't be missing their games.
There's no must-play game from 2023 onwards anyway. Just games that try to shake you down while delivering lower quality and lesser standards than in the past.
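To make the "manage VRAM" point concrete, here is a minimal sketch of the kind of budget check an engine can run before streaming textures. It assumes Windows with DXGI 1.4+ and queries adapter 0; the program and what a real streaming system would do with the numbers are purely illustrative, not taken from any of the games mentioned above.

```cpp
// Minimal sketch: query the process VRAM budget via DXGI 1.4 (Windows only).
// Assumes the dedicated GPU is adapter 0.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    // Budget = how much VRAM the OS currently lets this process use;
    // CurrentUsage = how much it is using right now.
    std::printf("VRAM budget: %.1f MiB\n", info.Budget / (1024.0 * 1024.0));
    std::printf("VRAM in use: %.1f MiB\n", info.CurrentUsage / (1024.0 * 1024.0));

    return 0;
}
```

A streamer that reacts when CurrentUsage approaches Budget can drop texture mips on an 8 GB card instead of crashing or leaving textures unloaded, which is roughly the difference between "good" and "bad" memory management being argued about here.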
I wouldn't know, I don't care about GPUs below my RTX 4090. The RX 6800 XT is pretty much a low-end GPU; it was even weaker than my previous, outdated 3080.
That's your problem, not the game. All games need a lot of VRAM now, especially Hogwarts Legacy maxed out at 4K with maximum ray tracing; it even makes the budget 3080 Ti with its tiny 12 GB of VRAM fall to its knees.
The port is absolutely terrible. One of the worst PC ports of all time. You've got a point about the VRAM, though.
That's what happens when you have an outdated PC: you play old games for life.
Every triple-A studio will start new projects designed to barely work with 8 GB.
- They will make you believe that 8 GB is 'budget'. A lot of gamers are all too happy to dunk on and shame the 8 GB crew.
- The games industry is 'creatively' dead - substituting ray/path-tracing fidelity for gameplay.
- The GPU industry is all too keen to accommodate the hardware that feeds the triple-A narrative.
People choose NVIDIA because of brand recognition (and for some of us, the cards are optimized to work with NeMo and other audio/video AI projects). For gamers, AMD's latest products make great alternatives (check the YouTube GPU reviewers).
AMD is behind: you pay less, you get less. Only Intel and NVIDIA are the best 💪
AMD cards literally require more VRAM than Nvidia cards to do exactly the same thing.
https://www.youtube.com/watch?v=V2AcoBZplBs&t=382s
But it's only a 0.5-2 GB difference (depending on the settings).
I wouldn't say it's over. I mean, maybe if you're the type who also likes 69.99-79.99 games, you'll enjoy the games that require that much VRAM.
But I predict that most games won't, because it's incredibly dumb to cut off a large pool of potential customers.
Also, not everyone plays in 4K. As for ray tracing, UE5 has non-Nvidia implementations that stress the hardware less and look better IMO. UE5 is what a LOT of games are being built on, and it's being heavily optimized by graphics experts around the globe and by AMD.
You can check out UE5, or if you want to test, learn, or help, go here:
https://gpuopen.com/unreal-engine/
Developers using UE5 will be able to increase performance with a few clicks. At least that is the goal...
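As a rough illustration of what those "few clicks" map to under the hood, here is a hedged sketch that flips two standard UE5 console variables from game code. The function name ApplyLowVramProfile and the specific values are made up for the example, and the snippet assumes it lives inside an Unreal Engine 5 game module.

```cpp
// Hedged sketch: switch Lumen to its software ray tracing path and cap the
// texture streaming pool at runtime via UE5 console variables.
// Assumes compilation inside an Unreal Engine 5 module.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void ApplyLowVramProfile()
{
    // Prefer Lumen's software ray tracing (no hardware RT required).
    if (IConsoleVariable* HardwareRT =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        HardwareRT->Set(0, ECVF_SetByCode);
    }

    // Cap the texture streaming pool (value in MiB) so an 8 GB card is not over-committed.
    if (IConsoleVariable* PoolSize =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Streaming.PoolSize")))
    {
        PoolSize->Set(2048, ECVF_SetByCode);
    }
}
```

The same variables can also be set per-project in config files or typed into the in-game console, which is closer to the "few clicks" most teams would actually use.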
NVIDIA's 4000 series comes with 12/16/24 GB of VRAM. Why buy NVIDIA? Because DLSS is much superior to FSR.
https://twitter.com/Arokhantos/status/1649504449574961158/photo/1 sums up my thoughts exactly.