Still - this generation sucks.
But frame generation is only in a handful of games? RT is as well...
Sure, frame generation is cool, but there are maybe one or two games with it that any given person will actually play. By the time more games support it, AMD will already have FSR 3.0.
New AMD cards can do ray tracing. Unless you want path tracing in Cyberpunk, in which case you may want a 4090.
Well, FSR 2 is inferior to DLSS, and I'm not expecting miracles from FSR 3. AMD's frame generation will also be interpolation-based, so it will likely be inferior as well.
RT Overdrive is playable (50-ish fps) at 1080p on a 4070, btw.
It's a trade-off.
I personally like to play older games in 4K, and the raw performance of AMD cards, which I can use in every game, suits me better currently.
A 5070 with the performance of a 4090 may change my stance.
If one is not interested in 4K gaming, then 12GB of VRAM will be perfectly enough.
More than a year until the 5070.
You can't be sure what will be enough. Some games already get close to 12GB at 1440p, and devs have only just started to move on from 8GB.
I bet some games at 1440p Ultra + RT will exceed that limit within two years. And if you don't care about RT, then why pay extra for Nvidia? AMD cards can already do better RT in some games because of Nvidia's VRAM limitations.
I'm also starting to believe that Nvidia actually planned obsolescence, considering the 3080 was originally planned with 20GB and some 3060s actually have 12GB.
If that's true, it's also a matter of principle not to buy their powerful but castrated products.
Additionally, I have a very niche problem with Nvidia: their current cards have only one HDMI port. I run a dual 4K TV setup and need two HDMI ports. Only Asus cards offer them, but they cost a lot more. I refuse to be scammed like that. It's a very strange limitation considering the rising popularity of 4K TVs as monitors.
Devs will not move on from 8GB for a long time, considering that the majority of players out there have either 6 or 8GB of VRAM.
This one poor port is not an indicator of devs moving anywhere.
77% of players, to be exact. No sane dev team is going to cut themselves off from 77% of the market right off the bat.
This will change, but we're not near that yet.
And games like Far Cry 6 will just tell you that your system does not support Ultra HD textures (or whatever they call them) if you have an 8GB GPU. It's fine in my book not to be able to run everything on ultra without top-end hardware, but mid-range cards should not be limited to the very low-quality textures this game offers (the Medium setting that fits in 8GB)...