I have hardware that is enough for me, but why is what you said a problem?
Thank you for reading.
:-D
From a 4070 and under, sure
From a 4080/90, I won't be unless they release a 24gb 5080Ti
It all depends on what you're trying to accomplish and whether or not you have the money to burn
Everyone's opinions will be different
Speaking of raster, of course.
AMD is still the best value and that card is soon to be last generation, too.
FSR can be good at 4K, but it can also be absolutely atrocious. I've seen games where FSR native (just for anti-aliasing, not upscaling) made the image worse.
I hope FSR4 will be absolutely amazing or I may switch back to Nvidia.
I can't believe how easily some are influenced just by marketing arguments and manufacturers' "artificial benchmarks".
I have an RTX 3080; I play all games maxed out with ReShade and a LUT on top of it.
Don't need more, don't need "better" because... it just works perfectly!
This is like when smartphone addicts need to follow hype waves and buy 1500-buck smartphones every 2 years... while others with a brain use a 300-buck smartphone for 5 years and aren't any less happy...
I recently played FF16, Jedi Survivor and the Monster Hunter Wilds beta, which used over 12GB at 4K+FSR (upscaled 1440p).
Jedi Survivor at native 4K can use around 20GB and it does look good.
As for the 5090, it's up to 30% faster at times in 4K. If you own a good 3rd-party 4090, the 5090 really is not that much of an upgrade; if you don't, it's a pretty sweet upgrade. I'm waiting on the 60 series; if I didn't own a beast of a 4090, I'd jump on the 5090.
No, they didn't. If you bought said cards for 4K, you screwed yourselves, not to mention they are 4 years old now and you are looking at new games, though only a few.
Want to play at 4K with ridiculous texture resolution? Pay for it, it's that simple. I have zero sympathy for those who bought underpowered cards for their use case.
Heck, just buy a new card; if you cannot budget for a new card over 2 to 4 years, you have far bigger problems than VRAM usage!
Why are you even trying to push any form of 4K on those cards? Plus, upscaling uses more memory.
It's time people take responsibility for their actions.
8GB is fine to play 99.999% of games at sensible settings for hardware with 8GB of VRAM.
I don't know, 30-40% is substantial.
Sadly, it looks like I won't be getting one; a huge storm stripped where I'm living of power, Internet and mobile service for at least the next week, so I guess I'm waiting on restock in a month lol.
Also, it's really freaking cold lol
I’ve seen plenty of disappointed players who can’t run some newer games due to VRAM limitations. They paid extra for their GPUs and got less memory. Even if Nvidia made a mistake with the 3060 12GB, it still shows that they could make affordable cards with sufficient memory.
Maybe if more players had 12GB on PC, then devs wouldn’t have to de-optimise PS5 ports made for 12.5GB at upscaled 1080p down to 8GB at native 1440p. Maybe we wouldn’t have such high CPU requirements for data streaming and every second game shipping with traversal stutters.
Just because most games run on 8GB doesn’t mean it’s the optimal hardware configuration, the same way the Nintendo Switch being able to run 100% of Switch games doesn’t make it optimal gaming hardware when devs have no choice but to butcher their games.