There's forward rendering and deferred rendering, if you wanna google that. Forward rendering usually uses a single framebuffer. Deferred rendering uses a G-buffer: multiple buffers the size of your render resolution, plus additional buffers to compute the lighting, shadows, reflections and all the rest. That's the main difference in VRAM consumption.
In general, the higher your resolution, the more VRAM you need for the framebuffers or G-buffers, but also the more pixels the GPU has to compute. Keep that in mind: the GPU has to compute all of it, and it may not have enough power to do so in time to hit consistent frame times or FPS in general.
So VRAM is half of the "problem". Whatever VRAM is left after those framebuffers or G-buffers is what you can use for textures. On 8 GB cards you may not fit high textures into that space if the resolution of those buffers is too high. You gotta balance that.
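To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The G-buffer layout (albedo, normals, material parameters, depth, HDR lighting target) and the bytes-per-pixel figures are assumptions for illustration only; real engines use different layouts, but the point is that the cost scales with pixel count.

```
# Back-of-the-envelope estimate of deferred G-buffer VRAM at a given
# render resolution. The attachment layout below is a hypothetical
# example; real engines differ, but the cost scales with pixel count.

# Assumed render targets and bytes per pixel:
#   albedo (RGBA8)                 4 bytes
#   normals (RGBA16F)              8 bytes
#   material params (RGBA8)        4 bytes
#   depth/stencil (D32S8, padded)  8 bytes
#   HDR lighting target (RGBA16F)  8 bytes
BYTES_PER_PIXEL = 4 + 8 + 4 + 8 + 8  # 32 bytes per pixel

def gbuffer_mib(width: int, height: int, bpp: int = BYTES_PER_PIXEL) -> float:
    """Return the estimated G-buffer footprint in MiB for one frame."""
    return width * height * bpp / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{gbuffer_mib(w, h):.0f} MiB")
```

Under these assumptions that works out to roughly 63 MiB at 1080p, ~113 MiB at 1440p and ~253 MiB at 4K, and that's only the G-buffer itself: shadow maps, reflection and post-processing targets and the textures come on top, which is the balancing act described above.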
true
I only have 6 GB and I can run the game fine with just under 10 GB in the red; it doesn't seem to matter too much.
Literally yes
And also go check out how well Hogwarts Legacy is running. Or any of a dozen other major newer games; even the latest Horizon Zero Dawn was having trouble, iirc chewing through over 11 GB of VRAM, and that's not counting older games like Doom Eternal, which was using up to 11 GB back in March 2020 when it released. That's why I told people even back then: if we're already having problems running a year-old game, there's no way 8 GB is going to be adequate VRAM for future titles. And then scalping and mining broke the whole market, so it seems like people just stopped giving a damn because they were happy to have literally ANYTHING that could game.
But fast forward about 2-3 years and you'll see everyone was right; the most shocking part is how poorly the RTX 3070 or 3070 Ti does raytracing compared to the RX 6800, purely because of its crippling VRAM limitations. Hilariously enough, that means you should've bought the 16 GB RX 6800 if you wanted to enable full raytracing. The 3070 Ti was just a bad product. It launched with way too high a price tag, and that's what made it not worth it to me. I was waiting on the card announcement too, and when I heard the MSRP and TDP I lost all interest in hoping it would save the 30 series line. Instead of being the foretold one to save Ampere, the 3070 Ti ended up being the defining card for what sucked about Ampere and why the 30 series failed.
It can use as much VRAM as it likes, as long as the game doesn't look sh*t on GPUs with 8 GB of VRAM (that's the most widespread amount of VRAM on Steam, followed by 6 GB).
Tbh I dunno how you manage that. I tested and maxed it out last night, and I can't get my card above the 6 gig limit in the options (not counting the OS/app stuff). Say I have environment on high: if I raise the character textures to high, it automatically sets the environment to medium to stay within budget. How are you running this on high, in the red, and with 16 gigs of RAM? Sus at this point.
I hear so much stupid ♥♥♥♥ in gaming and tech circles sometimes. Like people insisting you get a 2060 6gb to "futureproof" because they didn't know how DLSS worked and thought it was going to make native resolution "obsolete", or people insisting "the 30 series is better for 'futureproofing' because muh RT", obviously not realizing how much of a crippling disadvantage their gimped VRAM was going to turn out to be, and, worse some may argue, that the gimped VRAM was effectively going to make it the worse raytracing card too. Not that it would matter, because by the time the next gen launches you already have considerably worse RT than newer AMD anyway; a 7900 XT just smokes it hands down. But on top of that you have the ridiculous texture loading problems. Let me put it this way: I love the GTX 980 and the Maxwell and Pascal series cards, and I still have to concede that The Division looks so much better on AMD hardware partly because benchmarking The Division on a 980 causes bad texture pop-in from the limited VRAM.
nVidia users are just used to suffering this, really. Texture pop-in and not having good GPU software is such an accepted fact of life that if you use nVidia, literally the first thing you install on your system after Windows is Afterburner, because nVidia's software is so terrible, is such ♥♥♥♥, that you reflexively use 3rd party software rather than even bother with it. I mean, I guess it's not too atrocious, I'm used to Windows XP/98 type stuff so it doesn't bother me as much, but it's just so much nicer to use Radeon software than having to juggle Control Panel vs GeForce Experience vs Afterburner and RivaTuner type problems. I think the average nVidia user kind of rationalizes after the fact, like an alcoholic, why it was not in fact wasted money, but I always feel like I'm wasting money on nVidia long term. This does not justify the RX 550 or the RX 6500 XT though, those two cards sucked hard, and the Radeon VII was too overpriced, although I'll admit that despite sucking because of cost and other problems it turned out to make a really great mining card and a decent gaming card, so whatever.
nVidia largely floats itself on the misbegotten idea that "more expensive = better" and on pouring all their money into astroturfing and the marketing division, a good example being ♥♥♥♥♥♥ stuff like UserBenchmark's hilarious cope replete with lies, and the nonstop spamming on a certain cancerous site's equally cancerous technolo/g/y board, where drones get paid to ♥♥♥♥♥♥♥♥ about Chrome and nVidia 24/7, even though I can't think of any techie who actually likes and trusts Google ♥♥♥♥. The correct answer is Firefox with NoScript and adblockers btw, preferably with DuckDuckGo.
The second option being that you're somehow expecting this to run not just on high-end monitors but also with raytracing enabled. And lad, I know what it means to be coming from the darkness of ignorance.
The problem literally is nVidia's bad drivers and nVidia's gimped VRAM. That's why it has been affecting 30 series owners so much more strongly than everybody else. First, nVidia's RTX 3000 cards had buggy drivers from the start (no, it's not just AMD; nVidia is also well known for bad drivers, and all their RTX cards had some sort of driver problems, even the RTX 2080 Ti), and evidently they are still having weird driver problems for some reason. Second, they crippled their cards' VRAM on purpose to make you replace them faster as they age like milk.
It's already been two years, and Jensen and the nV board of execs are sitting there on their yachts like "didn't you peasant gamers get the message? It's been two years already and I want a new Bugatti for my collection, chop chop, wagie, get back in your cagie and make us some moolah." They're acting like, why is this paypig not buying a new GPU already, clearly he had two years of fun at 8gb and time's up. Now you see why some of us got so incensed we finally switched brands, or saw Ampere as really bad value. I mainly feel bad for the parents who just wanted to do something nice for their kids and got completely ripped off, first by the megacorp and then by the scalpers. I'm even angrier at the used-office-PC-with-GT710 sellers though. So many people got robbed by those, and by the fake 1050 Tis. If it ships from China, do not buy it; no amount of alleged savings will make up for how badly they are scamming you.
Does every post of yours have to be an essay where 80% of it is you basically repeating how Nvidia 30 series users are stupid? You are repeating yourself over and over again, ffs.
And of course you didn't get his point, you just parrot the same ♥♥♥♥ over and over again. You said:
"I don't give two ♥♥♥♥♥ what Steam hardware survey says, there's still people on here using a dualcore and 768p displays, there are still people using HDD boot drives, and according to that survey the most common options are quadcores. That doesn't mean you should expect your potato PC to run everything at very high to ultra settings, period"
He meant the VAST MAJORITY of users are on 8 GB of VRAM; you mentioning dual-cores or whatever makes no sense as a response to that, and with the majority having that much, devs would lose tons of profit if they made 8 GB obsolete. And no one said they want to run Ultra on 8 GB; he literally said games just have to look good on 8 GB, basically a Medium/High combo or whatever. Some games look really good even on Low, while this game's textures on Medium look muddy like something from before the PS2 era. There is no justification for that, no matter how much you want to ♥♥♥♥ on 8 GB users.
In game, GPU usage is between 60% and 90%, and CPU usage averages 50-60% across all cores. The game is installed on a very old HDD, so the first load is long; after that it's OK. Xbox One controller.
More or less 6 hours in game. No issues, no crashes, no stutters.