The Last of Us™ Part I

slickspidey Apr 11, 2023 @ 11:06am
Besides the poor optimization, is the real enemy VRAM?
GPU: RTX 3070 8 GB
RAM: 32 GB
CPU: Ryzen 7 3700X

Since I only have an 8 GB GPU, would this be my bottleneck when trying to run games at 1440p at 80-90 fps and above? I've played games like Spider-Man and God of War, but I usually have to resort to running them at the medium preset to keep my fps around 70-80. However, I would like to run games at 90 fps and above on high or ultra settings, without the stuttering and fps drops. Does that mean the VRAM on my GPU is the bottleneck?
papalazarou Apr 11, 2023 @ 11:19am 
I'm probably gonna get shot down, but I have a 2060 6 GB, 16 GB of RAM and an 8700K, and I get 45-60 fps at 2560x1440 on mostly ultra with medium shadows. It's very playable and runs nicely; with your system it should be perfect.
episoder Apr 11, 2023 @ 11:20am 
Depends on the game engine and how it renders the graphics.

There's forward rendering and deferred rendering, if you want to google that. Forward rendering usually uses a single framebuffer. Deferred rendering uses a G-buffer: multiple render targets the size of your render resolution, plus additional buffers to compute lighting, shadows, reflections and all the rest. That's the main difference in VRAM consumption.

In general, the higher your resolution, the more VRAM you need for the framebuffers or G-buffers, but also the more pixels the GPU has to compute. You gotta keep that in mind: the GPU has to compute all of it, and it may not have enough power to do so in time to hit consistent frame times or fps in general.

So VRAM is only half of the "problem". Whatever VRAM is left after those framebuffers or G-buffers is what you can use for textures. On 8 GB cards you may not fit high textures into that space if the resolution of those buffers is too high. You gotta balance that.
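To give a concrete sense of how resolution drives that buffer cost, here's a minimal back-of-the-envelope sketch. The target count and bytes per pixel are assumed round numbers for a generic deferred renderer, not this game's actual layout:

```python
# Rough estimate of deferred G-buffer memory at a given render resolution.
# The layout below (a handful of 32-bit render targets) is a generic,
# hypothetical example; real engines vary in target count and formats.

def gbuffer_megabytes(width: int, height: int,
                      bytes_per_pixel_per_target: int = 4,
                      num_targets: int = 5) -> float:
    """Approximate G-buffer footprint in MB for one frame's render targets."""
    return width * height * bytes_per_pixel_per_target * num_targets / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{gbuffer_megabytes(w, h):.0f} MB")
# ~40 MB at 1080p, ~70 MB at 1440p, ~158 MB at 4K -- and that's before shadow
# maps, TAA/DLSS history buffers and other per-resolution allocations.
```

The raw G-buffer alone isn't huge; it's the pile of other per-resolution buffers stacked on top of it, plus the textures, that eats into an 8 GB budget.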
Last edited by episoder; Apr 11, 2023 @ 11:27am
Bubbas Gumbo Apr 11, 2023 @ 11:21am 
I mean, you just said it yourself: "I have to resort to using the medium preset." In my eyes, if you want higher graphics settings, just lower your res.
Cylokin Apr 11, 2023 @ 11:21am 
Like I recently posted in another thread, I have a 3070 and I am playing this game on a High/Ultra mix at 1440p with DLSS Quality, getting 60+ fps. It is indeed an unoptimized mess, but some people go way overboard with the VRAM doom-mongering.
Last edited by Cylokin; Apr 11, 2023 @ 11:22am
episoder Apr 11, 2023 @ 11:24am 
Just to give you an idea: I set my output resolution to 900p and use DLSS (which decreases the G-buffer size). I set all VRAM-dependent settings to low, and I can cram high environment textures into the VRAM of my 6 GB card. A balancing act and a tight squeeze, but manageable if you want to.
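As a sketch of that balancing act: lowering the internal render resolution (which is what DLSS does) shrinks every per-resolution buffer and leaves more of a fixed VRAM budget for textures. The per-pixel cost below is an invented stand-in for all resolution-dependent allocations, and 0.667 is roughly DLSS Quality's internal scale:

```python
# Illustrative only: how an upscaler's internal render scale changes the
# resolution-dependent VRAM cost, and what's left of a fixed budget for textures.
# bytes_per_pixel is a made-up aggregate figure, not measured data.

def buffers_mb(out_w: int, out_h: int, render_scale: float,
               bytes_per_pixel: float = 120.0) -> float:
    rw, rh = int(out_w * render_scale), int(out_h * render_scale)
    return rw * rh * bytes_per_pixel / (1024 ** 2)

vram_budget_mb = 6 * 1024  # e.g. a 6 GB card
for scale, label in [(1.0, "native 1440p"), (0.667, "DLSS Quality")]:
    used = buffers_mb(2560, 1440, scale)
    print(f"{label:13s}: buffers ~{used:.0f} MB, "
          f"~{vram_budget_mb - used:.0f} MB left for textures and streaming")
```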
Last edited by episoder; Apr 11, 2023 @ 11:29am
papalazarou Apr 11, 2023 @ 11:50am 
Originally posted by Cylokin:
Like I recently posted in another thread, I have a 3070 and I am playing this game on a High/Ultra mix at 1440p with DLSS Quality, getting 60+ fps. It is indeed an unoptimized mess, but some people go way overboard with the VRAM doom-mongering.

true

I only have 6 GB and I can run the game fine with the VRAM meter at just under 10 GB, well in the red; it doesn't seem to matter too much.
stage Apr 11, 2023 @ 11:51am 
In a nutshell, yes. 8 GB is enough for High at 1080p in this game, but definitely not for 1440p and up. The card is powerful enough, but it lacks VRAM. A 3070 with 16 GB of VRAM would be rocking for many years to come, but alas...
Red Star, Blood Moon Apr 11, 2023
https://youtu.be/Rh7kFgHe21k?t=340
Literally, yes.

And also go check out how well Hogwarts Legacy is running. Or any of a dozen other major newer games: even Horizon Zero Dawn was having trouble, iirc chewing through over 11 GB of VRAM, and that's not counting older games like Doom Eternal, which was using up to 11 GB back in March 2020 when it released. That's why I told people even back then: if we were already having problems running a year-old game, there was no way 8 GB was going to be adequate VRAM for future titles. But then scalping and mining broke the whole market, and it seems people stopped caring because they were happy to have literally ANYTHING that could game.

Fast forward two or three years and everyone was proven right. The shocking part is how poorly the RTX 3070 and 3070 Ti do at ray tracing compared to the RX 6800, purely because of their crippling VRAM limitations. Hilariously, that means you should have bought the 16 GB RX 6800 if you wanted to enable full ray tracing. The 3070 Ti was just a bad product: it launched with way too high a price tag, and that's what made it not worth it to me. I was waiting on the card announcement too, and when I heard the MSRP and TDP I lost all interest and all hope that it would save the 30 series line. Instead of being the foretold one that saved Ampere, the 3070 Ti ended up being the defining card for what sucked about Ampere and why the 30 series failed.
Vox Maximus Apr 11, 2023 @ 12:07pm 
Originally posted by Red Star, Blood Moon:
https://youtu.be/Rh7kFgHe21k?t=340
Literally, yes. [...]

It can use as much VRAM as it likes, as long as the game doesn't look like sh*t on GPUs with 8 GB of VRAM (the most widespread amount of VRAM on Steam, followed by 6 GB).
episoder Apr 11, 2023 @ 12:10pm 
Originally posted by papalazarou:
I only have 6 GB and I can run the game fine with the VRAM meter at just under 10 GB, well in the red; it doesn't seem to matter too much.

tbh I dunno how you manage that. I tested and maxed things out last night, and I can't get my card above the 6 GB limit in the options (discarding the OS/app overhead). Say I have high environment textures: if I raise the character textures to high, it automatically sets the environment back to medium to stay within budget. How are you running this on high, in the red, and with 16 GB of RAM? Sus at this point.
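For anyone wondering what that auto-adjust is doing conceptually, here's a hypothetical sketch of that kind of budget check. The category names and MB costs are invented for illustration; they are not the game's real values:

```python
# Hypothetical: if the estimated total exceeds the VRAM budget, step the most
# expensive texture category down until everything fits.

TEXTURE_COST_MB = {"low": 512, "medium": 1024, "high": 2048}
LEVELS = ["low", "medium", "high"]

def fit_to_budget(settings: dict, fixed_cost_mb: int, budget_mb: int) -> dict:
    settings = dict(settings)  # don't mutate the caller's dict
    def total_mb() -> int:
        return fixed_cost_mb + sum(TEXTURE_COST_MB[v] for v in settings.values())
    while total_mb() > budget_mb:
        # lower the most expensive category that isn't already at "low"
        candidates = [k for k, v in settings.items() if v != "low"]
        if not candidates:
            break
        worst = max(candidates, key=lambda k: TEXTURE_COST_MB[settings[k]])
        settings[worst] = LEVELS[LEVELS.index(settings[worst]) - 1]
    return settings

# Raising characters to high pushes environment back down to fit a 6 GB card:
print(fit_to_budget({"environment": "high", "characters": "high"},
                    fixed_cost_mb=2500, budget_mb=6144))
# -> {'environment': 'medium', 'characters': 'high'}
```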
Red Star, Blood Moon Apr 11, 2023
Originally posted by stage:
In a nutshell, yes. 8 GB is enough for High at 1080p in this game, but definitely not for 1440p and up. The card is powerful enough, but it lacks VRAM. A 3070 with 16 GB of VRAM would be rocking for many years to come, but alas...
Yeah, this. Oh, and it's not just HWUB btw; Bitwit and Gamers Nexus, among others, talked about it. The conclusion we all came to was that it would last a good two years before it "broke" under the strain of its planned obsolescence, and that "if you can find it at MSRP it's a 'good' 'deal' compared to the scalpers, and it's perfectly fine for right now, but don't expect it to last."

I hear so much stupid ♥♥♥♥ in gaming and tech circles sometimes. Like people insisting you get a 2060 6 GB to "futureproof" because they didn't know how DLSS worked and thought it was going to make native resolution "obsolete", or people insisting "the 30 series is better for futureproofing because muh RT", obviously not realizing what a crippling disadvantage the gimped VRAM was going to turn out to be, and, worse, some may argue, that the gimped VRAM was effectively going to make it the worse ray tracing card too. Not that it would matter, because by the time the next gen launches you already have considerably worse RT than newer AMD anyway; a 7900 XT just smokes it hands down. But on top of that you have the ridiculous texture loading problems. Let me put it this way: I love the GTX 980 and the Maxwell and Pascal cards, and I still have to concede that The Division looks much better on AMD hardware, partly because benchmarking The Division on a 980 causes bad texture pop-in from the limited VRAM.

nVidia users are just used to suffering this, really. Texture pop-in and not having good GPU software is such an accepted fact of life that if you use nVidia, literally the first thing you install after Windows is Afterburner, because nVidia's software is so terrible, such ♥♥♥♥, that you reflexively use third-party software rather than even bother with it. I guess it's not too atrocious, and I'm used to Windows XP/98-type stuff so it doesn't bother me as much, but it's just so much nicer to use Radeon software than having to juggle the control panel vs GeForce Experience vs Afterburner and RivaTuner. I think the average nVidia user kind of rationalizes after the fact, like an alcoholic, why it was not in fact wasted money, but I always feel like I'm wasting money on nVidia long term. This does not justify the RX 550 or the RX 6500 XT though; those two cards sucked hard. And the Radeon VII was too overpriced, although I'll admit that despite the cost and other problems it turned out to make a really great mining card and a good gaming card, so whatever. nVidia largely floats itself on the misbegotten idea that "more expensive = better" and on pouring all their money into astroturfing and the marketing division; a good example being ♥♥♥♥♥♥ stuff like UserBenchmark's hilarious cope replete with lies, and the nonstop spamming on a certain cancerous site's equally cancerous technolo/g/y board, where drones get paid to ♥♥♥♥♥♥♥♥ about Chrome and nVidia 24/7, even though I can't think of any techie who actually likes and trusts Google ♥♥♥♥. The correct answer is Firefox with NoScript and adblockers btw, preferably with DuckDuckGo.
Originally posted by MaxL:
It can use as much VRAM as it likes, as long as the game doesn't look like sh*t on GPUs with 8 GB of VRAM (the most widespread amount of VRAM on Steam, followed by 6 GB).
I don't think you are understanding multiple things here, and I'll start with one of them: I don't give two ♥♥♥♥♥ what the Steam hardware survey says. There are still people on here using dual-cores and 768p displays, there are still people using HDD boot drives, and according to that survey the most common option is a quad-core. That doesn't mean you should expect your potato PC to run everything at very high to ultra settings, period, and if it can, be grateful you have hardware good enough to do it on older games, NOT brand-new titles and remasters. That's just basic common sense. This game is a remaster, not an old PS3 game. Same way the Bioshock 2 remaster took much beefier hardware, although I thought the look of the new Fable and the new Bioshock sucked, but I digress.

The second thing is that you are somehow expecting this not just on high-end monitors but also with ray tracing running. Lad, I know what it means to come out of the darkness of ignorance :admech:, where those blinded to the truths of the machine spirit didn't understand the difference between types of RAM and asked stupid questions like "can I reuse my DDR3 in a DDR4 motherboard?" There is a baseline of terribly low knowledge that pervades Steam, and it gets worse the closer you drift toward AAA and away from the light and safety of old-school CRPGs and 4X games, toward the dark tides, toward the console gamer. So you have people on here unironically gaming on 4K monitors with a piddly 8 GB graphics card, who literally do not know there's a difference between VRAM and system RAM, and who get all butthurt at the game maker because it can't possibly be their hardware. In some cases it is indeed the dev being lazy or incompetent, but in other cases, even if the dev were "lazy, stupid and greedy", you having an HDD boot drive is certainly not helping matters, is it? And that's the case here: you have people on 30 series cards with less than 12 GB of VRAM cranking all settings up to the max and then not understanding why it becomes a stuttery broken mess, because they never understood that the first scam was the graphics card itself. It ran fine for the first year they had it, and they did not know what "planned obsolescence" means: the corpos designing their ♥♥♥♥ to be bad or break on purpose after a while, to force you to buy new ones when it should have lasted you ages.

The problem literally is nVidia's bad drivers and nVidia's gimped VRAM. That is why it has been affecting 30 series owners so much more strongly than everybody else. First, nVidia's RTX 3000 cards had buggy drivers from the start (no, it's not just AMD; nVidia is also well known for bad drivers, and all their RTX cards had driver problems of some sort, even the RTX 2080 Ti), and evidently they are still having weird driver problems for some reason. Second, they crippled their cards' VRAM on purpose to make you replace them faster as they age like milk.

It's already been two years, and Jensen and the nV board of execs are sitting there on their yachts like, "didn't you peasant gamers get the message? It's been two years already and I want a new Bugatti for my collection; chop chop, wagie, get back in your cagie and make us some moolah." They're acting like, why is this paypig not buying a new GPU already, clearly he had two years of fun at 8 GB and time's up. Now you see why some of us got so incensed we finally switched brands, or saw Ampere as really bad value. I mainly feel bad for the parents who just wanted to do something nice for their kids and got completely ripped off, first by the megacorp and then by the scalpers. I'm even angrier at the sellers of used office PCs with a GT 710, though. So many people got robbed by that, and by the fake 1050 Tis: if it ships from China, do not buy it, no amount of alleged savings will make up for how badly they are scamming you.
Marty McFly Apr 11, 2023 @ 12:35pm 
In Linux, the game consumes less VRAM. I think this high VRAM consumption is some kind of memory leak bug, because on Linux the VRAM consumption should be the same or even higher than on Windows, since the game is running through Proton.
Cylokin Apr 11, 2023 @ 12:43pm 
Originally posted by Red Star, Blood Moon:
Originally posted by MaxL:

It can use as much VRAM as it likes, as long as the game doesn't look like sh*t on GPUs with 8 GB of VRAM (the most widespread amount of VRAM on Steam, followed by 6 GB).
I don't think you are understanding multiple things here, and I'll start with one of them: I don't give two ♥♥♥♥♥ what the Steam hardware survey says. [...]

Does every post of yours have to be an essay where 80% of it is you repeating how Nvidia 30 series users are basically stupid? You are repeating yourself over and over again, ffs.

And you didn't get his point, of course; you just parrot the same ♥♥♥♥ over and over again. You said:

"I don't give two ♥♥♥♥♥ what Steam hardware survey says, there's still people on here using a dualcore and 768p displays, there are still people using HDD boot drives, and according to that survey the most common options are quadcores. That doesn't mean you should expect your potato PC to run everything at very high to ultra settings, period"

He meant the VAST MAJORITY of users are on 8 GB of VRAM; you mentioning dual-cores or whatever makes no sense as a response to that, and with the majority on 8 GB, devs would lose tons of profit if they made it obsolete. And no one said they want to run Ultra on 8 GB; he literally said games just have to look good on 8 GB, basically a Medium/High combo or whatever. Some games look really good even on Low, while this game's textures on Medium look muddy, like something from before the PS2 era. There is no justification for that, no matter how much you want to ♥♥♥♥ on 8 GB users.
Last edited by Cylokin; Apr 11, 2023 @ 12:43pm
YoLolo69 Apr 11, 2023 @ 12:50pm 
With my RTX 3070 (OC +125/+900), 32 GB of RAM and my i7 10700K (stock), I'm able to play smoothly at 1440p, 60 fps everywhere with vsync ON, all settings Ultra except Texture Object at Medium and Texture Character, Environment and Effects at High. With those settings the VRAM meter in the menu says around 9.35 GB against 8 GB available. DLSS is set to Quality.
In game, GPU usage is between 60% and 90%, and CPU usage averages 50-60% across all cores. The game is installed on a very old HDD, so the first load is long; after that it's OK. Xbox One controller.
More or less 6 hours in game. No issues, no crashes, no stutters.
Last edited by YoLolo69; Apr 11, 2023 @ 12:51pm

Date Posted: Apr 11, 2023 @ 11:06am
Posts: 16