The Last of Us™ Part I

The problem isn't just TLoU...it's the VRAM on 3070 etc owners
https://youtu.be/QWeomP7aEGo?t=1727
We've literally known this was coming since before the cards even released. TLoU has other problems, like poor optimization and nVidia's broken drivers, but some of your crashing and stuttering problems come from trying to run ultra detail or very high resolutions.
https://www.youtube.com/watch?v=_lHiGlAWxio
It's an nVidia hardware problem too.

If you look at anything from Hogwarts Legacy to Doom Eternal, The Witcher 3 remaster, Dead Space remake, and iirc Callisto Protocol too, people are having performance problems on the RTX 3070, 3070ti, and 3060ti in particular, with the 10gb 3080 also sometimes becoming a stuttery mess. That's because you're trying to run textures etc. that are too high and have run out of video memory. That is not something that can be patched.
Originally posted by FeilDOW:
People have got to start looking at the requirements. It's an 8GB card for 1080p high settings, 1440p high calls for an 11GB card, and 4K ultra calls for a 16GB card.
Showing 46-60 of 84 comments
Qwarrock Apr 1, 2023 @ 4:31pm 
Originally posted by Jeff4u:
Originally posted by S1ntax Error:

It is a complicated issue, mainly because users don't want to admit that the product they spent their hard-earned money on isn't holding up well three years later.

In all honesty, and I mean no disrespect or offense, but once the PS5 and Series X dropped with 16GB of GDDR6, the writing was on the wall. And if a user still made the decision to purchase an 8GB card after that, it was a bad purchasing decision. We can blame Nvidia for not putting 12 or 16GB on their mid-range cards like AMD did (users could have also just bought a 6800 or 6800xt lol). We can blame game developers for making games that utilize what the consoles have to offer, but we won't blame ourselves for not looking at where the market was clearly headed before making purchasing decisions. We won't recognize that maybe we just need to turn down settings.

But on the other hand, Nvidia has admitted there is a driver problem affecting the 30xx series cards with this game specifically.

Fine, the PS5 has 16GB of unified RAM, of which probably 12GB is used as VRAM. That's for 4K gaming.

How much do you think 1080p needs? That's more on game design, which can keep it within 8GB at that resolution.

Remember, there's a customer base out there; force them to upgrade and you just lose sales. The recommended spec for 1080p is an 8GB card. Just look at the product page.

If you go ultra for textures, you shouldn't be surprised. Ultra textures (or textures at any other level) are the same size at any resolution. So using them at 1080p is 1) a waste of VRAM and 2) a waste in general, because you won't see much difference from "high" (or even lower) at 1080p.
The same goes for many other options. Just stop thinking resolution is the only setting that impacts performance.

I really think the PS5 equivalent is "High", not "Ultra" (and lower still for a stable 60fps).
Last edited by Qwarrock; Apr 1, 2023 @ 4:34pm
Bucherebis Apr 1, 2023 @ 4:36pm 
Originally posted by Lara Croft:
Originally posted by Red Star, Blood Moon:
Cry harder. You literally can't refute one single fact, so you resort to stalking personal profiles hoping to find just *something* to ignore how hard you're getting owned. Basically, the moment someone starts stalking profiles instead, it proves they lost the argument.

Your graphics card is an overpriced potato, deal with it. You're going to have to either turn settings down or accept that any new game you run on a 30 series card is going to be a stuttering mess, just like it is with all these other new games.
Lol holy heck dude, you're still posting? I have been out running errands for like 4 hours and you're still here talking down to people and being aggressive. Gosh dang. You are unhinged.
Like I said, they are feeding these kids too much stupid juice.
Originally posted by Full Loot PVP:
Originally posted by Red Star, Blood Moon:
They're not. I'd go so far as to argue AMD Radeon was the best in the business in 2020. RX 6000 series cards were and still are beastly, and nVidia had nothing but problems with Ampere. It just wasn't a good generation, and we still have driver problems with 30 series cards to this day. Ampere's drivers have always been bugged, and they had to nerf the 3080's drivers too to stop the blackscreening and crashing. In terms of actual performance, they completely suck for the price tag. The 6800 was slightly cheaper, much more powerful, and much better built. The fact there's this uproar over TLoU affecting 3070 owners so hard speaks volumes. It's even funnier that people are being forced to enable FSR on their nVidia cards just to stop the crashing from DLSS XD
No, what nVidia does is create this image of themselves and, through mass marketing aka corporate propaganda, cement a certain image, in the same way Dell's ewaste gets shoveled into every big company and the same way clueless, technically inept, computer-illiterate liberal boomers buy Apple garbage. I've talked to these people; they'll tell you insane ♥♥♥♥ like "Apple is just better," and when you ask them what an SSD is, they don't know. They don't know what RAM is, they don't really know what a CPU is, and yet they are convinced that Apple must surely be better, because they are smooth brains and the propaganda worked on them.

This is the same thing. Ampere was just inadequate. 3060 was one of the few products that was mostly inoffensive but even that thing could only offer at best the same performance as a 5700XT, except it cost the same as a 5700XT. So what nVidia basically did is offer exactly the same performance and same price as a midrange AMD card two years later, and idiots ate it up. A lot of tech channels criticized them for this too, because it was so expensive for how underperforming it was, and the same was reflected across the entire stack.

Now today we have this "best in the business" ♥♥♥♥ heap running into major VRAM bottlenecking across a wide array of titles, just like everyone said it would within 2-3 years of launch. A really good product ages gracefully, like the 1080ti. Nobody would argue the GTX 1080ti wasn't one of the best GPUs ever made; it still cost a ton, but it was seemingly saner compared to everything that came after it, and it aged so well that if you had a 1080ti in 2017 you can still play with it just fine today. That's not the case with the RTX 3070 at all, which in half that time is running into nonstop problems wherever it goes. It was basically built to fail in two years, on purpose. Calling nVidia the best in the business is like calling Apple the best in the business. It's not. It's just that the majority of people are smoothbrained halfwits who don't understand technology and have no interest in learning. Somehow I feel like zoomers and boomers are equally like this, not sure why, but both gens seem to only care about iPads.


Originally posted by S1ntax Error:

Nvidia mind share at best. The 6800xt was easily the best value-for-performance card of last generation, and people still bought the 3070/ti over it.

AMD is almost always competitive in the low end to upper midrange. Last gen they were competitive across the board, outside of ray tracing of course.

You can bet the 12GB 6700/6750xt owners and the 16GB 6800/xt owners will be happier with their purchase than the 3060/3070(ti) owners lol.
Nvidia is the best in the business because it has the best drivers.

Nvidia is the way to go if you want to play NEW games. AND ANY OLD GAME.

AMD has ♥♥♥♥♥♥ drivers when it comes to older games. Nvidia carries all its legacy driver support forward.

If it weren't down to the drivers, then it would be neither Nvidia nor AMD.

It would be Intel with the Arc series. Unfortunately, Arc does not know how to play OLDER games AT ALL, because Intel wasn't making GPUs back then. But Arc does better with newer titles while using less.
This is a complete ♥♥♥♥♥♥♥ lie. I literally can't even play Warhammer 40k Gladius on my laptop because of nVidia's ♥♥♥♥♥♥♥ drivers. Vulkan is unsupported, and nobody even argues their software doesn't suck, because it does suck. I hate using the GeForce control panel or Experience, it's just awful. And their drivers absolutely blow. Some of you keep repeating this same mindless halfwit nonsense about drivers when it's plain as day that the 30 series drivers ♥♥♥♥♥♥♥ suck; they are awful, and nVidia even acknowledged that part of the problem with TLoU is that their drivers are completely busted. But you guys somehow got conned into blaming anything and everything for a hardware and driver issue. This isn't new either; Ampere has had bad drivers since the very start. I've not even heard of RDNA2 having bad drivers, and pretty much all the tech channels acknowledged that. It's just one of those things people parrot endlessly without thinking about, and often without even experiencing it. Out of all the cards I've used, it was actually nVidia that had the most irritating driver problems. My ATI Radeon and AMD cards worked perfectly fine.

Basically what you're all doing at this point is making excuses and rationalizations, because you know nVidia hardware is objectively and factually not that good. This is a really sore point for anybody who paid scalper prices on nVidia products. I'd be pretty mad and coping hard too if I'd paid $1000 USD for a crappy 3070. If nVidia actually made such truly wonderful, unassailably good hardware, people would only talk about the performance. You can tell a thing sucks because people end up looking like first-year college students trying to b.s. their way through an exam, bringing up unrelated things like drivers or some gimmick or the looks of it, because the fundamentals are subpar and shoddy. This includes the inferior performance of the 3070ti when the 6800 smokes it for $20 less, but that's not even the real problem. The big problem is it has literally half the VRAM, and that makes it unsuitable for anything but 1080p long term, whereas the 6800 is still a good-performing 1440p card. It's such a huge problem because RT takes massive amounts of memory, so the crying all over TLoU about performance issues is a good example of why you need enough VRAM. TLoU is taking up to 15gb of VRAM at the higher resolutions and ultra quality--in other words, nearly double the 3070's VRAM. That renders the whole RT advantage quite moot: the card may technically do RT faster, but overall it's slower and a stuttery mess because it hasn't got adequate VRAM to do the RT.
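To put rough numbers on why ultra textures chew through VRAM regardless of resolution, here's a back-of-envelope sketch in Python (purely illustrative figures assumed for the example, not measurements from TLoU or any specific game):

# Back-of-envelope VRAM estimate for high-resolution textures.
# Illustrative numbers only -- not measured from TLoU.
def texture_mib(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    """Approximate footprint of one texture plus its mip chain, in MiB."""
    return width * height * bytes_per_texel * mip_overhead / 2**20

one_4k = texture_mib(4096, 4096)  # BC7 compression is roughly 1 byte per texel
print(f"one 4K BC7 texture: {one_4k:.0f} MiB")               # ~21 MiB
print(f"500 such textures: {500 * one_4k / 1024:.1f} GiB")   # ~10 GiB

Stream in a few hundred textures like that and you're past 10GB before geometry, render targets, or the RT acceleration structures even enter the picture, which is exactly where an 8GB card starts swapping and stuttering.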

No one is holding nVidia accountable, and if you're all just sucking on nVidia's nuts it's really no wonder they keep doing stupid ♥♥♥♥ like charging $900 for a high midrange card, or charging high-end prices for a midrange card and then not sticking adequate VRAM on it. Absolute epitome of the type of person who wants to "own nothing and be happy." So either be happy with what you have and accept its inferior performance for more money, or quit trying to slam every developer out there over your shoddy VRAM.
S1ntax Error Apr 1, 2023 @ 4:56pm 
Originally posted by Full Loot PVP:
Nvidia is the best in the business because it has the best drivers.

Nvidia is the way to go if you want to play NEW games. AND ANY OLD GAME.

AMD has ♥♥♥♥♥♥ drivers when it comes to older games. Nvidia carries all its legacy driver support forward.

If it weren't down to the drivers, then it would be neither Nvidia nor AMD.

It would be Intel with the Arc series. Unfortunately, Arc does not know how to play OLDER games AT ALL, because Intel wasn't making GPUs back then. But Arc does better with newer titles while using less.

As someone who went from a 6800xt to a 4090, I disagree. I have experienced ♥♥♥♥♥♥ drivers from both camps. But I honestly miss the feature set I had with AMD drivers.

The overlay was a lot more customizable and being able to have per game overclock profiles was super nice. I was fond of AMD link as well.

And yes I know I can do all the same stuff on my 4090 with third party software, but having it all in the driver was nice and I miss it lol. I also hate that GeForce experience requires an account.
Last edited by S1ntax Error; Apr 1, 2023 @ 4:57pm
Alchemy Of Seoul Apr 1, 2023 @ 5:01pm 
For what it is worth, for those claiming that a 3070 struggles with games like Hogwarts: I have a 3070 running Hogwarts on Ultra at 1080p. The GPU sits at 64C with a high of 72C. No stuttering, and the game looks beautiful. I have a triple-monitor setup with other browser windows open while streaming Twitch, and with GPU/CPU monitoring programs running to watch temperatures and VRAM usage. VRAM never gets over 87%.
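For anyone who wants to log the same numbers without a vendor overlay, here's a minimal sketch using the pynvml bindings to NVIDIA's NVML (assuming an NVIDIA card with the official driver and the nvidia-ml-py package installed; note that NVML reports the total VRAM allocated on the card, not just the game's share):

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{temp} C | {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB "
              f"({100 * mem.used / mem.total:.0f}%)")
        time.sleep(5)  # sample every few seconds while the game runs
except KeyboardInterrupt:
    pynvml.nvmlShutdown()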

When I play TLoU I don't get close to running out of VRAM. This game is poorly optimized and/or filled with bugs, and as ND admitted, a nasty memory leak is present. I run this game on the lowest settings and it still crashes. Sorry, but I'm not buying the argument that it's the hardware.

Edit: in Hogwarts I am getting up to 130 FPS.
Last edited by Alchemy Of Seoul; Apr 1, 2023 @ 5:03pm
Qwarrock Apr 1, 2023 @ 5:03pm 
Originally posted by S1ntax Error:
I also hate that GeForce experience requires an account.

You can download the Nvidia driver without GeForce Experience. You don't need GeForce Experience, or any account, just to download and use the Nvidia driver.
GeForce Experience is there primarily for cloud gaming, which, of course, requires an account, and yes, it also has the option to automatically download and install new drivers.
Last edited by Qwarrock; Apr 1, 2023 @ 5:06pm
S1ntax Error Apr 1, 2023 @ 5:36pm 
Originally posted by 🤘 Qwarrock 🤘:

You can download the Nvidia driver without GeForce Experience. You don't need GeForce Experience, or any account, just to download and use the Nvidia driver.
GeForce Experience is there primarily for cloud gaming, which, of course, requires an account, and yes, it also has the option to automatically download and install new drivers.

Yes yes yes I know, but that doesn't change the fact that it annoys me lol.


Originally posted by Full Loot PVP:
I had a 180-degree experience from yours. AMD's control panel substitute didn't even have the ability to force vsync in an older game, or anything like setting an FPS cap, etc. The Nvidia panel has it all.

You would use "Radeon Chill" to limit the frame rate, and you can force vsync in the same menu where you enable Enhanced Sync. But to be fair, I don't know exactly when AMD added those features; it could have been after you jumped ship.

But even still that's my point. Two completely different people with completely different experiences. Makes it hard to say that one is objectively better than the other.

Complete user error on my part but because I was so used to AMD Radeon settings, when I switched to Nvidia it legit took me like an hour to find where fast sync was to enable it.
✪ GIGO Apr 1, 2023 @ 5:43pm 
Originally posted by Red Star, Blood Moon:
Originally posted by ✪ GIGO:
8gb is already overkill... you can even mod older games to look better than newer games and they will run smooth as butter. This is the result of lazy developers taking advantage of the existence of higher VRAM.
>my old potato 1gb card can't stop stuttering and crashing
>it's the devs' fault for being lazy and not catering to my own entitled needs and designing their game exclusively around working on my potato
8gb was introduced with the freaking RX 400 series, dude. It's been around longer than most other memory configurations. VRAM usually doubles every 4 years. nVidia has simply been even more blatantly and transparently greedy, and also lazy, cutting corners everywhere to increase their profit margins at your expense. They don't even hide it at all; just look at the insane price of the 4070ti, it's an insult. They do this after all the other anti-gamer practices like scalping cards directly to miners or wasting Ampere chips on cards designed just to be ewaste so it doesn't end up on the secondhand market. nVidia artificially constricts their own supply just to manipulate prices higher. It's a ♥♥♥♥♥♥ company. People only put up with it because AMD was pretty meh-tier for a while, but since RDNA they've been a solid competitor again. I have no idea why gamers are so stupid that they keep giving this company their money.

The 1070 was an 8gb card. So was the 2070, which is about when they should have put that to rest and bumped up to 12-16gb for the 3070 and 4070. They didn't, because they are cheap and greedy and wanted to force you to rebuy a GPU in two years when it ages like milk. A lot of people actually bought a 6800 for no other reason than the memory worries, and that fear was proven true. You simply cannot run all ultra settings with RT on with an RTX 3070 or 3070ti.

LOOL that's like trying to compare an 8GB DDR2 stick with an 8GB DDR5 one. You are either ignorant or coping about your overpriced GPU.
Arin Evergale Apr 9, 2023 @ 9:08pm 
Originally posted by Red Star, Blood Moon:
https://youtu.be/QWeomP7aEGo?t=1727
We've literally known this was coming since before the cards even released. TLoU has other problems, like poor optimization and nVidia's broken drivers, but some of your crashing and stuttering problems come from trying to run ultra detail or very high resolutions.
https://www.youtube.com/watch?v=_lHiGlAWxio
It's an nVidia hardware problem too.

If you look at anything from Hogwarts Legacy to Doom Eternal, The Witcher 3 remaster, Dead Space remake, and iirc Callisto Protocol too, people are having performance problems on the RTX 3070, 3070ti, and 3060ti in particular, with the 10gb 3080 also sometimes becoming a stuttery mess. That's because you're trying to run textures etc. that are too high and have run out of video memory. That is not something that can be patched.

It's literally not a GPU issue at all; even at 720p lowest settings it still stutters. It's a CPU issue. I run an R5 3600 overclocked, and it performs better than the recommended 3600X at stock, but whenever The Last of Us is booted it pegs the CPU to 100% usage no matter what.
Vox Maximus Apr 9, 2023 @ 9:40pm 
Originally posted by SkillfulDeep:
Originally posted by Red Star, Blood Moon:
https://youtu.be/QWeomP7aEGo?t=1727
We've literally known this was coming since before the cards even released. TLoU has other problems, like poor optimization and nVidia's broken drivers, but some of your crashing and stuttering problems come from trying to run ultra detail or very high resolutions.
https://www.youtube.com/watch?v=_lHiGlAWxio
It's an nVidia hardware problem too.

If you look at anything from Hogwarts Legacy to Doom Eternal, The Witcher 3 remaster, Dead Space remake, and iirc Callisto Protocol too, people are having performance problems on the RTX 3070, 3070ti, and 3060ti in particular, with the 10gb 3080 also sometimes becoming a stuttery mess. That's because you're trying to run textures etc. that are too high and have run out of video memory. That is not something that can be patched.

It's literally not a GPU issue at all; even at 720p lowest settings it still stutters. It's a CPU issue. I run an R5 3600 overclocked, and it performs better than the recommended 3600X at stock, but whenever The Last of Us is booted it pegs the CPU to 100% usage no matter what.

This game is extremely CPU intensive, so you are likely correct in your assumption.
StandAlone Apr 9, 2023 @ 10:37pm 
The funny thing is... they list a minimum requirement of a 4GB VRAM GPU like the 1050 Ti... but even the lowest settings at 720p still use more than 5GB of VRAM... is it a joke??
Vox Maximus Apr 9, 2023 @ 10:48pm 
Originally posted by StandAlone:
The funny thing is... they list a minimum requirement of a 4GB VRAM GPU like the 1050 Ti... but even the lowest settings at 720p still use more than 5GB of VRAM... is it a joke??

But is it playable?
Lince_SPAIN Apr 9, 2023 @ 10:51pm 
Originally posted by Full Loot PVP:
The game looks beautiful with medium textures, just like TLOU2 was on the PS4 Pro.

It looks awful with anything below high textures. TLOU2 on PS4 would have been pilloried had it looked like TLOU on PC at medium settings. Either you have low standards, a bad memory, or both.
Genetic Apr 9, 2023 @ 11:14pm 
The VRAM is not the issue; the port is a DISASTER. It wastes CPU cycles and is not efficient with resources...
Originally posted by SkillfulDeep:
Originally posted by Red Star, Blood Moon:
https://youtu.be/QWeomP7aEGo?t=1727
We've literally known this was coming since before the cards even released. TLoU has other problems, like poor optimization and nVidia's broken drivers, but some of your crashing and stuttering problems come from trying to run ultra detail or very high resolutions.
https://www.youtube.com/watch?v=_lHiGlAWxio
It's an nVidia hardware problem too.

If you look at anything from Hogwarts Legacy to Doom Eternal, The Witcher 3 remaster, Dead Space remake, and iirc Callisto Protocol too, people are having performance problems on the RTX 3070, 3070ti, and 3060ti in particular, with the 10gb 3080 also sometimes becoming a stuttery mess. That's because you're trying to run textures etc. that are too high and have run out of video memory. That is not something that can be patched.

It's literally not a GPU issue at all; even at 720p lowest settings it still stutters. It's a CPU issue. I run an R5 3600 overclocked, and it performs better than the recommended 3600X at stock, but whenever The Last of Us is booted it pegs the CPU to 100% usage no matter what.
Oh yeah, that too. Look, there's a ♥♥♥♥♥♥♥♥♥♥♥ of problems going on, and the majority are related to, you guessed it, 30 series GPUs, mainly of the 8gb variety but also the 10gb ones for those trying to run this game at 4k with RT. The other problem, aside from nVidia's ♥♥♥♥♥♥ drivers and nVidia's inadequate VRAM, is that there is a known issue with the way RTX cards, particularly Ampere, push extra work onto the CPU and bloat it with driver overhead. And while that's not even the biggest deal, and is afflicting 12900k owners too iirc, it compounds the problem because 30 series cards need a really high-end CPU or they get gimped; I've seen reports of 3070 owners upgrading from a 5600XT or 1070 and actually getting even worse performance after the upgrade, because the card requires a really powerful CPU too.

Originally posted by Genetic:
The VRAM is not the issue; the port is a DISASTER. It wastes CPU cycles and is not efficient with resources...
You guys will remember these threads someday when you try to run yet another AAA title and eventually come to realize that 8gb was not enough and you can't patch VRAM. I almost never see someone just turning the settings down.

How am I even back here? I was trying to find something about the Rockstar launcher login.

Date Posted: Apr 1, 2023 @ 12:17pm
Posts: 84