The Last of Us™ Part I

metalmario Apr 11, 2023 @ 10:20am
If your game crashes on your Nvidia card, try the latest Studio drivers
Switched from Game Ready drivers to Studio and it fixed the crashes
Showing 16-30 of 35 comments
MancSoulja Apr 11, 2023 @ 11:32am 
Originally posted by LuanBK:
My PC has 32GB of RAM and my VRAM usage does not exceed the recommended limit, but I still get random game crashes. Does anyone know why? The Nvidia driver has been updated to the latest version.

Are you using the hotfix driver?
_Atr4Mør5 Apr 11, 2023 @ 11:32am 
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Even if you don't know something, that's no reason to call someone stupid.
I had zero crashes. ZERO. That means your PC is trash.

You don't even own the game, back under your bridge.
Another one. Did you know that the game is sold not only on Steam?
AirRevenant Apr 11, 2023 @ 11:34am 
Originally posted by _Atr4Mør5:
Originally posted by MancSoulja:

You don't even own the game, back under your bridge.
Another one. Did you know that the game is sold not only on Steam?
Epic Retort.
Last edited by AirRevenant; Apr 11, 2023 @ 11:35am
Red Star, Blood Moon Apr 11, 2023
Originally posted by MancSoulja:
Originally posted by Red Star, Blood Moon:
Snip

Our 30 series cards are nearly 3 years old now, they're not new. We paid a good price and got 3-4 years of gaming out of them, it will be a decade before they're obsolete, and most of us will upgrade next gen.

No one was deceived, no one was ripped off, you're just a conspiracy nut.
You're one of the most completely uninformed soy-boying simps I've seen if you seriously think planned obsolescence is "a conspiracy nut" theory. You HAVE to be an American, or possibly British or Canadian if I'm being charitable. 3 years is ♥♥♥♥. Absolute. Freaking. ♥♥♥♥♥♥♥♥♥. Particularly for something as expensive as a 3070 Ti: that's a freaking $600 graphics card that should've cost $400-500 like the 70 Ti cards before it, and it still shipped with an ancient 8GB of VRAM. We had 8GB of VRAM back in 2015. That's just pathetic. No, it's not supposed to break or go bad in that little time, dude. A 70-class card should last you at least 6 years. The level of low expectations here is truly some bug-eating nonsense.



Originally posted by Hypnotickiller:
Originally posted by Red Star, Blood Moon:
https://www.youtube.com/watch?v=Rh7kFgHe21k
Yes, it is your graphics card. PC enthusiasts warned you about the ticking time bomb in your PC for years and you didn't listen. PROTIP: "more expensive" does not actually mean "better." Wild, I know, but companies out there routinely scam people based on brand. Even with air mattresses: my ♥♥♥♥♥♥♥ Ozark Trail died after one trip, while a cheaper Intex will keep going. But OT is a known meme camping company, so most people buy one, go on one trip, the thing doesn't die until the second trip, and they go "oh well" and buy another, never realizing Ozark Trail is GARBAGE. It's exactly the same with HP printers, Apple laptops, and Dell Aurora "Alienware" crap: you pay way more for an Alienware Dell than for virtually any other prebuilt, and yet it's WORSE than literally every other prebuilt out there. https://www.youtube.com/watch?v=DY1dlVPzUVo This is why you don't make thousand-dollar hardware purchases blindly based purely on brand-name recognition--because that is how you get scammed.

Fact is, you got got. nVidia's hardware just is not worth it. Yes, your RTX 3070 Ti is going to have worse performance and probably even worse raytracing than an RX 6800, simply because you have half the VRAM the 6800 has. That is why every game you play with raytracing enabled and lots of eye candy is going to be a complete stuttering mess at 1440p, and increasingly at 1080p, all ultra. That's not ND's fault. That's your hardware aging really, really badly, because you evidently didn't know that nVidia is so well known for scamming people about VRAM that they got sued for it https://www.pcinvasion.com/nvidia-pays-gtx-970-vram-lawsuit-settlement/ and LOST, despite being a multibillion-dollar company with a fleet of the highest-priced lawyers. They are so scammy they STILL lost when they got sued over it, and that was AFTER scamming people with the GTX 770 2GB and 780 Ti 3GB, which are synonymous with "aging like milk" in the GPU space. Literally, the 770 is completely unplayable today not because of its slow speed but because there isn't enough VRAM strapped to the GPU core, which made it age out immediately and forced nVidia fanboys to upgrade 2-4 years later. That made even the 780 Ti, despite being freaking expensive af, devalue to basically nothing within years.

Your graphics card is in the same position. It will not play anything brand new above 1080p at ultra settings. I'm sorry you had to find out this way, and I'm guessing this was the first game where you truly noticed it, but it will not be the last. Every month from here on out will see some new release that can't be played on your card at all-ultra settings on 4K and 1440p monitors, because it doesn't have the VRAM. Again, please see the above video, where they test this and outright replicate all the problems 30 series owners are having.

I'm not saying it isn't expensive--I know you overpaid for it. I am saying it wasn't worth what you paid, and that it was a foolish purchase decision if you weren't planning on replacing your nVidia graphics card every 2 years. nVidia's midrange and budget hardware isn't meant to be playable at ultra settings for longer than two years; that's not the way it's meant to be played. It's designed to go obsolete in 2-3 years, right around the time there's a new GraphicsTM ProductTM to purchase for the low price of $800 USD to replace your 2070 Super or 3070 Ti. We are sorry for the inconvenience, your wallet is a valued customer, thanks for understanding. This is literally why so many people switched to Radeon that generation and got a 6800 or 6800 XT instead: we all knew it was going to age like a turd with so little VRAM. That's why so many people didn't get a 3070, 3070 Ti, or 3080--they didn't have anywhere near enough VRAM for 4K or 1440p ultrawide, and the crashing and stuttering is the result.




again, REGARDLESS of graphics settings (even set to low) the game uses about 5 gigs of VRAM, so I have 3 gigs spare and it still crashes. ALL MY OTHER STEAM GAMES run perfectly fine. In my case it's nothing to do with VRAM.
Yeah, all the other games you've tried so far that weren't made recently I'm sure do work fine, for about the same reason I can get the original version of Witcher 3 to work on a GTX 700 series laptop at high settings. That doesn't mean a GTX 770 is going to be able to run Cyberpunk, though. I know for a fact a huge portion of your library is older games, so all those games are completely irrelevant. All that counts is NEWER games. And of those, I guarantee that if you tried to run even a slightly older but still recent game like Doom Eternal at 1440p ultra, your graphics card would start to struggle--not because the GPU die itself is too slow, but because your VRAM configuration was deliberately designed to be as corner-cuttingly low end as possible while remaining unnoticeable as a problem until the next-gen launch. Which, again, is literally what nVidia always does and will do, stretching back decades to their 1GB cards that were clearly not made with the future in mind.
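If you want to sanity-check those VRAM numbers yourself instead of taking anyone's word for it, here's a minimal sketch (assuming nvidia-smi is on your PATH, which it is with any normal driver install) that logs usage once a second while the game runs:

# vram_log.py -- minimal sketch: log GPU memory usage once per second.
# Assumes nvidia-smi is on PATH. Ctrl+C to stop.
import subprocess, time

try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader"],
            capture_output=True, text=True,
        )
        print(time.strftime("%H:%M:%S"), out.stdout.strip())
        time.sleep(1)
except KeyboardInterrupt:
    pass

Run it in the background, play until the crash, and the last few lines tell you whether you were actually anywhere near the limit.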

If someone misled you into believing you were "futureproofing" by going with nVidia, then I feel sorry for you for being that gullible, because that person was either wrong or a shill. Current gen RDNA3 raytracing was always going to be comparable to or better than last gen cards like Ampere, so whatever advantage Ampere had for RT in the past is erased now. More importantly, its limited VRAM makes it perform worse than RX 6000 cards, because RT is VRAM intensive, so altogether it was always clear that the 30 series was going to age badly. I and many others on this forum were involved in those exact tech discussions three years ago, when Ampere was announced and, nearing release, we found out about its VRAM, and in tech circles we argued about this. Some fools called it "futureproofing with RT", which I always insisted was stupid, simply because last gen nVidia RT was likely to be slower than RDNAx anyway. What I personally didn't count on was the VRAM limitation being so crippling so fast that RDNA2 cards now outperform nVidia on raytracing.

The simple matter is that nVidia buyers who know this simply replace their cards every 2 years and sell the old one to some sucker to make the newest purchase affordable. They sell the old card to help pay for the new one, because nVidia can be slightly faster than AMD, and at times much more expensive, but it usually ages really badly, and the 30 series is aging legendarily badly, just like we all knew it would. It will age just like Kepler, to the point of total irrelevancy. People with 150-game libraries in 2016 probably could still run most of their games on a GTX 780 just fine at the time, but that wouldn't have stopped the newest 2016 games running like ♥♥♥♥♥♥♥♥ on the 780's paltry 3GB of VRAM with all settings turned up. This is why people often buy Radeon GPUs instead: Radeon tends to be much cheaper while delivering comparable performance, and it usually ages much better. That is why AMD or ATI/Radeon users typically hold onto their cards for many more years without replacing them--AMD users fully expect to get 4-6 years out of their cards at very high settings minimum, depending on the tier.

I don't know how else to explain the facts to you, dude:
https://youtu.be/Rh7kFgHe21k?t=340
The midrange 30 series cards are absolutely pathetic at running this game, and that's all down to AMD having literally double the VRAM on comparable cards, like the 6800 vs the 3070 Ti. Which, by the way, was an arse product anyway: at launch it already had a $100 higher MSRP than the 2070 Super, and a lot of people switched to AMD simply because their $400 card was comparable to nVidia's $500 card, with the only real advantage being whether you wanted RT or not. Since you could enable RT on AMD too, the only reason to get a 3070 Ti was RT: the 3070 Ti was slower than an RX 6800 overall, but it did raytracing somewhat faster at launch. Hence the choice was: do you want really good raytracing right now and a ticking time bomb in your computer later, or overall faster performance for $20 less and much better longevity thanks to double the VRAM?

Again, if you weren't part of those tech discussions, I feel sorry for you, but those are the hardware realities. Your 3070 Ti is going to keep aging super badly, particularly at RT ultra and anything above 1080p. It's still a perfectly fine 1080p card; it just can't do high end raytracing, or any kind of non-stuttery performance in the future, thanks to having too little VRAM by design.
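For the back-of-envelope math on why 8GB fills up so fast at ultra textures, here's a rough sketch (assuming uncompressed RGBA8 at 4 bytes per pixel and a full mip chain adding roughly a third; real engines compress textures, but the scaling is the point):

# vram_math.py -- rough sketch of texture memory cost; all numbers illustrative.
def texture_mb(width, height, bytes_per_pixel=4, mips=True):
    size = width * height * bytes_per_pixel  # raw pixel data
    if mips:
        size = size * 4 // 3                 # full mip chain adds ~1/3
    return size / 1024**2

print(f"one 4096x4096 texture: {texture_mb(4096, 4096):.0f} MB")      # ~85 MB
print(f"200 of them: {200 * texture_mb(4096, 4096) / 1024:.1f} GB")   # ~16.7 GB

A few hundred uncompressed 4K textures alone would blow past an 8GB card, which is exactly why ultra texture pools plus RT structures get tight.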
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Another one. Did you know that the game is sold not only on Steam?

Yes, and I also know you've collected a massive 250 games in nearly two decades. I doubt you could afford the internet required to download the game, let alone the price of entry.
This is why people are making fun of you guys XD
Seriously, you want to sit there and "flex" on people who have 250 games in their library when you have a dated 8GB potato in your PC. And that potato wasn't even high end when it launched XD
If you're going to make fun of people for being "poor", try not to do it with completely inferior hardware yourself. Also, between GOG, Epic, and Steam I've got over 700 games, kid. Why are you so poor you can only afford half as many games as me and only half the VRAM?))))))
_Atr4Mør5 Apr 11, 2023 @ 11:40am 
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Another one. Did you know that the game is sold not only on Steam?

Yes, and I also know you've collected a massive 250 games in nearly two decades. I doubt you could afford the internet required to download the game, let alone the price of entry.
500Mb/s for $6, is that enough?
Red Star, Blood Moon Apr 11, 2023
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Another one. Did you know that the game is sold not only on Steam?

That's why people like you come here and try to shame others, because you're dissatisfied with your own pathetic existence and you're desperately looking for a way to make yourself feel more relevant.
Projection inbound cap'n!

I'm personally just here because I know the average Steam user is a complete idiot, and the average AAA-preordering mouthbreather is epically an idiot, so this place is going to be a hotbed of tech illiteracy and tech ignorance, and the Omnissiah demands that I spread technical truth. Part of that technical truth is the fact that 8GB was a standard for 1080p gaming introduced literally in the mid-2010s, so it was obvious nVidia cut way too close this time and was rolling out completely gimped potatoPUs all the way up to the high end, like the 10GB 3080.

Stupid kids did not know this, and probably bought really expensive cards that were going to age like curdled dog turds as a result. I am here because people are polluting Steam with their ignorance, in turn not only sabotaging the working man's efforts but trying to force me to endure yet another ♥♥♥♥♥♥ E3 launch lie, where everything from The Division to Vampyr to Watch Dogs looks like a cartoon to make sure people's potato hardware can run it. I'm not going to back down and say "yeah, let's nerf all visuals" because nVidia potato owners can't handle high end graphics settings thanks to their crippled GPUs. I never insisted nVidia nerf Hairworks so it could run on something else. I don't particularly care for it and I make fun of the performance hit, but I'm not going to demand Hairworks and RT get nerfed to work on worse hardware. Likewise, your potato nVidia GPU cannot handle full high-res texture packs; deal with it. This happened because you rejected the truth of the Omnissiah, and you are perishing in the stutters of your darkness and ignorance.
:steamsalty:
_Atr4Mør5 Apr 11, 2023 @ 11:48am 
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Another one. Did you know that the game is sold not only on Steam?

That's why people like you come here and try to shame others, because you're dissatisfied with your own pathetic existence and you're desperately looking for a way to make yourself feel more relevant.
Knowing how crappy PCs are and how people build their crappy PCs, I immediately said that the problem is on your side, not with the game. Before you say anything to me, you should have studied why hardware may not be stable.
Last edited by _Atr4Mør5; Apr 11, 2023 @ 11:49am
SolidBoss7 Apr 11, 2023 @ 11:50am 
Originally posted by Hypnotickiller:
Originally posted by _Atr4Mør5:
If your game crashes on your Nvidia card, use some stress test with a monitoring app like HWiNFO and find the problem with your crap PC.

That's it.


You clearly have the IQ of a 10 year old. Tell me why almost 300 of my Steam games run FLAWLESSLY and The Last of Us is the only one that CONSISTENTLY crashes (regardless of settings). I guess my 13700K, 32 gigs of DDR5, and a 3070 Ti are my "crap PC". Stop being so ignorant and grow up, child.
Because TLOU pushes your CPU more than any other game you own and your system becomes unstable under max load. Simple. User issue.
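If you want to actually test that theory rather than argue about it, here's a minimal all-core load sketch. It is NOT a substitute for a proper stability test like Prime95 or OCCT, just a quick way to see whether the machine survives sustained max CPU load:

# cpu_load.py -- minimal sketch: pin every core at 100% until Ctrl+C.
# Not a real stability test like Prime95/OCCT, just a quick load generator.
import multiprocessing as mp

def burn():
    x = 0
    while True:              # busy loop keeps one core saturated
        x = (x + 1) % 1000003

if __name__ == "__main__":
    procs = [mp.Process(target=burn, daemon=True) for _ in range(mp.cpu_count())]
    for p in procs:
        p.start()
    try:
        for p in procs:
            p.join()         # run until interrupted
    except KeyboardInterrupt:
        pass

If the system crashes or throws WHEA/driver errors under this, the instability is yours, not the game's.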
MancSoulja Apr 11, 2023 @ 11:50am 
Originally posted by Red Star, Blood Moon:
Snip

I can just see the tears rolling down your cheeks as you angrily punched this entire TLDR into your Logitech keyboard. :tgrin:
MancSoulja Apr 11, 2023 @ 11:54am 
Originally posted by _Atr4Mør5:
Originally posted by MancSoulja:

That's why people like you come here and try to shame others, because you're dissatisfied with your own pathetic existence and you're desperately looking for a way to make yourself feel more relevant.
Knowing how crappy PCs are and how people build their crappy PCs, I immediately said that the problem is on your side, not with the game. Before you say anything to me, you should have studied why hardware may not be stable.


Originally posted by SolidBoss7:
Originally posted by Hypnotickiller:


You clearly have the IQ of a 10 year old. Tell me why almost 300 of my Steam games run FLAWLESSLY and The Last of Us is the only one that CONSISTENTLY crashes (regardless of settings). I guess my 13700K, 32 gigs of DDR5, and a 3070 Ti are my "crap PC". Stop being so ignorant and grow up, child.
Because TLOU pushes your CPU more than any other game you own and your system becomes unstable under max load. Simple. User issue.

The crash is the Nvidia kernel, it's a driver error.

Originally posted by _Atr4Mør5:
I immediately said that the problem is on your side, not with the game

That's how I know you're completely clueless. Anyone with an ounce of knowledge would first try to identify the crash and the application that's crashing; it helps avoid embarrassing moments like this, where you out yourself as an idiot.
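For what it's worth, on Windows you can pull the faulting module straight from the Event Log. A minimal sketch (assuming Windows and the built-in wevtutil tool; Event ID 1000 is the generic "Application Error" crash record):

# crash_check.py -- minimal sketch: list recent application crash events
# (Event ID 1000, "Application Error") so you can see which module faulted.
# Assumes Windows and the built-in wevtutil command.
import subprocess

query = "*[System[Provider[@Name='Application Error'] and EventID=1000]]"
out = subprocess.run(
    ["wevtutil", "qe", "Application", "/q:" + query,
     "/f:text", "/c:5", "/rd:true"],   # 5 newest events, newest first
    capture_output=True, text=True,
)
print(out.stdout or out.stderr)

The "Faulting module name" line in those records tells you whether it's the game executable, the driver DLL, or something else entirely.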
Last edited by MancSoulja; Apr 11, 2023 @ 11:54am
_Atr4Mør5 Apr 11, 2023 @ 11:57am 
Originally posted by MancSoulja:
The crash is the Nvidia kernel, it's a driver error.
Wrong. Even on Reddit it's written that any nvlddmkm error can be due to the PSU/CPU/RAM... It's basically anything that causes you to lose communication with the GPU, and you get an nvlddmkm error as a result. On its own it's a misleading pointer. And the fact that you don't know that only confirms that you are a noob.
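If you want to check whether that's what you're actually hitting, here's a similar sketch (again assuming Windows and the built-in wevtutil tool) that pulls the most recent events logged by the nvlddmkm provider in the System log:

# nvlddmkm_check.py -- minimal sketch: show recent nvlddmkm events from the
# System log (driver resets / lost GPU communication). Assumes Windows + wevtutil.
import subprocess

out = subprocess.run(
    ["wevtutil", "qe", "System",
     "/q:*[System[Provider[@Name='nvlddmkm']]]",
     "/f:text", "/c:5", "/rd:true"],   # 5 newest events, newest first
    capture_output=True, text=True,
)
print(out.stdout or out.stderr)

If those entries line up with your crash times, you've lost the GPU; the cause could still be the driver, the PSU, or an unstable overclock.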

Originally posted by MancSoulja:
That's how I know you're completely clueless. Anyone with an ounce of knowledge would first try to identify the crash and the application that's crashing; it helps avoid embarrassing moments like this, where you out yourself as an idiot.
That's how I know you understand absolutely nothing about how PC hardware works
Last edited by _Atr4Mør5; Apr 11, 2023 @ 11:57am
MancSoulja Apr 11, 2023 @ 11:59am 
Originally posted by _Atr4Mør5:
Originally posted by MancSoulja:
The crash is the Nvidia kernel, it's a driver error.
Wrong. Even on Reddit it's written that any nvlddmkm error can be due to the PSU/CPU/RAM... It's basically anything that causes you to lose communication with the GPU, and you get an nvlddmkm error as a result. On its own it's a misleading pointer. And the fact that you don't know that only confirms that you are a noob.

Originally posted by MancSoulja:
That's how I know you're completely clueless. Anyone with an ounce of knowledge would first try to identify the crash and the application that's crashing; it helps avoid embarrassing moments like this, where you out yourself as an idiot.
That's how I know you understand absolutely nothing about how PC hardware works

'Even on Reddit' :tgrin: :tgrin: :tgrin::tgrin::tgrin::tgrin:

Is this a joke? You're a comedian right and this is a YouTube skit? :tgrin:
_Atr4Mør5 Apr 11, 2023 @ 12:02pm 
Originally posted by MancSoulja:
Originally posted by _Atr4Mør5:
Wrong. Even on Reddit it's written that any nvlddmkm error can be due to the PSU/CPU/RAM... It's basically anything that causes you to lose communication with the GPU, and you get an nvlddmkm error as a result. On its own it's a misleading pointer. And the fact that you don't know that only confirms that you are a noob.


That's how I know you understand absolutely nothing about how PC hardware works

'Even on Reddit' :tgrin: :tgrin: :tgrin::tgrin::tgrin::tgrin:

Is this a joke? You're a comedian right and this is a YouTube skit? :tgrin:
Haha lol kek u dumb lol fleeeeeeeeeeeex :lunar2019crylaughingpig::lunar2019crylaughingpig::lunar2019crylaughingpig:

You're really not very smart.
Hypnotickiller Apr 11, 2023 @ 12:06pm 
Originally posted by Red Star, Blood Moon:
Snip
Are you STILL going to be ignorant of the fact that my problem isn't a VRAM problem?
Showing 16-30 of 35 comments

Date Posted: Apr 11, 2023 @ 10:20am
Posts: 35