The Last of Us™ Part I

This topic has been locked
NATSU Jun 1, 2023 @ 1:28am
My specs: i3-12100F, RX 6600 XT (8GB), 32GB RAM. Can't run the game
I want to play this game with the lowest settings
Please optimize shader building for this CPU (Core i3); it crashes every time it reaches 3-4% after a few minutes. Is it possible to build the shaders more slowly without it crashing?
Showing 1-15 of 22 comments
Bad 💀 Motha Jun 1, 2023 @ 1:41am 
Whatever made you think any i3 was a good idea?
___ Jun 1, 2023 @ 2:22am 
Bro, buy at least an i5.
Bad 💀 Motha Jun 1, 2023 @ 3:02am 
Any 4-core/4-thread CPU is just a #Nope for today's games, c'mon.
Samputti Jun 1, 2023 @ 4:58am 
♥♥♥♥♥♥♥♥!! 3800X, 16GB 3600MHz RAM, Strix 6600 XT (overclocked), and 0 crashes so far. High preset at 1080p; the game runs well.
Cryiox Jun 1, 2023 @ 5:03am 
Use driver 23.5.1.
Disable ReBAR.
Set a 64 GB page file manually.

Game won't crash anymore.
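The fixed 64 GB page file suggested above has to be entered in megabytes in Windows' virtual-memory dialog. A minimal sketch of that conversion (the 64 GB figure is this poster's workaround, not official guidance, and the helper name is made up for illustration):

```python
# Convert a target page-file size in GB into the MB values Windows'
# virtual-memory settings expect. Hypothetical helper for illustration.
MB_PER_GB = 1024

def pagefile_mb(target_gb=64):
    """Return (initial, maximum) sizes in MB for a fixed-size page file."""
    size_mb = target_gb * MB_PER_GB
    return size_mb, size_mb

initial, maximum = pagefile_mb(64)
print(initial, maximum)  # 65536 65536
# These values would go under System > Advanced > Performance >
# Virtual memory (custom size), with "Automatically manage paging
# file size for all drives" unchecked so the size stays fixed.
```

Setting initial and maximum to the same value keeps Windows from resizing the file mid-session, which is presumably the point of "set manually" in the post.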
DEREK Jun 1, 2023 @ 7:57am 
.....:SaladLaugh:
Cretaceous Jun 1, 2023 @ 9:08am 
It won't be able to run the game with an i3.
Of course the game should run on that CPU, because it's significantly faster than my Ryzen 7 2700X, and I could run it with update 1.0.3. In its current state the game seems to be software groats.
episoder Jun 1, 2023 @ 10:25am 
Originally posted by The Mummy:
Of course the game should run on that CPU, because it's significantly faster than my Ryzen 7 2700X, and I could run it with update 1.0.3. In its current state the game seems to be software groats.

Wth. The 12100F is a 4c/8t CPU; the 2700X is an 8c/16t CPU. Half the power to run it. And it will most definitely not even hit 60 FPS.
LordDeimosIV Jun 1, 2023 @ 10:42am 
The game's minimum-spec CPU requirement is 4 cores / 8 threads. There's no reason the game shouldn't run on an i3-12100F.

The i3-12100F can also run God of War, Forza Horizon 5, Cyberpunk 2077, etc. without issues. The devs dropped the ball.

Funny how some people in this thread see the name i3 and immediately think it's a bad CPU.
Last edited by LordDeimosIV; Jun 1, 2023 @ 10:48am
episoder Jun 1, 2023 @ 11:24am 
Yes, the game will run once the shaders are compiled, but it won't manage much of a framerate. The game engine is programmed in a way that needs a lot of CPU horsepower to prepare and render the graphics.
Last edited by episoder; Jun 1, 2023 @ 11:25am
Colosso Jun 1, 2023 @ 11:30am 
My i5-11400F (6c/12t) cries in some areas with this poorly optimized game.
flat_Lander1 Jun 1, 2023 @ 1:20pm 
Originally posted by episoder:
Originally posted by The Mummy:
Of course the game should run on that CPU, because it's significantly faster than my Ryzen 7 2700X, and I could run it with update 1.0.3. In its current state the game seems to be software groats.

Wth. The 12100F is a 4c/8t CPU; the 2700X is an 8c/16t CPU. Half the power to run it. And it will most definitely not even hit 60 FPS.
Only the i3-12100 is way faster than the R7 2700 in every way except work-related software.
episoder Jun 1, 2023 @ 1:27pm 
Originally posted by flat_Lander1:
Only the i3-12100 is way faster than the R7 2700 in every way except work-related software.

This is not about core clock speed; the game needs more than that. The engine initially ran on a PS5. That's 8 cores (6+ usable) with a variable clock speed peaking at 3.5 GHz. It relies more on threading than on clock speed to do its thing, and it's utilising all of it. So 4c/8t is not enough to deliver the same frame pacing at a similar resolution.
Last edited by episoder; Jun 1, 2023 @ 1:29pm
Jedimindtrickonyou Jun 1, 2023
Originally posted by episoder:
Originally posted by flat_Lander1:
Only the i3-12100 is way faster than the R7 2700 in every way except work-related software.

This is not about core clock speed; the game needs more than that. The engine initially ran on a PS5. That's 8 cores (6+ usable) with a variable clock speed peaking at 3.5 GHz. It relies more on threading than on clock speed to do its thing, and it's utilising all of it. So 4c/8t is not enough to deliver the same frame pacing at a similar resolution.

I just wanted to say it's refreshing to see someone mention the PS5's CPU AND know that not all 8 cores / 16 threads are available to developers (even 1st-party devs; some people get confused about that). Similarly, even though it has 16GB of VRAM, only 12-13GB are available for games. Which is why I question the idea that we're all going to need cards with 16GB or more of VRAM to play PS5/Xbox Series X ports for the rest of the generation. I think we're just dealing with a ♥♥♥♥ show lately, with this long string of poorly optimized, half-baked AAA releases on PC (TLOU, Jedi Survivor, Hogwarts, Callisto, etc.).

Edit: That said, I'm not suggesting anyone go out and buy the 4060 Ti 💩. If you're buying a card today, get 12GB-16GB if you can afford it. I'm just saying people shouldn't be throwing away their 8GB RTX 3000 / 8GB RX 6000 cards either.
Last edited by Jedimindtrickonyou; Jun 1, 2023 @ 7:54pm
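episoder's point about thread count can be illustrated with a toy sketch: the same batch of "shader compile" jobs finishes roughly twice as fast with 8 workers as with 4, all else being equal. The 0.01 s per-job cost is an arbitrary stand-in, not a real measurement, and real shader compilation is CPU-bound rather than sleeping:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def compile_shader(job_id):
    """Stand-in for one shader compile job (arbitrary fixed cost)."""
    time.sleep(0.01)
    return job_id

def batch_seconds(workers, jobs=32):
    """Wall-clock time to finish all jobs with a given worker count."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(compile_shader, range(jobs)))
    return time.perf_counter() - start

t4 = batch_seconds(4)   # roughly a 4-thread CPU
t8 = batch_seconds(8)   # roughly an 8-thread CPU
print(f"4 workers: {t4:.3f}s, 8 workers: {t8:.3f}s")
```

With 32 jobs, 4 workers need ~8 rounds and 8 workers need ~4, so the wall-clock time roughly halves, which is the shape of the 4c/8t vs 8c/16t argument in the thread.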