Horizon Forbidden West™ Complete Edition

jano Apr 16, 2024 @ 12:45am
How to fix FPS for RTX 3070 (temporarily) and some other GPUs with about 8 GB VRAM
When, after a cutscene, entering a city, or similar, you see the FPS dropping while VRAM usage is approaching 7.5 GB (sometimes earlier), open the menu (ESC), go to the graphics settings, change the preset to Very Low, wait about 3 seconds, then switch back to Very High (or whatever you normally use). In my case this works the same as restarting the game, but without having to start from the last save.

This also proves that the game's code is capable of clearing memory for itself (or whatever it is fixing at that point), but it does not do it on its own. You can also see in RTSS that GPU Memory is higher than GPU Memory/process.
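If you want to catch the threshold before the stutter starts instead of eyeballing RTSS, here is a rough monitoring sketch. It assumes an NVIDIA card with `nvidia-smi` on the PATH and uses its standard `--query-gpu=memory.used` CSV output; the 7500 MiB threshold is just where I see trouble begin, so adjust it for your card.

```python
import subprocess

# ~7.5 GB: the point where FPS starts dropping on my 8 GB card (adjust to taste)
VRAM_WARN_MIB = 7500

def parse_used_mib(line: str) -> int:
    """Parse one line of nvidia-smi's '--format=csv,noheader,nounits' output."""
    return int(line.strip())

def vram_used_mib(gpu_index: int = 0) -> int:
    """Ask nvidia-smi how much dedicated VRAM (in MiB) is currently in use."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_used_mib(out.splitlines()[0])

if __name__ == "__main__":
    used = vram_used_mib()
    print(f"VRAM in use: {used} MiB")
    if used >= VRAM_WARN_MIB:
        print("Close to the 8 GB ceiling - do the ESC settings toggle now.")
```

Run it in a second terminal (or on a loop) while playing; when the warning prints, do the settings toggle before the FPS tanks.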

I hope this makes it easier for you to play.

I myself play on very high settings at 2560x1080@75, and after the "memory cleanup" described above the frame rate does not drop below 60; out in the wild it stays at 75 (with V-Sync on).

All those tips about disabling HAGS, fullscreen optimizations, rBAR, etc. decreased performance in my case rather than fixing anything.
Last edited by jano; Apr 16, 2024 @ 12:59am
Showing 61-75 of 75 comments
Zephyr Apr 18, 2024 @ 1:14pm 
Originally posted by Majestic:
Actually, the game itself is trimming memory, because I can see it go up to 11K when I'm panning the camera like mad and running around to force as many assets per second onto the screen as possible. But when I just stand still, I can see it drop.

So the game is culling assets from memory when it can, so it makes sense the OP can force it with this fix. I'm just saying it shouldn't have been necessary on a GPU of his calibre, which should have been easily able to play the game. But because Nvidia is greedy, they didn't spend the 25 euros extra to put 16 GB on it.

It's such a shame, and it just means the 3070s will become e-waste even faster.
Yes, for me it also goes up to 11-12 GB VRAM max, but I play at 4K and very high settings.

However, I also tested 1440p at very high and high, and there too it allocates 9-10 GB or so and uses 8-9 GB per process, sometimes even more I guess.
Zephyr Apr 18, 2024 @ 1:22pm 
Originally posted by jano:
Originally posted by Zephyr:

Where I do have to agree is that the game's recommended-spec sheet creates false expectations. They list an RTX 3070 for native 1440p high at 60 FPS, and in the current release that is not really enough anymore, likely due to VRAM problems. By now it is simply the wrong positioning. Modern games often require more VRAM; the RTX 3070 is NOT the sweet spot for 1440p high/ultra in newer releases anymore. It may manage in some, but in many others there will already be problems.

My point is that the RTX 3070 can run it at very high settings in 1080p at an average 70 FPS, but the FPS drops after some time (for example to 50), and then changing the graphics settings as described in my first post in this discussion fixes the FPS (Aloy is in the very same place, because I just press ESC and do the trick). So it should be fixable. We can talk about GPUs for future gaming, but that is not the point of this discussion.
Well, you use considerably more than standard 1080p resolution, which is 1920x1080. What you and others describe here might be some issue with VRAM not being cleared, but it is not a classical VRAM leak. It might be how assets are loaded in the game, or whatever. The question is whether this can be fixed. It is really not OK to recommend GPUs that cannot run the game properly at the advertised settings; on that we can all agree, I guess. Maybe you and the others will be in luck, but honestly I would not expect too much. In the best case it is a driver thing; in the worst case it is an issue that cannot be fixed (or that the devs/publishers do not want to fix) on 8 GB VRAM GPUs. Only time will tell.
Majestic Apr 18, 2024 @ 1:28pm 
Originally posted by Zephyr:
It is really not OK to recommend GPUs that cannot run the game properly at the advertised settings; on that we can all agree, I guess.

I don't know; what do you think the fallout would be if they recommended an RTX 4080 to run the game properly? The developer is stuck between a rock and a hard place, with Nvidia users making up over 80% of the userbase on Steam and all of them apparently ignorant of how much VRAM matters.

So what are Sony/Guerrilla/Nixxes to do? You can already picture the plethora of headlines if they set a 4080 as the minimum requirement, because 12 GB isn't enough at 4K either. It's not the developers' fault that AMD offers 16 GB on $500-MSRP cards while Nvidia charges you 1200 for it.
Zephyr Apr 18, 2024 @ 1:43pm 
Originally posted by Majestic:
Originally posted by Zephyr:
It is really not OK to recommend GPUs that cannot run the game properly at the advertised settings; on that we can all agree, I guess.

I don't know; what do you think the fallout would be if they recommended an RTX 4080 to run the game properly? The developer is stuck between a rock and a hard place, with Nvidia users making up over 80% of the userbase on Steam and all of them apparently ignorant of how much VRAM matters.

So what are Sony/Guerrilla/Nixxes to do? You can already picture the plethora of headlines if they set a 4080 as the minimum requirement, because 12 GB isn't enough at 4K either. It's not the developers' fault that AMD offers 16 GB on $500-MSRP cards while Nvidia charges you 1200 for it.
Yes, I thought about this too. It brought me to the question of why they did not recommend the RX 6750 XT with 12 GB VRAM for 1440p high/very high. That GPU matches the RTX 3070 almost exactly in rasterization power and has "only" 12 GB VRAM. The RX 6800 they actually recommend is more powerful in rasterization AND has 16 GB VRAM. That is the part I find really weird; I think they knew there were going to be problems with 8 GB VRAM and may even have thought 12 GB would cut it close, hence the RX 6800 recommendation.

Sure, as a dev you are in a hard place there, but recommending something that is not going to work as recommended is really bad practice if they truly knew and this is not simply a weird bug that can be fixed. They COULD have chosen the RTX 4070 with 12 GB VRAM or the RTX 4060 Ti 16 GB instead of the RTX 3070 and underscored the VRAM :). That would have CLEARLY indicated where the VRAM journey is going, and you would still have been firmly in the same tier as the RTX 3070 or the RX 6750 XT, just one generation on (without too many performance differences).
Last edited by Zephyr; Apr 18, 2024 @ 1:46pm
Majestic Apr 18, 2024 @ 1:49pm 
Originally posted by Zephyr:
Sure, as a dev you are in a hard place there, but recommending something that is not going to work as recommended is really bad practice if they truly knew and this is not simply a weird bug that can be fixed. They COULD have chosen the RTX 4070 with 12 GB VRAM or the RTX 4060 Ti 16 GB instead of the RTX 3070 and underscored the VRAM :). That would have CLEARLY indicated where the VRAM journey is going, and you would still have been firmly in the same tier as the RTX 3070 or the RX 6750 XT, just one generation on (without too many performance differences).
Again though, people look at the GPU name, not the VRAM. A recommended 4070 would also have drawn scorn and negative feedback about how unoptimized the game is. The problem is that it runs fine on a 3070, right up until the VRAM saturates.

These developers can't win with 80% of the userbase on Nvidia and its dodgy planned-obsolescence BS hardware. At this point it's almost a litmus test of who still buys their hardware.
Zephyr Apr 18, 2024 @ 2:31pm 
Originally posted by Majestic:
Originally posted by Zephyr:
Sure, as a dev you are in a hard place there, but recommending something that is not going to work as recommended is really bad practice if they truly knew and this is not simply a weird bug that can be fixed. They COULD have chosen the RTX 4070 with 12 GB VRAM or the RTX 4060 Ti 16 GB instead of the RTX 3070 and underscored the VRAM :). That would have CLEARLY indicated where the VRAM journey is going, and you would still have been firmly in the same tier as the RTX 3070 or the RX 6750 XT, just one generation on (without too many performance differences).
Again though, people look at the GPU name, not the VRAM. A recommended 4070 would also have drawn scorn and negative feedback about how unoptimized the game is. The problem is that it runs fine on a 3070, right up until the VRAM saturates.

These developers can't win with 80% of the userbase on Nvidia and its dodgy planned-obsolescence BS hardware. At this point it's almost a litmus test of who still buys their hardware.
Hm, I do not know... The RTX 4060 Ti 16 GB would not have hurt too much as a recommendation. This is not going away anymore, and even if this particular game gets optimized enough to work properly with 8 GB VRAM, in 1-2 years most newly released AAA games likely just will not, except at 1080p. At some point they will need to take the plunge, and soon :).

But whatever. More VRAM would improve a lot of things. Vanilla Cyberpunk has really ugly textures but stays below 8 GB VRAM. There is a better HD-texture mod that increases VRAM need by just 1-2 GB, but at resolutions above 1080p you then exceed 8 GB of VRAM use, which is likely why the original release has such ugly low-res textures.

Well... It would still be good if everyone could soon play these games as per the recommendations. That would give a little hope for the next releases, like Ghost of Tsushima.
Last edited by Zephyr; Apr 18, 2024 @ 2:32pm
Majestic Apr 18, 2024 @ 2:42pm 
Originally posted by Zephyr:
Hm, I do not know... The RTX 4060 Ti 16 GB would not have hurt too much as a recommendation.

Arguably the worst value card in recent history.
Ellis_Cake Apr 18, 2024 @ 2:52pm 
It is fine tho, since the 8gb version runs just fine.
Zephyr Apr 18, 2024 @ 3:05pm 
Originally posted by Majestic:
Originally posted by Zephyr:
Hm, I do not know... The RTX 4060 Ti 16 GB would not have hurt too much as a recommendation.

Arguably the worst value card in recent history.
It still would have been better than the RTX 3070 in the current state :). It is also within +/-5% of the RTX 3070 in raw power, so compared to the RTX 3070 the value is not really bad, I think. Not that you should necessarily buy one, but nobody is going to buy a new RTX 3070 anymore either.
Last edited by Zephyr; Apr 18, 2024 @ 3:14pm
Disannul Apr 18, 2024 @ 4:00pm 
Originally posted by Zephyr:
Originally posted by Disannul:
Even a 3080 10GB can't do a solid 30 FPS at 4K. You have to tweak settings down to a high/mid mix and likely use some sort of upscaling to achieve good lows. The 3000 series being marketed as 4K-capable was always partially BS from Nvidia. Sure, some games can do a smooth 30 or even 60 FPS at native 4K with a 3070+, but those are not GPU-intensive games. Even the 4000 series can't do native 4K well in many modern titles; again you need upscaling and/or frame gen. This is for regular raster as well; forget ray tracing. False advertisement plus poor consumer expectations can be seen playing out here. Can the game be optimized? Yes. Will optimizations fix the 4K frame rate? No. Native 1440p with a 3070+ is the sweet spot, like in many modern GPU-intensive games. Can Nixxes optimize DirectStorage? Maybe. I would be curious if they ever offer an option to turn it off, but then they would need to implement some sort of memory culling, which, given whatever engine, port hacks, dev experience, and licenses they are beholden to, is unknown since we don't have that info; it's a whole other rabbit hole.

About 4K we do not have to argue. Anything below an RTX 4070 Ti Super or RX 7900 XT is not really worth talking about, unless you are OK with less than 60 FPS (and an RTX 3070 surely will not manage a stable, smooth 30 FPS here, like you said). Also, the false marketing by Nvidia is obvious: without DLSS or frame gen, everything below an RTX 4080 falls quite short when using heavy ray tracing.

Where I do have to agree is that the game's recommended-spec sheet creates false expectations. They list an RTX 3070 for native 1440p high at 60 FPS, and in the current release that is not really enough anymore, likely due to VRAM problems. By now it is simply the wrong positioning. Modern games often require more VRAM; the RTX 3070 is NOT the sweet spot for 1440p high/ultra in newer releases anymore. It may manage in some, but in many others there will already be problems.

Ty for reminding me of the core issue; classic response, my bad for focusing on the last few posts and on personal bias due to my hardware. We are also 100% talking about native resolution, no upscaling (cheating). I can't see a 3070 doing native 1440p at ultra here; I already need dynamic resolution scaling for 60 FPS at 4K with mid textures and mid/high settings on a 3080, plus a restart after 2-3 hours. So another false advertisement, not from Nvidia but from Nixxes?

By sweet spot I mean the average. Sure, some new games require a demanding hardware setup to hit max settings at 1440p 60 FPS in raster or ray tracing, but if you look at all the games released in the past two years, then on average you don't need the hardware this game requires for the same FPS.

This is a legit issue, though. Regardless of what Nixxes claims, which we can agree is BS (the degree of BS is another argument), HFW is an outlier in what is needed to get 60 FPS at max/high settings at native 1440p and/or 4K. I feel like the graphics marketing from the involved parties (e.g., Nvidia and Nixxes here) is way out of hand. Game graphics have always been a marketing issue (hype, all right), but lately it feels even less anchored in reality.

Originally posted by Majestic:
Actually, the game itself is trimming memory, because I can see it go up to 11K when I'm panning the camera like mad and running around to force as many assets per second onto the screen as possible. But when I just stand still, I can see it drop.

So the game is culling assets from memory when it can, so it makes sense the OP can force it with this fix. I'm just saying it shouldn't have been necessary on a GPU of his calibre, which should have been easily able to play the game. But because Nvidia is greedy, they didn't spend the 25 euros extra to put 16 GB on it.

It's such a shame, and it just means the 3070s will become e-waste even faster.

I could not get this fix to keep working past a certain amount of time. Medium textures need a reset every 2-3 hours depending on how much I travel; resetting to low for a few seconds and then back to medium gave me another hour. I also tested with high and very high, same result, just not lasting as long. It works to add time, but at some point it simply stops working. So some culling is happening, which is nice, and I can play a fair bit longer with the fix, but it isn't permanent. A very nice temporary fix in any case.
yamaci Apr 18, 2024 @ 9:37pm 
No need for brain gymnastics. Avatar: Frontiers of Pandora looks better than Forbidden West, has more advanced real-time lighting, and has no issues running on 8 GB cards even at 1440p, and none on 12 GB cards at 4K. It streams textures properly and, most of the time, streams out textures in a way that is not detrimental to the overall perceived image quality.

Despite having poor performance overall, even most Unreal Engine 4 and 5 titles are not like this with VRAM. Most UE4/UE5 titles have proper texture streamers that simply just work on 8-12 GB cards.

And Frontiers of Pandora is more of a next-gen game than Horizon Forbidden West ever will be (Forbidden West has a last-gen version and its design is rooted in last gen, whereas Frontiers of Pandora is exclusive to next gen). Alan Wake 2 is another title that manages VRAM perfectly, and I was able to play it at 1440p on my 3070 without any issues. On a few occasions I could see it streaming in lower-quality tree textures in the very far distance, and that was it. (Pandora is comparable to Forbidden West, and Alan Wake 2 is comparable to The Last of Us Part 1.)

And stop making excuses for developers. I'm not making any excuses for NVIDIA, and I'm not saying there should not be compromises. But Nixxes ports do not even offer a proper compromise: you can set textures to low and still get high PCIe usage and extreme FPS drops with poor 1% lows. Leaning on PCIe transfers is not a solution on PC, as the bus is not optimized or intended to be used that way.

Do whatever Frontiers of Pandora and Alan Wake 2 are doing. If not, don't blame the hardware so much. It is clear these VRAM budgets are workable for those games, so I don't see why Nixxes would have to reinvent the wheel.

The compromise should be simple: lower-quality textures at far, or maybe medium, distances. That is it. If they cannot implement that kind of engine-level streamer, they should not advertise the 3070 as capable of 1440p/high 60.
Last edited by yamaci; Apr 18, 2024 @ 9:40pm
Majestic Apr 19, 2024 @ 12:40am 
>no need for brain gymnastics
>goes on a post-purchase rationalizing rant
>uses a Ubislop game as an example of a good game

I love the internet.
Last edited by Majestic; Apr 19, 2024 @ 12:43am
Ellis_Cake Apr 19, 2024 @ 3:14am 
Originally posted by yamaci:
No need for brain gymnastics. Avatar: Frontiers of Pandora looks better than Forbidden West, has more advanced real-time lighting, and has no issues running on 8 GB cards even at 1440p, and none on 12 GB cards at 4K. It streams textures properly and, most of the time, streams out textures in a way that is not detrimental to the overall perceived image quality.

Despite having poor performance overall, even most Unreal Engine 4 and 5 titles are not like this with VRAM. Most UE4/UE5 titles have proper texture streamers that simply just work on 8-12 GB cards.

And Frontiers of Pandora is more of a next-gen game than Horizon Forbidden West ever will be (Forbidden West has a last-gen version and its design is rooted in last gen, whereas Frontiers of Pandora is exclusive to next gen). Alan Wake 2 is another title that manages VRAM perfectly, and I was able to play it at 1440p on my 3070 without any issues. On a few occasions I could see it streaming in lower-quality tree textures in the very far distance, and that was it. (Pandora is comparable to Forbidden West, and Alan Wake 2 is comparable to The Last of Us Part 1.)

And stop making excuses for developers. I'm not making any excuses for NVIDIA, and I'm not saying there should not be compromises. But Nixxes ports do not even offer a proper compromise: you can set textures to low and still get high PCIe usage and extreme FPS drops with poor 1% lows. Leaning on PCIe transfers is not a solution on PC, as the bus is not optimized or intended to be used that way.

Do whatever Frontiers of Pandora and Alan Wake 2 are doing. If not, don't blame the hardware so much. It is clear these VRAM budgets are workable for those games, so I don't see why Nixxes would have to reinvent the wheel.

The compromise should be simple: lower-quality textures at far, or maybe medium, distances. That is it. If they cannot implement that kind of engine-level streamer, they should not advertise the 3070 as capable of 1440p/high 60.

Yeah, no need for gymnastics:
Forbidden West looks great
and can be run fine with 8 GB VRAM.

There, even less gymnastics, and exercised brevity ^^
jano Apr 19, 2024 @ 6:31am 
Originally posted by Majestic:
Actually, the game itself is trimming memory, because I can see it go up to 11K when I'm panning the camera like mad and running around to force as many assets per second onto the screen as possible. But when I just stand still, I can see it drop.

So the game is culling assets from memory when it can, so it makes sense the OP can force it with this fix. I'm just saying it shouldn't have been necessary on a GPU of his calibre, which should have been easily able to play the game. But because Nvidia is greedy, they didn't spend the 25 euros extra to put 16 GB on it.

It's such a shame, and it just means the 3070s will become e-waste even faster.

If one were to call the rendering code a pipe, something clogs that pipe slowly. It is as if, when VRAM runs out, something is dumped into RAM; then, when you move away from the demanding location (e.g., a city), VRAM is cleared, yet what was needed in the wilderness (some textures or whatever) does not go back into VRAM but stays in RAM and is sort of used for rendering from there. This is just an example; I personally have no idea what is clogging this rendering pipe, and apparently Nixxes' employees don't either.

The bottom line: it doesn't bother me that I get 50 FPS in the city or in other places with lots of textures/objects. What bothers me is that when I leave the city and go to the woods, I still have 50 FPS, even though earlier I was walking in the woods for a long time at 75 FPS. This is a bug. It's not about the 8 GB of VRAM, because as I wrote a while ago, I was roaming the forest for a long time at 75 FPS and everything cleared up. I have seen Red Dead Redemption 2 at max settings run at 40 FPS on an RTX 3070, and this Pandora game gets 60 FPS on a 3070 (and it does look as impressive as HFW).
Last edited by jano; Apr 19, 2024 @ 6:45am
Disannul Apr 19, 2024 @ 10:49am 
Originally posted by yamaci:
And stop making excuses for developers. I'm not making any excuses for NVIDIA, and I'm not saying there should not be compromises. But Nixxes ports do not even offer a proper compromise: you can set textures to low and still get high PCIe usage and extreme FPS drops with poor 1% lows. Leaning on PCIe transfers is not a solution on PC, as the bus is not optimized or intended to be used that way.

I did not excuse Nixxes/Nvidia; I literally said Nixxes and Nvidia have done bad marketing. It's right there in my post.

What I think you are referring to as making excuses is when I wrote that Nixxes optimizing the game to fix this issue is potentially a no-go for various reasons. We really don't know enough about Nixxes to say. There is nothing wrong with noting that maybe they can't because of budget, personnel experience, porting issues, licensing obligations, legacy code, etc. This applies to all big technical projects, not just game development. If you have ever worked on or managed a very technical project, there are always times when you wish you could do something but simply can't because of some limitation; the deliverable is almost always a compromise to some degree. In addition, depending on the field, you may need full accounting and audit compliance that must be publicly available, so the "excuses" have to be explained publicly. Nixxes is not an institution in a field that requires that, so they can just say nothing if they want. So we don't know.

Since we don't know Nixxes' internal details, given that they don't have to say anything, it's fair to say they potentially can't optimize/fix this even if they want to. Or maybe they can, but won't due to project limitations. Or maybe they are just greedy. We don't know. I won't throw too much shade without proper information. To me this is not making excuses, just acknowledging reality. I can get annoyed at Nixxes for bad marketing, but not for the lack of a VRAM-issue fix. Why get emotionally invested when you don't have any info? So tiring.
Last edited by Disannul; Apr 19, 2024 @ 10:50am

Date Posted: Apr 16, 2024 @ 12:45am
Posts: 75