FINAL FANTASY XVI

arlesgiovanny Oct 3, 2024 @ 7:24am
terrible optimization.
The game has terrible optimization. The framerate drops constantly, and the game is unplayable because of it.
StingingVelvet Oct 5, 2024 @ 5:16pm 
Originally posted by Fulgerion:
That is because it is created for consoles. And optimized specifically for the PS5.

It didn't run well on PS5 either; tons of tech channels and users complained about it. The game just isn't well optimized in general.
PoetsReap Oct 5, 2024 @ 6:12pm 
I'm almost 50 hours in and now I'm crashing in a cutscene repeatedly. I could deal with the frame drops and lag, but currently it seems I won't be able to continue playing the game....
MageTea Oct 5, 2024 @ 6:59pm 
I got about 45 hours into the game while dealing with a micro-stutter every 5-10 seconds. In the process of trying to resolve that, I updated the drivers for all my components to the latest versions. After all those updates my game now crashes after only a few minutes of gameplay, and I am still experiencing the stutter through all of it.
iameatingjam Oct 5, 2024 @ 7:08pm 
Originally posted by PoetsReap:
I'm almost 50 hours in and now I'm crashing in a cutscene repeatedly. I could deal with the frame drops and lag, but currently it seems I won't be able to continue playing the game....

The one after Bahamut? I hear that a lot. Though, once again, it was fine for me.
Xengre Oct 5, 2024 @ 7:54pm 
Originally posted by alumlovescake:
Originally posted by Xengre:
Originally posted by alumlovescake:
Watch digital foundry's video

the game actually runs that badly
Actually, they didn't say "the game runs that badly".

They said they have no idea why the freak it is using FSR 1, which makes it so blurry and bad-looking when FSR 2 is available. They have actually made more recent videos specifically targeting how FSR 1 is harming consoles because, for some bizarre reason, devs keep using FSR 1 instead of newer versions on consoles.

I never said Digital Foundry said that.
That's what I said
What? This is literally what you are saying.

You're saying, in my quoted post of your comment, to watch Digital Foundry's video and then immediately state "the game runs that badly". Your post, literally, has nothing else between those two statements. This means you're implying, very directly, that watching Digital Foundry's video validates that the game "runs that badly". There is no other way to interpret that. Even worse, this is the opposite of what Digital Foundry actually stated...



Originally posted by alumlovescake:
I said the DF video proves that the game is 720p sub-60fps, which is true, as they literally state the resolution and show gameplay with an fps counter
No, that is not what you said in that post. It was an extension of your prior post, yes, but the point you made there was that it runs "that badly" across the board. The prior post was merely used as proof of the point, and a wrong one at that, since it doesn't run at 720p 40 FPS... It runs at a higher resolution when hitting 40 FPS.

It actually hits 60 FPS when running at 720p, per Digital Foundry's investigation, and it would probably run higher, but they cap it at 60 FPS. The resolution drop only occurs in combat, where the game dynamically lowers resolution to target 60 FPS, and even in combat it is often above 720p. Outside of combat it runs at a much higher resolution...

Per Digital Foundry:
Quality mode is typically 1080p to 1440p, upscaled to 4K, while performance mode seems to top out at 1080p but often drops lower.

...but the moment you start combat, the frame-rate rockets up to a near-perfect 60fps.

It turns out that when the developers mentioned they were targeting 60fps in battle only, they meant exactly that. The second you initiate combat, the internal resolution drops like a stone, hitting as low as 720p in the process.

They took issue with the developer using FSR 1 instead of the much-improved FSR 2, and with performance mode behaving oddly outside of combat: when exploring the world its performance is close to quality mode's, despite targeting noticeably lower quality.
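
To be clear about what that combat behavior is mechanically, dynamic resolution scaling is just a feedback loop on GPU frame time. A minimal sketch of the idea (hypothetical code, not the game's actual implementation):

#include <algorithm>

// Minimal dynamic-resolution feedback loop (illustrative only).
// Each frame, nudge the internal render scale so GPU frame time converges
// on the target (16.6 ms for 60 fps). The combat behavior DF describes
// acts like a loop of this kind with a floor around 720p on a 2160p output.
struct DynamicResScaler {
    float scale    = 1.0f;   // fraction of output resolution per axis
    float minScale = 0.33f;  // ~720p when the output is 2160p
    float maxScale = 1.0f;

    void update(float gpuFrameMs, float targetMs = 16.6f) {
        // Proportional step: slower than target -> shrink resolution,
        // faster than target -> grow back toward full resolution.
        float error = (targetMs - gpuFrameMs) / targetMs;
        scale = std::clamp(scale * (1.0f + 0.5f * error), minScale, maxScale);
    }

    int renderWidth(int outW) const  { return int(outW * scale); }
    int renderHeight(int outH) const { return int(outH * scale); }
};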



Originally posted by alumlovescake:
I don't care what DF say.
A bit hypocritical, no?
Originally posted by alumlovescake:
Watch digital foundry's video

the game actually runs that badly

So you like to use them as a reference when convenient, but when it's pointed out that it isn't convenient, they're suddenly irrelevant? Especially when you incorrectly cited them as evidence, expecting no one to fact-check that the evidence actually runs completely contrary to your claim. Weird.


Originally posted by alumlovescake:
1080p30 and 720p40 is not acceptable, not with graphics that are outclassed by last-gen games
Again, this information is presented inaccurately. See above points.


Originally posted by alumlovescake:
DF usually are truthful, trustworthy and unbiased but their FF16 videos are just them shilling
The hole gets deeper and deeper.


Originally posted by alumlovescake:
In other videos DF showcase games with better graphics and resolution, and slander those games for dropping a few frames.
While DF has its problems at times, that isn't relevant to this discussion. You only need to present statements based on the evidence for this specific situation. You can either point out where it is or isn't valid on a factual level, or not use it at all. You're not doing that here, though. I, however, am.

Originally posted by alumlovescake:
The only thing in the FF16 videos that should be taken into account is the facts they state, which are the resolution and framerate, both of which are laughably bad.
Nope. If you read the article instead of incorrectly citing it as proof of something they never stated, you might know there is more relevant context than simply "resolution and FPS". That matters even more when the resolution and FPS figures you are giving aren't even correct to begin with... It is clear that performance mode isn't functioning correctly compared to quality mode outside of combat, and that their dynamic resolution feature needed improvement (in fact it has been patched multiple times since release, including on PC, for not working ideally). Most importantly, their core issue was the use of FSR 1 instead of FSR 2, which isn't even an optimization issue, just a bizarre choice from the developer (many developers, actually, as Square Enix oddly isn't the only one doing this on console), and fixing it would have improved both image quality and performance quite a bit for the performance mode (and improved quality mode even further).



As for other comments I've seen in this thread and in general about how "even others with RTX 4090s are getting horribly low framerates", here are my own results in two of the most widely cited worst areas in the game, Lostwing and the Hideout.
https://youtu.be/wK9l1G-dwEE
StingingVelvet Oct 5, 2024 @ 9:22pm 
Originally posted by Xengre:
As for other comments I've seen in this thread and in general about how "even others with RTX 4090s are getting horribly low framerates", here are my own results in two of the most widely cited worst areas in the game, Lostwing and the Hideout.
https://youtu.be/wK9l1G-dwEE

You have DLSS Quality and Frame Gen on, which means you're actually getting 1440p and like 60-80fps on a 4090, which isn't good at all. That's way worse than games like Cyberpunk with RT on, which is nuts.
Xengre Oct 6, 2024 @ 12:10am 
Originally posted by StingingVelvet:
Originally posted by Xengre:
As for other comments I've seen in this thread and in general about how "even others with RTX 4090s are getting horribly low framerates", here are my own results in two of the most widely cited worst areas in the game, Lostwing and the Hideout.
https://youtu.be/wK9l1G-dwEE

You have DLSS Quality and Frame Gen on, which means you're actually getting 1440p and like 60-80fps on a 4090, which isn't good at all. That's way worse than games like Cyberpunk with RT on, which is nuts.
Sorry to break it to you but that isn't right.

Having tested both here are the results:

Final Fantasy XVI
57-70 FPS in Hideout Native 4K
41-56 FPS in Lostwing Native 4K
You have the rest in my video above involving DLSS Quality + Frame Generation

Cyberpunk 2077 performance with RTX 4090
32 - 37 FPS 4K Native (path tracing)
68 - 82 FPS 4K Native (no ray tracing at all)
68 - 80 FPS 4K DLSS Quality Frame Generation (path tracing)

so not quite (as in not even close) what you thought compared to FFXVI...

Your argument is "but Cyberpunk 2077 has ray/path tracing so of course it is more demanding!" but the reality is...
https://www.youtube.com/watch?v=6emAxMeqZHU
Yes, without path tracing or ray tracing it doesn't look so great. Not bad, sure, but nowhere near as impressive, falling flat despite taking a more photorealistic art style (a heavy emphasis, though not purely photorealistic) where Final Fantasy XVI intentionally uses an illustrative, non-photorealistic style... Cyberpunk 2077 no longer looks that great compared to XVI, yet offers similar performance depending on where you are in the two games.

It should also be pointed out that while Cyberpunk 2077 does look good, it is the overall package that carries it. Individual assets like character models/textures are not very good... Not to mention its facial animations and full-body animations absolutely pale in comparison to FFXVI's:

I'll post a few examples.
https://www.youtube.com/watch?v=23pBIiGGJNw
https://www.youtube.com/watch?v=j1DUgBPgUI8
https://www.youtube.com/watch?v=QzaFSEctZIs

Also general textures low / med / high (more about this later and a key point of what you're missing)
https://www.youtube.com/watch?v=slKfRmVQ2Yc


Now check out the character animations here, especially the facial animations for Cid and Benedikta, which are frankly some of (arguably THE) best currently available in any game. (I couldn't find a quick video that wasn't playing at lower settings, but this one is meant to show facial animation rather than graphics, and despite being 1080p I think the image quality is still clear enough.)
https://youtu.be/fOWJl9qEovA?si=mNKD0uctu62Sr_Fi&t=2020
Check starting at 33:40 for their interaction preceding the fight scene. Go back to Cyberpunk 2077's above now and, yeah, they're not in the same league.

Some more gameplay to show off FFXVI that isn't the bleak, drab flashback portion at the start of the game:
https://www.youtube.com/watch?v=mDCWz2YPvCk


Remember, this game is not aiming for a photorealistic render style. It is more realistic than some of the prior entries, but with a heavy emphasis on an illustration style. Thus it doesn't "look" as immediately superb as some more photorealistic games because of its aesthetic, not because it isn't graphically impressive on a technical level.

Back to the texture quality issue I mentioned before. Notice how the steps between medium and high aren't that significant for Cyberpunk 2077 above? This is how graphics scaling works: there are severe diminishing returns the higher your resolution, texture quality, poly count, etc. go. In fact, a rather popular way to save performance is cutting notably inefficient high-quality options, shadows being one popular example, down to normal while retaining nearly as good quality. Usually texture quality and lighting have the biggest overall quality impact compared to most other settings, despite the fact that other settings can be heavy.
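
To put rough numbers on that non-linear scaling, consider resolution alone. The pixel counts below are facts; treating GPU cost as roughly linear in pixels is a simplification for illustration:

#include <cstdio>

// Pixel counts per common resolution. GPU cost scales roughly linearly
// with pixels rendered, while perceived sharpness does not, which is
// where the diminishing returns described above come from.
int main() {
    struct Res { const char* name; int w, h; };
    const Res steps[] = { {"1080p", 1920, 1080},
                          {"1440p", 2560, 1440},
                          {"4K",    3840, 2160} };
    const double base = 1920.0 * 1080.0;
    for (const Res& r : steps)
        std::printf("%-5s = %9d px (%.2fx the pixels of 1080p)\n",
                    r.name, r.w * r.h, (double(r.w) * r.h) / base);
}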


Now, if you want to try to fix your performance further, look into SpecialK's Discord (FFXVI channel) and try out Kaldaien's newer updates (his username is CatGPT on Discord). He is the creator of SpecialK; I heard he had implemented DirectStorage-related elements in SpecialK, and I had my own suspicions about DirectStorage being a contributing factor to the performance some have seen here. From what I read looking it up a moment ago, he seems to be of the same mind, is still investigating, and has made tweaks that dramatically improve results in some instances, so it may be worth checking out.

Not many games use DirectStorage on PC (Forspoken, which did amazingly with it... Ratchet and Clank, which did so badly with it that it's a joke and still hasn't been fixed since release... and Final Fantasy XVI). Square Enix considered it important for the PC port because of how IO-intensive the game was on PS5, and Yoshida and the team kept praising how much the PS5's IO capabilities contributed to supporting the high quality assets and complex scenes in this game. Unfortunately, the trend so far is that developers struggle to implement GPU decompression properly on PC (or perhaps this is actually a Microsoft issue... hard to say at the moment). In case you were wondering why Forspoken does well: it uses CPU decompression. Further, Kaldaien's tweaks I mentioned involve forcing CPU decompression where FFXVI defaults to GPU decompression and, as he put it, this makes loading "instant" and fixes performance issues. He was still tweaking it to see how much it could be improved while maintaining stability. The situation regarding DirectStorage, unfortunately, isn't as simple as "needs optimization". It is more complex than that, not to mention a relatively new, largely untested technology in complex games.
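
For the curious, "forcing CPU decompression" corresponds to a real knob in the DirectStorage runtime. A rough sketch assuming the DirectStorage 1.1 SDK (illustrative only, not SpecialK's actual code; it has to run before the game first creates the factory, which is why a hook tool intervenes so early):

#include <dstorage.h>    // DirectStorage SDK (1.1+), link dstorage.lib
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Disable the GPU decompression path globally so the runtime falls back
// to its built-in CPU decompressor. Only takes effect if called before
// the process creates its DirectStorage factory.
bool ForceCpuDecompression()
{
    DSTORAGE_CONFIGURATION config{};               // zeroed = defaults
    config.DisableGpuDecompression = TRUE;         // use the CPU path instead
    config.NumBuiltInCpuDecompressionThreads = 0;  // 0 = runtime picks a count

    if (FAILED(DStorageSetConfiguration(&config)))
        return false;                              // factory already created?

    ComPtr<IDStorageFactory> factory;
    return SUCCEEDED(DStorageGetFactory(IID_PPV_ARGS(&factory)));
}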

Back to the issue of textures and quality configurations. The game's highest settings really aren't meant for anything but the most powerful hardware. It is the same situation as Wukong: these are essentially inefficient, future-proofed maxed settings. Don't expect to play with them at native resolution, or at least not on anything but fairly high-end hardware at a performance compromise. As I pointed out before, as quality improves, performance cost and quality gain do not scale linearly. You can test this by lowering the settings a bit rather than using max.

Most important is to use a higher resolution, 4K preferably, but otherwise 1440p, even if it means sacrificing a number of other settings; it makes the biggest impact on image quality while trying to maintain performance. Fortunately, upscaling does an amazing job here (at least DLSS, in my experience). There have been upscaling issues for some, and Square Enix has released patches, but there could also be related driver issues... (I'm using a driver pre-dating any FFXVI profiles, actually, and had no upscaling quality/performance issues or deactivation bugs.) There isn't an excuse not to use DLSS, particularly set to Quality, if you can set it to 4K and get decent frame rates. Frame gen can be a more complex pro/con situation, though.

Thus, when you're pushing such settings and still trying to run at native without upscaling, this is more of a user issue than an optimization problem. Further, it doesn't make for valid comparisons such as your Cyberpunk 2077 comparison, especially when that game has a ton of low-quality assets for characters/environments but gets by with them through good art direction, understanding the power of not being super close to assets most of the time (distance hiding blemishes), quality lighting/RT/PT, etc.
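
Since the DLSS modes keep coming up in these comparisons, here is what each mode renders internally at a 4K output, assuming the standard published per-axis scale factors (a quick sketch, not game code):

#include <cstdio>

// Internal render resolution per DLSS mode at a 3840x2160 output, using
// the standard published per-axis scale factors (assumes no dynamic
// resolution layered on top).
int main() {
    struct Mode { const char* name; float scale; };
    const Mode modes[] = { {"Quality",           0.667f},
                           {"Balanced",          0.580f},
                           {"Performance",       0.500f},
                           {"Ultra Performance", 0.333f} };
    const int outW = 3840, outH = 2160;
    for (const Mode& m : modes)
        std::printf("%-17s -> %4d x %4d internal\n",
                    m.name, int(outW * m.scale), int(outH * m.scale));
}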
Last edited by Xengre; Oct 6, 2024 @ 12:19am
iameatingjam Oct 6, 2024 @ 12:51am 
Originally posted by Xengre:
Sorry to break it to you but that isn't right. [full post quoted above]

Thanks for going into so much detail :steamthumbsup:

I also have a 4090 and have been playing without issue at 4K60 with a little DLSS. I've been feeling like there's some factor affecting gameplay, since reports vary so drastically even on the same hardware. Perhaps it is DirectStorage. I know at least that a GPU with lower PCIe bandwidth (like being in a chipset or older-gen slot) suffers MAJOR slowdown.
Last edited by iameatingjam; Oct 6, 2024 @ 12:56am
StingingVelvet Oct 6, 2024 @ 12:03pm 
Originally posted by Xengre:
Sorry to break it to you but that isn't right.

Having tested both here are the results:

Final Fantasy XVI
57-70 FPS in Hideout Native 4K
41-56 FPS in Lostwing Native 4K
You have the rest in my video above involving DLSS Quality + Frame Generation

Cyberpunk 2077 performance with RTX 4090
32 - 37 FPS 4K Native (path tracing)
68 - 82 FPS 4K Native (no ray tracing at all)
68 - 80 FPS 4K DLSS Quality Frame Generation (path tracing)

Didn't mean path tracing, but whatever you think of my comparison, the point is that 40fps at 4K with a 4090 is ridiculously low performance for a game that doesn't even have RT at all.
Xengre Oct 6, 2024 @ 1:29pm 
Originally posted by StingingVelvet:
Originally posted by Xengre:
[FPS results quoted above]

Didn't mean path tracing, but whatever you think of my comparison, the point is that 40fps at 4K with a 4090 is ridiculously low performance for a game that doesn't even have RT at all.
No, it isn't 40 FPS because it is pushing the RTX 4090 to the max. Rather, it is hitting a CPU bottleneck first, so the full power of the RTX 4090 isn't even being used: it typically sits at 76-88% GPU usage (rarely spiking to 90-98% for a brief split second) in Lostwing at native 4K max settings.

You forget that one of the core reasons Frame Generation was developed was to alleviate the problem of a CPU bottleneck preventing your frame rate from climbing higher and giving smoother on-screen motion. Frame generation assists with precisely this as one of its core benefits.
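
As a crude illustration of why frame generation helps in a CPU-limited scene (a simplified model with made-up numbers, not profiler data):

#include <algorithm>
#include <cstdio>

// Simplified model: without frame generation, the presented rate is
// capped by whichever side is slower; frame generation inserts one
// interpolated frame per rendered frame, roughly doubling presentation
// while the CPU-side simulation rate stays unchanged. Numbers below are
// illustrative only.
int main() {
    float cpuFps = 45.0f;   // e.g. a CPU-limited scene like Lostwing
    float gpuFps = 90.0f;   // the GPU could render faster at this resolution

    float base   = std::min(cpuFps, gpuFps);  // 45 fps presented
    float withFG = 2.0f * base;               // ~90 fps presented
    std::printf("no FG: %.0f fps | with FG: ~%.0f fps (sim still %.0f Hz)\n",
                base, withFG, cpuFps);
}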

Now, weaker GPUs may hit a frame cap, but they're weaker and running at settings they can't support in a heavy area, along with the GPU decompression, which is the other point you are neglecting that I pointed out. The game employs GPU decompression by default, which causes the heavier GPU usage and, for some users, more noticeable FPS drops/stuttering. You can follow the earlier tips to try CPU decompression with the mod, but be warned: you need a powerful CPU, otherwise your CPU will cry... blood. Further, I haven't seen him mention whether the SpecialK update uses all threads the way Forspoken does or whether it is currently thread-limited (Forspoken was able to use all 32 threads of my CPU, which is a pretty rare feat, as most professional applications can't do this, much less games, due to thread-limited workload optimizations). The better the CPU, the better the results, depending on how well his update works now; I haven't tested it myself. (I only looked into it after seeing a comment online that used SpecialK to check whether the game was using CPU or GPU decompression, then went to see if the creator had similar suspicions about DirectStorage; he did, with preliminary investigations.)

Last, you're still ignoring what I've pointed out about the max settings, for some reason, when this is a crucial detail: it single-handedly invalidates the complaints you are presenting.

As a very direct example, Wukong exposes Cinematic as its maximum settings tier. However, no other UE5 game (as far as I know) exposes this setting, maxing out at Epic. Why? Because Cinematic is normally hidden, used only for cutscenes (if enabled in a game at all). The developers exposed this hidden, demanding tier for Wukong, and you can actually force it in other games too. However, expect lower performance and negligible image quality gains, because, as I've stated, image quality versus performance scales non-linearly with stark diminishing returns as quality rises. This is why many recommend the tier just below Cinematic for Wukong, and for some settings going one step below even that is almost as good with a further performance uplift.

Another classic example, very obvious in games and one I also mentioned earlier, is the popular recommendation to decrease shadow quality, which tends to carry a noticeable performance hit at its max setting; reduced by as much as two levels down to normal (in many games), it looks nearly as good at a much smaller performance cost.

In fact, we've also seen indirect evidence of this and of the state of the game's optimization: people running the game near the lower end of the minimum spec requirements can still run it fairly well, as it scales rather well across configurations, bar bugs/driver issues/etc., which are not optimization problems.

Now for a direct example from Final Fantasy XVI, itself, using high quality screencaps with a slider to 1:1 compare image quality at Low / Medium / High / Ultra https://www.techpowerup.com/review/final-fantasy-xvi-fps-performance-benchmark/4.html

Notice anything? Perhaps that the impact isn't as staggering as you might have expected? The biggest differences tend to be the aforementioned shadows and some finer details, yet the performance difference can be quite significant. Simply put, this game's Ultra setting is only meant for particularly powerful computers and next-gen hardware if you refuse to use upscaling/frame generation, and frankly there is NO viable excuse not to use them in modern games with how good they've gotten, bar some exceptions.

One exception is latency, but most games aren't latency-sensitive enough to warrant skipping them, and many that are can be improved by simply reaching a high enough frame rate and/or using Nvidia Reflex / AMD Anti-Lag. The very few remaining cases can simply skip Frame Generation but still use upscaling, and, if necessary, aim for the performance or ultra performance upscaling setting to reduce latency dramatically below native. The other reason not to use them is the rare case where a game implements them poorly, or a driver bug causes notable visual degradation; FSR suffers from this more often than the others (but not always). This game also had some hiccups for some users but not all (unclear why at present: sometimes just DLSS misbehaved where FSR worked, others reported blur with all three upscalers, and some of us had no issues at all), and Square Enix has shipped fixes for this (not sure whether the issue persists for anyone since then).

The short of it is that most issues are user-end problems: either refusing to use settings they should because of an inadequate understanding of what they're doing and what results to realistically expect... or bugs/other factors (game-related, or possibly on their end, like driver version/Windows update/Resizable BAR/etc.)... and, last, the impact of DirectStorage, which is very difficult to profile for anyone but the developer and is still a very new technology that may not be where it needs to be with regards to GPU decompression (hard to say, again, for several reasons).
UnholyDentist Oct 6, 2024 @ 1:42pm 
Originally posted by Xengre:
Sorry to break it to you but that isn't right. [full post quoted above]
Nice comparisons and explanation, if a bit too long. To be honest I'm not much of a graphics zoomer, but I admit the way some games have looked in the past couple of years is so great that simply looking at the screen becomes entertaining. In the case of FFXVI, I am in awe of how good the characters and the environment look. Some cutscenes I would like to watch as a movie.
StingingVelvet Oct 6, 2024 @ 2:52pm 
Originally posted by Xengre:
No, it isn't 40 FPS because it is pushing the RTX 4090 to the max. Rather, it is hitting a CPU bottleneck first, so the full power of the RTX 4090 isn't even being used: it typically sits at 76-88% GPU usage (rarely spiking to 90-98% for a brief split second) in Lostwing at native 4K max settings.

It's not consistently maxing your GPU because of an optimization issue; it's not a CPU limitation in Lostwing giving you 40fps. Lower your resolution enough and you'll get 120fps native in Lostwing.
Xengre Oct 6, 2024 @ 8:50pm 
Originally posted by StingingVelvet:
Originally posted by Xengre:
No, it isn't 40 FPS because it is pushing the RTX 4090 to the max. Rather, it is hitting a CPU bottleneck first, so the full power of the RTX 4090 isn't even being used: it typically sits at 76-88% GPU usage (rarely spiking to 90-98% for a brief split second) in Lostwing at native 4K max settings.

It's not consistently maxing your GPU because of an optimization issue; it's not a CPU limitation in Lostwing giving you 40fps. Lower your resolution enough and you'll get 120fps native in Lostwing.
No, you will not. I know because I already tested that. It is clear you did not.

In fact, I have the video from when I tested it, but I'm too lazy to encode, log in, and upload another. I can tell you the results for Lostwing at 1080p with DLSS Ultra Performance (frame generation off), which means the internal resolution is actually even lower than 1080p.

I got a grand total of 84-91 FPS most of the time, with slight dips below and above on occasion.

My RTX 4090 was still only at 52-60% usage because my CPU (Ryzen 9 3950X) capped out, again, at the exact same utilization as before, approximately ~50%, with slight dips below and occasionally a little above (far less frequent). Notice this is slightly worse than my 4K frame generation results? That is because of what I mentioned to you before: frame generation helps minimize the impact of CPU bottlenecks in the process of improving FPS, as one of its goals.

This is because there are limits to how much you can thread game code in most engines. It's why most games don't make significant use of more than 4-6 threads nowadays, with the remaining threads often idle or lightly loaded because the work can't be threaded efficiently. Occasionally a more advanced engine like Frostbite can push 12 threads thanks to especially advanced physics threading and optimizations, but this is generally rare.

Even most video editing programs cannot use more than 8 threads, typically.

The only way I would push to 120 FPS in Lostwing is with a more powerful CPU, preferably one focused on stronger individual-thread performance and a reduced core/thread count rather than my older 16-core/32-thread CPU, which this game, and every other game aside from Forspoken (literally only Forspoken, due to its CPU decompression via DirectStorage), does not even half use (nor even a third, really).
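
That thread-scaling ceiling is essentially Amdahl's law in practice. A quick illustration, where the 40% parallel fraction is an assumed number for demonstration, not a measurement:

#include <cstdio>

// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
// fraction of the frame that parallelizes. With p = 0.4 (illustrative),
// 32 threads barely beat 6, which is why a 16-core/32-thread CPU can
// sit half idle in games.
int main() {
    const double p = 0.4;
    for (int n : {1, 2, 4, 6, 8, 16, 32}) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%2d threads -> %.2fx\n", n, speedup);
    }
}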
Last edited by Xengre; Oct 6, 2024 @ 9:48pm
LL Cheese Z Oct 6, 2024 @ 9:43pm 
The irony of people saying "fix your pc" likely being those who know nothing about PC hardware and optimization.

The game is ♥♥♥♥ for optimization, accept it and stop defending Square. Acting like FF16 is the only demanding game PC players have ever played, FOH.
Shadowlet Oct 6, 2024 @ 9:59pm 
Originally posted by LL Cheese Z:
The irony of people saying "fix your pc" likely being those who know nothing about PC hardware and optimization.

The game is ♥♥♥♥ for optimization, accept it and stop defending Square. Acting like FF16 is the only demanding game PC players have ever played, FOH.

Did you try downloading more VRAM?