inZOI
Pat Apr 7 @ 12:51am
Frame Generation
ez trick to activate "Frame Generation" on any Nvidia card.

On Nvidia video cards, there is currently no in-game option to activate AMD's upscaling or frame generation, but by changing two lines in the game's .ini it is possible to enable it.

===================================================================
To activate: go to "C:\Users\(YourUser)\AppData\Local\BlueClient\Saved\Config\Windows", open "GameUserSettings.ini", then search for and change the lines so they look exactly like these:

FSR3Value=4 (1=UltraPerformance 2=Performance 3=Balanced 4=Quality 5=UltraQuality)
FSR3FI=True (Frame Interpolation/Generation)
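
For reference, the parentheses above are just a legend of the available values, not part of the file; in GameUserSettings.ini itself the two lines should simply read:

FSR3Value=4
FSR3FI=True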

It works on any Nvidia video card.
===================================================================

UPDATE: as some people reported in the comments, you can also enable FSR Frame Gen together with DLSS. I did a test and, in addition to the better image quality, I gained about 10 more frames. See which of the two options works better for you.

For this, the lines should look like:
DLSSValue=4 (1=UltraPerformance 2=Performance 3=Balanced 4=Quality)
FSR3FI=True (Frame Interpolation/Generation)

(Remember to set FSR3Value=0 if you are going to use this version, and also make sure that DLSSG is False: "DLSSG=False".)
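
Putting the update together, the relevant block of GameUserSettings.ini should end up looking something like this (only these four keys change; everything else in the file stays as-is):

DLSSValue=4
DLSSG=False
FSR3Value=0
FSR3FI=True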
Last edited by Pat; Apr 7 @ 12:51am
But... Nvidia already has frame generation? That's what DLSS is? Or is it the other Nvidia-only option, I forget which is which right now. One of the in-game Nvidia options does super-sampling, rendering a lower-resolution frame and then generating a full-resolution frame from it, and the other takes the previously rendered frames and generates 1 to 3 extra frames in between, depending on your specific Nvidia card (DLSS 3 or 4).

So why would I want to enable a setting for an AMD card when my Nvidia card functions perfectly fine? Especially since an Nvidia card, to my knowledge, has no way to use AMD's protocols?
Lovyan Apr 7 @ 2:07am 
Originally posted by detheagle666:
So why would I want to enable a setting for an AMD card when my Nvidia card functions perfectly fine? Especially since an Nvidia card, to my knowledge, has no way to use AMD's protocols?
For most people, you are correct, the option is of no real use whatsoever. However, there are a few cards (for example, my GTX 1660-Super getting 55+FPS on ultra at 1080p) which don't have DLSS but can use FSR to good effect.
best
thx dude, now I have 60fps stable on high settings on my potato 2070
FRAME GENERATION works fine in the settings menu with RTX cards. You do not need to edit anything. Forcing FSR through the ini will degrade the image quality.


FOR RTX CARDS:

FRAME GENERATION:
IF it says "Turn On", that means it is currently active.
If it says "Turn Off", that means it is currently off.

Their wording seems reversed.

The game does not yet have VRR support, meaning you can get screen tearing with Frame Gen. Frame Gen also deactivates Vsync, so you need to force Fast Vsync in the Nvidia Control Panel.

IF you try to force FSR over DLSS frame gen, you will not be able to use DLSS Presets in the Nvidia app. Preset K enhances visuals as well as increases performance without the need for frame gen on decent cards, but to use it you MUST USE the game's options and DLSS frame gen. IF you force the override, it will reset your PRESET.


FOR NON-RTX CARDS:
Yes, you can try forcing FSR through the ini, but it can lead to some load issues, so be careful and make sure your saves are regularly backed up.

Last edited by Houstonheart; Apr 7 @ 5:22am
Originally posted by Lovyan:
Originally posted by detheagle666:
So why would I want to enable a setting for an AMD card when my Nvidia card functions perfectly fine? Especially since an Nvidia card, to my knowledge, has no way to use AMD's protocols?
For most people, you are correct, the option is of no real use whatsoever. However, there are a few cards (for example, my GTX 1660-Super getting 55+FPS on ultra at 1080p) which don't have DLSS but can use FSR to good effect.

I'm genuinely surprised a GTX card can do *ANY* frame gen, that's crazy. I'm somewhat ignorant on this, so you probably know more than I do on this.

I thought only the RTX cards (20 series and above) had any kind of frame generation architecture built in? Does that mean FSR is fully software based? If you have a non-Nvidia, non-AMD card, would FSR work with those graphics cards as well?

Also, how are you getting ultra quality? My buddy has a 1660-Super, and a Ryzen 5100 (I think). He would probably be pretty interested in this game, but I didn't think he would be able to play it at all, or at least *well*, with his setup. Are you using the default Ultra preset, or is it customized?

*Edit because me dumb, can't read, and needed to reword my questions lol
Last edited by detheagle666; Apr 8 @ 3:23pm
Lovyan Apr 12 @ 3:18am 
Originally posted by detheagle666:
I'm genuinely surprised a GTX card can do *ANY* frame gen, that's crazy. I'm somewhat ignorant on this, so you probably know more than I do on this.

I thought only the RTX cards (20 series and above) had any kind of frame generation architecture built in? Does that mean FSR is fully software based? If you have a non-Nvidia, non-AMD card, would FSR work with those graphics cards as well?

Also, how are you getting ultra quality? My buddy has a 1660-Super, and a Ryzen 5100 (I think). He would probably be pretty interested in this game, but I didn't think he would be able to play it at all, or at least *well*, with his setup. Are you using the default Ultra preset, or is it customized?
Basically, the 16-series cards were released as the 'bridge' series between the GTX and RTX lines. As such, 16-series cards were built with 20-series baseline architecture, but don't have the RT or Tensor cores/circuitry of the RTX line (required for hardware support for raytracing and DLSS/FG/AI respectively). This allowed them to approach the raw power of the 20-series cards with respect to 'traditional' triangle-based rendering without leaving a gap too large for 'budget builders' to eventually catch up.

This is why the 16-series cards still hold up well for most games to this day (especially for those who are adept at building custom/overclocked systems, as the 16-series is very well suited to overclocking). Also, their 20-series-based architecture (and associated driver support) allows limited support of certain 'RTX-only' features such as raytracing and frame generation (which I understand is kind of like a 'software-emulation version' that won't work on actual GTX cards, which strictly speaking the 16-series is not).

As for why I'm able to use 'ultra' quality (for all but DoF and motion blur, which I always have off because a. I don't really like either as an 'effect', and b. they eat GPU cycles for no visual benefit): it's partially because of the above, since the 16-series is basically a budget 20-series and can be tuned/overclocked to provide comparable framerates using traditional rendering. Secondly, I'm also utilising dynamic resolution, which reduces the 'rendered' screen size as required when the scene changes drastically; along with FSR this reduces the quality of the scene when moving (effectively providing 'fake motion blur' for free in the process), but when the scene is static (95% of the time) it resolves to a clear and acceptable image.

Finally, I started gaming when CGA/EGA graphics were the 'current tech' - and coding when 'full 3D at 640x400, 30FPS' was still a pipe dream for most PC owners - so I'm not put off by an 'imperfect image' or 'low FPS' the way modern gamers are. Without FSR I get a solid 35+FPS at 1080p adaptive (25+FPS non-adaptive) and that's within the minimum required for 'acceptable' framerates of scenes containing motion. Modern audiences are mostly either too young to remember, and/or so spoilt by modern 'hardware-based' graphics tech, that they forget that sometimes 'good enough to play' requires a lot less 'modern' than they think.
Last edited by Lovyan; Apr 12 @ 3:22am
I am using:

DLSSValue=4 (1=UltraPerformance 2=Performance 3=balanced 4=Quality)
FSR3FI=True (Frame Interpolation/Generation)

However, when accessing 3D printing or custom furniture features, the FPS drops to 1–5 (with an RTX 3080), making it unplayable.
Lovyan Apr 12 @ 3:51am 
Originally posted by Juan de Dios:
I am using:

DLSSValue=4 (1=UltraPerformance 2=Performance 3=balanced 4=Quality)
FSR3FI=True (Frame Interpolation/Generation)

However, when accessing 3D printing or custom furniture features, the FPS drops to 1–5 (with an RTX 3080), making it unplayable.
That's because the '3D printer/generative-AI texture' feature uses the same hardware GPU cores as DLSS/frame gen on true RTX cards; of course the FPS is going to tank when the AI is hogging the exact same pipeline used by the graphics output.
I'm not referring to FPS dropping when the AI is being used — that would be somewhat normal. I'm referring to the fact that when you open the interface for the '3D printer' or 'custom furniture' (before even using the AI), the FPS drops to 1–5 FPS.
With FSR3FI off and only DLSS enabled, FPS drops from a stable 60 to 30... but with FSR3FI on, FPS drops from a stable 120 fps to 1-5 fps.
Last edited by Juan de Dios; Apr 12 @ 4:50am
Lovyan Apr 12 @ 5:07am 
Originally posted by Juan de Dios:
I'm not referring to FPS dropping when the AI is being used — that would be somewhat normal. I'm referring to the fact that when you open the interface for the '3D printer' or 'custom furniture' (before even using the AI), the FPS drops to 1–5 FPS.
With FSR3FI off and only DLSS enabled, FPS drops from a stable 60 to 30... but with FSR3FI on, FPS drops from a stable 120 fps to 1-5 fps.
Ok, that does seem odd; if you haven't activated the AI, there shouldn't be any effect on framerate. As a coder, my only guess is that the UI coder(s) messed up somewhere and pre-started the AI thread pool before there are any AI jobs to work on (eating up GPU cycles literally for no good reason). Welcome to early access, I guess. :lunar2019deadpanpig:
Originally posted by Lovyan:
Basically, the 16-series cards were released as ... (etc)

Wow, I had no idea that there was a 16-series, much less that they were pseudo-RTX cards. I personally skipped over a lot of that, I went from a 1070ti laptop card straight to a 3050. Funnily enough, the RTX 3050 and the GTX 1660-Super have roughly similar specs, which always confused the hell out of me. I guess shame on me for not doing any research, huh?

At any rate, I worked with my buddy based on your performance settings, and we got the game to run at medium settings quite well. He hated the dynamic resolution stuff, but we managed to get about 35-45fps by turning down some settings and getting FSR working. It's not perfect, his loading times are gigantic and there's a noticeable stutter, but he seems pretty happy for all that. At least until he can afford to upgrade his rig lol.

Yeah, I'm sort of the same way with frames, I know people that say stuff like "Oh, it's so much better at 144 fps!" and all I can do is shake my head. Do I prefer 60fps? Of course. Can I tell the difference between 25, 60, and 120 fps? Nope, not usually. I can sometimes tell when there is something causing very high computational lag, like bugging physics, or if a game has to load something big from storage. My thing is, if I can watch a movie in the theaters (which are 24 point something fps), and not be able to tell the difference from a game "running" at 1,000+ fps, then either I'm broken or the marketing hype is out of control.

That said, I am certainly spoiled rotten because of modern hardware. Thinking back to trying to play Skyrim on a Vista-era Lenovo ThinkPad... The things I used to put up with that I would scream and cry about now lmao. Minimum graphics, and getting like 5-10 fps, laptop fans screaming in pain, you get the picture. I would ♥♥♥♥ bricks if that was the best I could get these days lol.