So why would I want to enable a setting for an AMD card when my Nvidia card functions perfectly fine? Especially since an Nvidia card, to my knowledge, has no way to use AMD's protocols?
thx dude, now i have a stable 60fps on high settings on my potato 2070
FOR RTX CARDS:
FRAME GENERATION:
If it says "Turn On", frame generation is actually active.
If it says "Turn Off", it is actually off.
The wording is reversed.
The game does not yet have VRR support, meaning you can get screen tearing with Frame Gen. Frame Gen disables Vsync, so you need to force Fast Sync in the Nvidia Control Panel.
If you try to force FSR over DLSS frame gen, you will not be able to use DLSS presets in the Nvidia app. Preset K enhances visuals and increases performance without the need for frame gen on decent cards, but to use it you MUST use the game's own options and DLSS frame gen. If you force the override, it will reset your preset.
FOR NON-RTX CARDS:
Yes, you can try forcing FSR, but it can lead to some load issues, so be careful and make sure your saves are backed up regularly (see the sketch below for one way to do that).
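Since forcing FSR here is experimental, it's worth automating those save backups rather than doing them by hand. Below is a minimal sketch (not from the guide) in Python that copies the save folder into a timestamped backup folder; SAVE_DIR and BACKUP_DIR are placeholder paths and will differ per game and platform, so adjust them before use.

[code]
# Rough sketch: copy the save folder to a timestamped backup before experimenting.
# SAVE_DIR and BACKUP_DIR are placeholders -- point them at your actual save location.
import shutil
from datetime import datetime
from pathlib import Path

SAVE_DIR = Path.home() / "Saved Games" / "ExampleGame"    # hypothetical save location
BACKUP_DIR = Path.home() / "SaveBackups" / "ExampleGame"  # hypothetical backup location

def backup_saves() -> Path:
    """Copy the whole save folder into a new timestamped subfolder and return its path."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = BACKUP_DIR / stamp
    shutil.copytree(SAVE_DIR, target)
    return target

if __name__ == "__main__":
    print(f"Backed up saves to {backup_saves()}")
[/code]

Run it (or schedule it) before each session where you're forcing FSR, so a corrupted save only costs you one session.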
I'm genuinely surprised a GTX card can do *ANY* frame gen, that's crazy. I'm somewhat ignorant on this, so you probably know more than I do.
I thought only the RTX cards (20 series and above) had any kind of frame generation architecture built in? Does that mean FSR is fully software based? If you have a non-Nvidia, non-AMD card, would FSR work with those graphics cards as well?
Also, how are you getting ultra quality? My buddy has a 1660-Super, and a Ryzen 5100 (I think). He would probably be pretty interested in this game, but I didn't think he would be able to play it at all, or at least *well*, with his setup. Are you using the default Ultra preset, or is it customized?
*Edit because me dumb, can't read, and needed to reword my questions lol
This is why the 16-series cards still hold up well for most games to this day (especially for those who are adept at building custom/overclocked systems, as the 16-series is very well suited to overclocking). Also, its 20-series-based architecture (and associated driver support) allows limited support for certain 'RTX-only' features such as raytracing and frame generation; as I understand it, that support is a kind of software-emulated version which won't work on actual GTX cards, which strictly speaking the 16-series is not.
As for why I'm able to use 'ultra' quality (for all but DoF and motion blur, which I always have off because a. I don't really like either as an 'effect', and b. they eat GPU cycles for no visual benefit), it's partially because of the above: the 16-series is basically a budget 20-series and can be tuned/overclocked to provide comparable framerates using traditional rendering. Secondly, I'm also utilising dynamic resolution, which reduces the 'rendered' screen size as required when the scene changes drastically; along with FSR this reduces the quality of the scene when moving (effectively providing 'fake motion blur' for free in the process), but when the scene is static (95% of the time) it resolves to a clear and acceptable image.
Finally, I started gaming when CGA/EGA graphics were the 'current tech' - and coding when 'full 3D at 640x400, 30FPS' was still a pipe dream for most PC owners - so I'm not put off by an 'imperfect image' or 'low FPS' the way modern gamers are. Without FSR I get a solid 35+FPS at 1080p adaptive (25+FPS non-adaptive) and that's within the minimum required for 'acceptable' framerates of scenes containing motion. Modern audiences are mostly either too young to remember, and/or so spoilt by modern 'hardware-based' graphics tech, that they forget that sometimes 'good enough to play' requires a lot less 'modern' than they think.
DLSSValue=4 (1=UltraPerformance 2=Performance 3=balanced 4=Quality)
FSR3FI=True (Frame Interpolation/Generation)
However, when accessing the 3D printing or custom furniture features, the FPS drops to 1-5 (on an RTX 3080), making them unplayable.
With FSR3FI off and only DLSS enabled, FPS drops from a stable 60 to about 30 in those screens; with FSR3FI on, it drops from a stable 120 FPS all the way to 1-5 FPS.
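If the slideshow only happens while FSR3FI is active, one workaround to test is flipping that flag off before using the 3D printing / furniture screens and back on afterwards. Here's a small, hypothetical sketch in Python that rewrites a key=value line in a config file; the file name and path are placeholders, since the post doesn't say where DLSSValue and FSR3FI actually live, so point it at the real config before trying it.

[code]
# Minimal sketch: flip a key=value flag (e.g. FSR3FI) in a config file.
# CONFIG_PATH is a placeholder -- find where the game actually stores these settings.
from pathlib import Path

CONFIG_PATH = Path("C:/Games/ExampleGame/Config/GraphicsSettings.ini")  # hypothetical

def set_flag(path: Path, key: str, value: str) -> None:
    """Rewrite any 'key=...' line to 'key=value', leaving the rest of the file untouched."""
    lines = path.read_text(encoding="utf-8").splitlines()
    updated = [
        f"{key}={value}" if line.strip().startswith(f"{key}=") else line
        for line in lines
    ]
    path.write_text("\n".join(updated) + "\n", encoding="utf-8")

if __name__ == "__main__":
    set_flag(CONFIG_PATH, "FSR3FI", "False")  # disable frame interpolation temporarily
[/code]

Remember to set it back to True (or just restore a backup of the config) once you're done with those screens.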
Wow, I had no idea that there was a 16-series, much less that they were pseudo-RTX cards. I personally skipped over a lot of that; I went from a 1070 Ti laptop card straight to a 3050. Funnily enough, the RTX 3050 and the GTX 1660-Super have roughly similar specs, which always confused the hell out of me. I guess shame on me for not doing any research, huh?
At any rate, I worked with my buddy based on your performance settings, and we got the game to run at medium settings quite well. He hated the dynamic resolution stuff, but we managed to get about 35-45fps by turning down some settings and getting FSR working. It's not perfect, his loading times are gigantic and there's a noticeable stutter, but he seems pretty happy for all that. At least until he can afford to upgrade his rig lol.
Yeah, I'm sort of the same way with frames, I know people that say stuff like "Oh, it's so much better at 144 fps!" and all I can do is shake my head. Do I prefer 60fps? Of course. Can I tell the difference between 25, 60, and 120 fps? Nope, not usually. I can sometimes tell when there is something causing very high computational lag, like bugging physics, or if a game has to load something big from storage. My thing is, if I can watch a movie in the theaters (which are 24 point something fps), and not be able to tell the difference from a game "running" at 1,000+ fps, then either I'm broken or the marketing hype is out of control.
That said, I am certainly spoiled rotten because of modern hardware. Thinking back to trying to play Skyrim on a Vista-era Lenovo ThinkPad... The things I used to put up with that I would scream and cry about now lmao. Minimum graphics, and getting like 5-10 fps, laptop fans screaming in pain, you get the picture. I would ♥♥♥♥ bricks if that was the best I could get these days lol.