The developers should do everything they can on their end first, so that everyone can profit from it and not just people with the GPU needed for this kind of stuff.
While DLSS3 adds input lag, NVIDIA has an anti-lag system (Reflex) to completely remove the additional input lag. Basically you will have many more frames, but the same amount of input lag as with DLSS Frame Generation OFF.
So you will have a MUCH smoother transition between frames, improving the visual experience by A LOT, but input lag will not be improved. It will not be increased either, though.
You need screenshots to notice any artifacts, and they're not common to begin with.
DLSS Frame Generation can improve the experience.
Without FG you generate and show A1, then generate and show B1. With FG you show A1, generate B1, fake-generate and show A2, and only then show B1. You are one real frame behind, and that's a theoretical minimum assuming no vsync and no processing cost at all for the frame generation. With vsync enabled the latency can turn out to be much, much worse if your base fps is lower than the vsync frequency, and it gets worse the further the fps drops below that threshold.
So no, 30fps is a very, very bad idea for frame generation if you care about latency, which you may not, but the OP seemingly does. With a 60Hz monitor you want to be very close to or above 60fps before frame generation. That's it, assuming vsync. If you disable it then the latency won't be as bad (as stated before, only around one frame behind under ideal conditions) but you'll have tearing, which is also worse the lower the framerate.
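If you want the rough numbers behind that, here's a back-of-the-envelope sketch (Python, using my own simplified "hold one real frame back" model, not NVIDIA's actual pipeline; the worst-case vsync handling is an assumption on my part):

    # Rough arithmetic for the "one real frame behind" argument above.
    # Simplified model: frame generation holds one real frame back, and with
    # vsync on (worst case) the held frame also waits for the next refresh slot.

    def added_latency_ms(base_fps, vsync_hz=None):
        held = 1000.0 / base_fps               # one base frame interval held back
        if vsync_hz is None:
            return held
        return held + 1000.0 / vsync_hz        # plus (worst case) one refresh slot

    for fps in (30, 60, 120):
        print(f"{fps} fps base: "
              f"{added_latency_ms(fps):.1f} ms added (no vsync), "
              f"{added_latency_ms(fps, 60):.1f} ms added (60 Hz vsync, worst case)")

At 30fps base that's over 30 ms extra before vsync even enters the picture, which is why you want to be near 60fps first.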
Regarding graphical artifacts, they happen all the time with frame generation. It's just that at high framerates they're usually barely noticeable, but because there are always going to be artifacts, the lower the framerate the more obvious they're going to be. With a low base fps the difference between frames is larger, so the artifacts are likely to be more serious, AND THEN they stay on screen for longer because each frame stays on screen for longer.
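Quick arithmetic on the "stays for longer" part (just frame display times, assuming output fps is roughly double the base fps with frame generation on):

    # Display time of one generated frame = 1 / output fps,
    # with output fps ~ 2x base fps when frame generation is on.
    for base_fps in (30, 60, 120):
        output_fps = 2 * base_fps
        print(f"{base_fps} fps base -> each frame on screen ~{1000.0 / output_fps:.1f} ms")
    # 30 fps base: ~16.7 ms per frame, four times longer than at 120 fps base
    # (~4.2 ms), so any artifact lingers four times longer too.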
To be fair, this not being a fast-paced game should help, but it's still a 3D game where you can quickly manipulate the camera orientation (vs a fixed isometric view, for example).
Some of the things you said apply only to FSR3, which is a ♥♥♥♥♥♥ copycat of DLSS3.
I'm 100% sure you don't have an RTX 4000. Everything you said just doesn't apply to DLSS Frame Generation.
Clearly you're talking about things you've only read (the talk about artifacts rings a bell for me, but not only that). And just like with DLSS2 at the beginning, 99% of what's on the internet is pure bs from people hating NVIDIA without any reason to do it.
60FPS rendered by the engine > 60FPS with DLSS3 >>>>>>>>>>>> 30FPS rendered by the engine (unlike ♥♥♥♥♥♥ FSR3, which is just terrible, as if we were playing at 30FPS, and needs at least 60FPS rendered by the engine to start not looking like ♥♥♥♥).
At least in Alan Wake 2, the only game I've tested DLSS3 on so far (I still haven't had time to test others).
More importantly: Cities Skylines is not like other games. Input lag is ABSOLUTELY irrelevant in this game. So YES, DLSS3 will help it feel smoother, which is the only thing that matters here.
No, I'm talking about DLSS3 with 4XXX tech:
https://www.youtube.com/watch?v=92ZqYaPXxas&t=1832s
And that's not the only analysis where you can find that kind of thing.
It may be irrelevant for you but it isn't for the OP.
I already explained why there's virtually no input lag added with DLSS3.
Again, you don't have an RTX 4000, and that's clear to me. You're just like all the guys copying what they heard about DLSS2 at launch. 99% bs from people who hate NVIDIA just because they're wokes. I'm not saying you are, but you're talking about something you certainly never used.
You are calculating logic here, not graphics. And logic cannot "somehow" double the framerate, since it's your processor which is maxed out, not your GPU.
ONLY with the GPU maxed out can this technology double the fps that way. But with a CPU load you cannot magically double your fps. I mean: how would that even work?
If that worked, you could also double your fps in Windows 11 and get higher performance than Windows 10 that way. But this is reality, and that's not how it works.
So what this means is: You can only use this tech for shooters or other graphics-intensive games.
But this game is maxing out your processor for the logic, not maxing out your GPU for graphics. I mean, this game doesn't have such good graphics to begin with.
I doubt any patch will ever get this game to run fast enough.
No. You haven't explained why; you just said that it doesn't and mentioned some vague magic NVIDIA tech, while I explained why: because you have to render at the very least one frame ahead in order to interpolate. You cannot get around that.
You haven't explained anything. I have said why, and I posted a very respectable source about latency on a 4XXX NVIDIA card with Reflex enabled supporting my claim.
???? Man, you're making up a movie in your head about I don't know what. This is fairly simple: the OP cares about latency, and frame generation on a 4xxx NVIDIA card with Reflex enabled does add input lag, AT THE VERY LEAST one frame under ideal conditions, and it can be much more in a real scenario when vsync is enabled.
First, CS2 is EXTREMELY graphics-heavy, even if that's due to very bad optimization.
Second, this is about taking two frames and then generating one in-between, with info from the game engine about motion vectors for better results. And all the work is done by the GPU (if that's what you're referring to), so in principle it can work for both GPU- and CPU-bound games. The game logic is irrelevant here.
The GPU doesn't have to be maxed out; it can work for CPU-limited games too. And it's not "magic". It does what many TVs have done for many, many years, which is interpolating frames, but with a more advanced method for much better results.
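To illustrate what I mean by "more advanced", here's a toy sketch (numpy, nothing like the real DLSS/FSR3 pipelines, and interpolate_midframe is purely my own made-up illustration): instead of blindly blending two frames like a TV does, you move each pixel halfway along the motion vector the engine gives you.

    # Toy motion-compensated interpolation: forward-warp the previous frame
    # halfway along per-pixel motion vectors supplied by the game engine.
    # Holes left where nothing lands (disocclusions) are exactly the kind of
    # artifact discussed above; real implementations work hard to fill them.
    import numpy as np

    def interpolate_midframe(prev_frame, motion_vectors):
        # prev_frame: (H, W, 3) image, motion_vectors: (H, W, 2) pixel offsets
        # (dx, dy) from the previous frame to the next real frame.
        h, w, _ = prev_frame.shape
        mid = np.zeros_like(prev_frame)
        ys, xs = np.mgrid[0:h, 0:w]
        tx = np.clip(xs + motion_vectors[..., 0] * 0.5, 0, w - 1).astype(int)
        ty = np.clip(ys + motion_vectors[..., 1] * 0.5, 0, h - 1).astype(int)
        mid[ty, tx] = prev_frame[ys, xs]   # every pixel moves half of the way
        return mid

    # Tiny usage example: a 4x4 "image" where everything moves 2 pixels right.
    frame_a = np.random.rand(4, 4, 3)
    mv = np.zeros((4, 4, 2))
    mv[..., 0] = 2.0
    frame_mid = interpolate_midframe(frame_a, mv)

A driver-level "dumb" interpolator has to guess that motion from the images alone (or just blend them), which is why per-game motion vector support matters so much.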
You can use it for any compatible game.
Good graphics or not, this game is extremely GPU-demanding. Frame generation sounds like a good idea, and it might be, but the technology has downsides. You need high frame rates to begin with for good results.
That's your problem. Your GPU isn't being maxed out in this game, since you cannot max it out if your CPU cannot keep up with it.
And no, you cannot simply use this tech in all games, since it makes no difference when you don't get more performance from it, because the game isn't suitable for it. You don't understand how this works. You can ONLY use this when your GPU handles (like you said) HIGH framerates. So you knew about this?
But did you ever wonder why that is?
Because high FPS means high latency on the GPU, but LOW latency on the CPU.
With low fps you also have low latency on the GPU, but this will get you HIGHER latency on the CPU.
Quite interesting, but that's just how it works on PC.
If you were to run this game at 8K (theoretically), it would maybe run quite a bit faster because your CPU latency would get lower. But the problem is: you don't have enough RAM/GPU RAM to render at such high resolutions. Thus it's not possible.
You have to run this game at what resolution? 1080p?
And that's the problem with GPUs:
the LOWER your resolution is, the BIGGER your latency is.
And this is exactly why you cannot use it here. Since it's not magic. It's just a trick.
Thus I said you cannot use it. It won't give you a single frame more.
Unity wants high-end hardware. And AMD/Intel isn't what's meant by that.
And this game here isn't the first to run this poorly on that type of hardware. World War Z has the exact same problem: trying to run that many zombies at once results in an fps disaster...
It's you who very clearly doesn't understand. If the game supports it then sure, frame generation can be used. And sure, it would make a big difference. The issue is the added latency and graphical artifacts, but it will/would run much smoother.
NVIDIA's frame generation doesn't care whether the game is CPU- or GPU-limited, or about the game logic. It just needs the frame data, plus motion vector data to optimize the frame generation, and that vector data requires support from the game. That's the main difference from "dumb" motion interpolation methods.
AMD has two different versions. One of them is similar to NVIDIA's (slightly inferior, but with better hardware compatibility), and then there's another at the driver level ("dumb") which doesn't require any explicit game support, but it's nowhere near as good as the other two because it doesn't use vector data to generate the interpolated frames.
You can use frame generation whether you have low or high framerates to begin with; it works. It's just that the downsides are much more evident when you start with low fps and it may not be worth it, but it should work. And if you don't care about the latency then you'll definitely be able to use it.
What the hell are you talking about?
You definitely don't have a clue.