For example, I would dare anyone to load up a modern version of Java Minecraft (1.18 or newer), try the maximum render distance of 32 (or, if you're feeling brave and using OptiFine or similar, 64), and tell me it never drops frames in demanding locations or situations. Not happening. I don't care if you put a 13900KS overclocked to 10 GHz in that situation. Does that mean CPUs aren't "good enough"? Not really. It just means that for Minecraft specifically, and in that circumstance, you can't attain that.
And there's always the option of, you know... turning a slider down. Most sliders just edit a number in a config file as far as I know, and just because they expose up to some maximum value doesn't mean the hardware HAS to be able to handle it. In any game (if the developer allowed it), you can probably push things high enough to bring performance down on any hardware. It literally doesn't mean anything.
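To make that concrete, here's a minimal sketch of what that usually boils down to: the slider is just a number in a text file. It assumes Minecraft Java's options.txt and its renderDistance key at the default Linux location (the path differs on Windows/macOS), and vanilla may clamp values past the slider's cap, which is why mods like OptiFine come into it at all.

```python
# Sketch: bump the render distance past the in-game slider by editing the
# config file the slider writes to. Assumes a default Linux install path
# and the renderDistance key used by Minecraft Java's options.txt.
from pathlib import Path

OPTIONS = Path.home() / ".minecraft" / "options.txt"  # differs per OS
NEW_DISTANCE = 48  # chunks; the vanilla slider tops out at 32

lines = OPTIONS.read_text().splitlines()
patched = [
    f"renderDistance:{NEW_DISTANCE}" if line.startswith("renderDistance:") else line
    for line in lines
]
OPTIONS.write_text("\n".join(patched) + "\n")
```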
I understand that with GPUs, a new outlier might be taken as a sign of what future games might do. But the thing is, we never really got to the point where native 4K performance had a big surplus, so of course when a jump in game demands comes, it won't be enough. 4K, ray tracing, and a new game engine being demanding, even on the most powerful current hardware? That's not too surprising to me.
And the fact that a 4090 is able to mostly power through it just shows how powerful the 4090 truly is.
You think that's bad? How about AMD's press release for the 7900 XTX, bragging about the advantages of the new DisplayPort and how it works for 8K. Yet the card struggles with 4K and embarrassingly sucked at 8K.
If you don't like that, I'd suggest, again: don't max out the settings. Or just don't get into the hobby.
Again, not every game is the same. Just because one game is demanding doesn't mean other games are. Some games handle open worlds a lot better than others do.
There could be other open-world games that can handle RTX at 4K without a problem.
Fortnite isn't the be-all and end-all of all tests. It could very well be unoptimized.
I said it a couple of times already, but you don't HAVE to max everything (it's often silly to). They're settings for a reason. The fact that you can find or create a situation where performance can be brought below whatever predefined value you're using doesn't mean anything.

Settings just set values for a given thing. "Draw this thing this far." "Use this texture." "Light this using this method." Etc. That's all the settings define, and in many games you can set values above what the in-game options expose. The fact that the topmost option can bring performance way down (often for diminishing visual returns) means nothing more than that it's not worth it with the limited hardware power we have right now, and that's not necessarily the same as "things aren't good enough."

I mean, maybe in a way it is; after all, the attempt is to progress, but that takes time and is slowing down. We don't have infinite performance, and limits will be reached. That's the reality, so we have to live with it.
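As a purely hypothetical illustration (the names and numbers below are made up, not taken from any real game), a quality preset is just a named shortcut for a raw value, and a hand-edited config can hold a value past the highest preset the menu exposes:

```python
# Hypothetical example: presets are named shortcuts for raw values. The
# engine only sees the resolved number, so a config override can exceed
# anything the in-game menu offers (at whatever performance cost).
DRAW_DISTANCE_PRESETS = {"low": 1_000, "medium": 2_500, "high": 5_000, "epic": 10_000}  # made-up units

def resolve_draw_distance(setting):
    """Accept a named preset from the menu or a raw value from a config file."""
    if isinstance(setting, int):
        return setting  # hand-edited override
    return DRAW_DISTANCE_PRESETS[setting]

print(resolve_draw_distance("epic"))  # 10000 -- the top of the slider
print(resolve_draw_distance(40_000))  # 40000 -- beyond anything the UI exposes
```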
And complaining that even the most expensive or fastest card can't do something is hardly surprising. "Fastest" is a relative term and nothing more; it still has a finite level of performance. In recent times, resolution and other increased demands combined have simply grown at a rate that outpaced performance gains. That's why things like upscaling and "fake frames" are getting a lot of development and press time. So pointing to an example where the latest game engine CAN bring performance below 120 FPS at 4K is... not surprising to me... at all. One extreme scenario doesn't speak to the capabilities of the whole card.
At 4K resolution, you can max out most of the settings, enable ray tracing and HDR, and in DLSS 3.0-supported games (including Fortnite) get around 120 FPS.
Without DLSS 3.0, it still runs at higher than 60 FPS.
I don't think that person knows how to best use the settings for his graphics card.
https://www.youtube.com/watch?v=i5WuzatxKzM
PS: That video shows the RTX 3080, not the RTX 4090 (which gets much higher frame rates), so yeah... (just not with ray tracing enabled).
https://www.youtube.com/watch?v=MjvCnDNh8Wc
This is probably a better video of the performance, with everything maxed out (including ray tracing enabled) on an RTX 4090. It averages around 89 FPS. With optimized game settings, it would be way higher.
You can get better graphics with better performance, due to AI frame generation.
If you consider real-time AI frame generation "fake", then disable real-time ray tracing and use the actually "fake" pre-baked shader lighting, which you ironically don't consider faked in games... you will still get double the FPS of older generations.
Path-traced lighting and "pre-configured" lighting via shaders with shadow masks have different hardware requirements and different visual results. It's a balancing act. Most people, all else being equal, would probably take the better visuals, but in some situations (namely, for performance reasons), many will be "fine" without ray tracing methods.
DLSS 3.0 (not DLSS broadly, but DLSS 3 specifically) and similar features have aspects that warrant criticism, IMO. Namely, to my understanding, it can have some issues with input latency when the "actual" frame rate is relatively low. But regardless, not everyone is a fan, and that's fine.
Also, you don't even need DLSS 3 to use path tracing/RTX anyway, so I'm not sure why you're saying it's only one or the other. Sure, DLSS 3 obviously brings higher frame rates, but that doesn't make it mandatory for RTX.
If you're trying to posit that being okay with no path tracing at all but not being okay with DLSS 3 (and similar) is a hypocritical position, it's not.