This topic has been locked
RTX 4090 is not for Native 4K gaming with the latest game engine
With a huge price tag of around $3,000 (if you're lucky, $2,000 plus tax, roughly $2,400), the card is only able to barely touch 4K 60 FPS on the latest game engine, Unreal Engine 5.1. So is it safe to assume ALL GPUs are designed to run games at 1440p, and that we should all be okay paying $3,000 just to be satisfied at 1080p and 1440p? Not too long ago, I remember a GTX 1080 Ti was able to run games at 4K 60; yes, that was with games from that generation, but shouldn't a GPU released now, if it's the top-end GPU of the current generation, be able to run all games at at least 4K 120 FPS on the latest game engine?

Sure, if you use older-generation engines it does reach those frame rates, but what about future games using the latest engine? Are we expected to pay $3,000 every year for the next GPU?

Here is the benchmark; keep in mind I am only referring to NATIVE 4K, not DLSS: https://www.youtube.com/watch?v=dr7LpP7Dm4E


Update, January 27, 2023: the newly released Forspoken runs at 43 FPS at native 4K. It hasn't even been a year since the 4090's release, and this is its performance: https://www.youtube.com/watch?v=U0u9l4Wkh9s

Update, February 7, 2023

Hogwarts Legacy, newly released as of this date, doesn't even get 60 FPS: https://www.youtube.com/watch?v=5dKUpcMckBg

Let's make sure to blame every single game developer from all the different companies and pretend the RTX 4090 is actually a 4K card, and that developers from every studio and every company are doing everything wrong. It's DEFINITELY NOT NVIDIA publicly pretending and lying about the RTX 4090 being a true "native 4K" GPU.

Let's all keep pretending.
Last edited by 8========~D; Feb 7, 2023 @ 3:16pm
I think it's fair to point out that using a particular game (or games) isn't a good broad marker for claims like this. The software world is widely variable. Many extremes exist.

For example, I would dare anyone to load up a modern version of Java Minecraft (1.18 or newer), try the maximum render distance of 32 (or, if you're feeling brave and using OptiFine or similar, 64), and tell me it never drops frames in demanding locations or situations. Not happening. I don't care if you put a 13900KS overclocked to 10 GHz in that situation. Does that mean CPUs aren't "good enough"? Not really. It just means that for Minecraft specifically, and in that circumstance, you can't attain that.

And there's always the option of, you know... turning a slider down. Most sliders are just editing numbers in a config file as far as I know, and just because they expose a maximum value doesn't mean the hardware HAS to be able to run it. In any game (if the developer allows it), you can probably push settings high enough to bring performance down on any hardware. That literally doesn't mean anything.
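To illustrate the point that a graphics slider usually boils down to a plain number in a settings file, here is a minimal Python sketch. The file name and keys are made up purely for illustration and not taken from any specific game (Unreal Engine titles, for instance, keep comparable values in their own .ini files):

```python
# Minimal sketch: a graphics "slider" is usually just a number in a config file.
# The file name and keys below are hypothetical, purely for illustration.
from configparser import ConfigParser

SETTINGS_FILE = "game_settings.ini"  # hypothetical path

def write_settings(render_distance: int, shadow_quality: int) -> None:
    cfg = ConfigParser()
    cfg["Graphics"] = {
        "RenderDistance": str(render_distance),  # e.g. chunks / view-distance units
        "ShadowQuality": str(shadow_quality),    # e.g. 0 = low ... 3 = epic
    }
    with open(SETTINGS_FILE, "w") as f:
        cfg.write(f)

def read_settings() -> dict:
    cfg = ConfigParser()
    cfg.read(SETTINGS_FILE)
    return dict(cfg["Graphics"])

if __name__ == "__main__":
    # The in-game slider may cap at 32, but nothing stops a config edit from going past it.
    write_settings(render_distance=64, shadow_quality=3)
    print(read_settings())
```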

I understand that with GPUs a new outlier might be taken as a sign of what future games might do. But the thing is, we never really got to the point where native 4K performance was comfortably exceeded, so of course when a jump in game demands comes, it won't be enough. 4K, ray tracing, and a new game engine possibly being demanding, even on the currently most powerful hardware? That's not too surprising to me.
Last I checked, a 4080 was $1,600-1,800, and you're asking why you need a better card as time goes on? Maybe PC is not for you. Maybe at some point we'll go backwards, but I wouldn't count on it.
Last edited by Guydodge; Jan 21, 2023 @ 4:30pm
Originally posted by Guydodge:
Last I checked, a 4080 was $1,600-1,800, and you're asking why you need a better card as time goes on? Maybe PC is not for you. Maybe at some point we'll go backwards, but I wouldn't count on it.
I typed 4090, NOT 4080, and you're quoting the price before tax as if you live in a country with no tax, which you end up paying anyway. And the 4090 doesn't go for $1,600-1,800; it's a $2,400 card that can't handle Unreal Engine 5.1 at 4K 120 FPS, end of story. You can keep lying, though, even though the proof is posted.
Originally posted by Komarimaru:
Originally posted by 8==========D:
I expect a PC built now with the most expensive hardware to hold up for at least 2-3 years at the highest settings and highest resolution, with optimal high FPS on a high-refresh-rate monitor.
Hate to break this to you, but that's never been the case. It actually used to be worse, with new tech coming out every one to two years.
So I should spend $6,000-8,000 every year to play any game at bare-minimum performance at high settings, and I should absolutely NOT expect it to run all games at high settings and FPS for at least 2-3 years at that price point?
Originally posted by Zefar:
Originally posted by 8==========D:
It can't run it at 4K 120 on Unreal Engine 5 regardless … top-of-the-line specs

No, it can't run Fortnite at 4K with max RTX settings.

There are other Unreal Engine games that are far less demanding and will be able to run at 4K 120 without any problems.

Just because Fortnite can't be run at absolute max settings in 4K doesn't mean it's the same for every other game out there.
That's kind of what I'm trying to point out. Yes, that particular game is "demanding", BUT the question is: if there's any card on the planet that CAN run a demanding game like that with those settings, shouldn't it be the card that costs the most on the retail consumer market? Like, should I NOT expect my top-of-the-line GPU and CPU to be able to handle a "demanding" game?
Originally posted by Hardʬare Hero:
"RTX 4090 is not for Native 4K gaming with the latest game engine"

I actually wasn't aware of this at first; then I later saw footage of the RTX 4090 just tanking when native resolution is enabled, and I was unpleasantly surprised that DLSS still needs to be used.

Not only that, but new versions of DLSS are being developed and implemented, telling you that native 4K is a long way off. We will always rely on clever AI upscaling and never have native sharpness.

For this price tag the resolution should be native. This is PC, not console. A console is weaker and has a use for "fake" 4K. This is PC! We want real 4K! We have access to powerful hardware that goes beyond consoles, so it's time to ditch the console upscaling.

This is just a huge marketing scam trying to get the public to buy into their AI upscaling when really the hardware they release isn't good enough.
That's exactly the point. I'm ready to spend, but I expect the most expensive consumer GPU and CPU, with the most expensive RAM and motherboard, to handle a demanding game at NATIVE 4K 120 FPS, and it still doesn't do that... What am I paying for then? 1440p? Fun fact: an RTX 2070 Super does that already...
Originally posted by 8==========D:
Originally posted by Komarimaru:
Hate to break this to you, but that's never been the case. It actually used to be worse, with new tech coming out every one to two years.
So I should spend $6,000-8,000 every year to play any game at bare-minimum performance at high settings, and I should absolutely NOT expect it to run all games at high settings and FPS for at least 2-3 years at that price point?
Or, just don't max every setting if you can't afford the hobby of doing so. High-end PC gaming is a very, very expensive hobby. Add to that the fact that they are running Lumen, which is Unreal 5's version of ray tracing, ALONG with normal ray tracing. It's called showing off what the engine can do, nothing more. A tech show.

And the fact that a 4090 is able to mostly power through it just shows how powerful a 4090 truly is.

You think that's bad? How about AMD's press release for the 7900 XTX, bragging about the advantages of the new DisplayPort and how it works for 8K. Yet the card struggles at 4K and embarrassingly sucked at 8K.
Originally posted by Komarimaru:
Originally posted by 8==========D:
So I should spend $6,000-8,000 every year to play any game at bare-minimum performance at high settings, and I should absolutely NOT expect it to run all games at high settings and FPS for at least 2-3 years at that price point?
Or, just don't max every setting if you can't afford the hobby of doing so. High-end PC gaming is a very, very expensive hobby. Add to that the fact that they are running Lumen, which is Unreal 5's version of ray tracing, ALONG with normal ray tracing. It's called showing off what the engine can do, nothing more. A tech show.

And the fact that a 4090 is able to mostly power through it just shows how powerful a 4090 truly is.

You think that's bad? How about AMD's press release for the 7900 XTX, bragging about the advantages of the new DisplayPort and how it works for 8K. Yet the card struggles at 4K and embarrassingly sucked at 8K.
If I'm going out to buy top-of-the-line everything, literally everything, I expect it to run at at least 4K 120 FPS on the latest game engine.
Originally posted by 8==========D:
Originally posted by Komarimaru:
Or, just don't max every setting if you can't afford the hobby of doing so. High-end PC gaming is a very, very expensive hobby. Add to that the fact that they are running Lumen, which is Unreal 5's version of ray tracing, ALONG with normal ray tracing. It's called showing off what the engine can do, nothing more. A tech show.

And the fact that a 4090 is able to mostly power through it just shows how powerful a 4090 truly is.

You think that's bad? How about AMD's press release for the 7900 XTX, bragging about the advantages of the new DisplayPort and how it works for 8K. Yet the card struggles at 4K and embarrassingly sucked at 8K.
If I'm going out to buy top-of-the-line everything, literally everything, I expect it to run at at least 4K 120 FPS on the latest game engine.
Welcome to PC gaming. There will always be a setting that cripples performance. MSAA was one that killed FPS before, dropping performance to below a quarter when running 8x MSAA.

If you don't like that, I'd suggest again: don't max settings. Or just don't get into the hobby.
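As a rough illustration of why something like 8x MSAA used to hurt so much: a multisampled color buffer stores several samples per pixel, so storage and resolve work scale with the sample count. A back-of-the-envelope sketch, assuming a plain RGBA8 color target (real renderers use more buffers and formats, so treat the numbers as illustrative):

```python
# Back-of-the-envelope memory cost of MSAA color buffers at 4K.
# Assumes a simple 4-bytes-per-pixel (RGBA8) color target; real renderers
# use additional buffers and formats, so this is only an illustration.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_SAMPLE = 4  # RGBA8

def color_buffer_mib(samples: int) -> float:
    return WIDTH * HEIGHT * BYTES_PER_SAMPLE * samples / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA color buffer: {color_buffer_mib(samples):.0f} MiB")
# 1x -> ~32 MiB, 8x -> ~253 MiB: storage and resolve work grow 8x,
# which is part of why high MSAA levels used to tank frame rates.
```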
Originally posted by 8==========D:
That's kind of what I'm trying to point out. Yes, that particular game is "demanding", BUT the question is: if there's any card on the planet that CAN run a demanding game like that with those settings, shouldn't it be the card that costs the most on the retail consumer market? Like, should I NOT expect my top-of-the-line GPU and CPU to be able to handle a "demanding" game?

Again, not every game is the same. Just because one game is demanding doesn't mean other games are. Some games handle open worlds a lot better than others do.

There could be other open-world games that can handle RTX at 4K without a problem.

Fortnite isn't the be-all and end-all of tests. It could very well be unoptimized.
As I've noticed, Unreal Engine 5 was only released early last year, so you can expect glitches and bugs in the game engine until it gets further updates.
Originally posted by Komarimaru:
Or, just don't max every setting...
Originally posted by Komarimaru:
Welcome to PC gaming. There will always be a setting that cripples performance.
This, so many times.

I said it a couple of times already, but you don't HAVE to max everything (it's often silly to). They are settings for a reason. The fact that you can find or create a situation where performance drops below whatever predefined target you're using doesn't mean anything. Settings are just values for a given thing: "draw this thing this far", "use this texture", "light this using this method", and so on. That's all the settings define, and in many games you can even set values above what the in-game options expose. The fact that the topmost option can bring performance way down (often for diminishing visual returns) means nothing more than that it isn't worth it with the limited hardware power we have right now, which isn't necessarily the same as "things aren't good enough". Maybe in a way it is, since the point is to keep progressing, but that takes time and is slowing down. We don't have infinite performance, and limits will be reached. This is the reality, so we have to live with it.

And the fact that the most expensive or fastest card still can't do something is hardly surprising. "Fastest" is a relative term and nothing more; it still has a finite level of performance. In recent times, resolution and other increased demands combined have simply grown at a rate that outpaced performance gains, which is why things like upscaling and "fake frames" are getting a lot of development and press time. So pointing to an example where the latest game engine CAN bring performance below 120 FPS at 4K is... not surprising to me... at all. One extreme scenario doesn't speak to the capabilities of the entire thing.
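To put the resolution side of that in plain numbers, here is a quick sketch of how per-frame pixel counts compare (pure arithmetic, nothing game- or engine-specific):

```python
# Pixel counts per frame at common resolutions (pure arithmetic).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x 1080p)")
# 4K is about 4x the pixels of 1080p and 2.25x the pixels of 1440p, and that is
# before ray tracing or a heavier engine adds per-pixel work on top.
```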
Last edited by Illusion of Progress; Jan 22, 2023 @ 10:49am
The RTX 4090 has DLSS 3.0.

At 4K resolution, you can max out most of the settings, enable ray tracing and HDR, in games that support DLSS 3.0 (including Fortnite), and get around 120 FPS.

Without DLSS 3.0, it still runs at higher than 60 FPS.

I don't think that person knows how to pick the best settings for his graphics card.

https://www.youtube.com/watch?v=i5WuzatxKzM

PS: That video is showing the RTX 3080, not the RTX 4090 (which gets far higher), so yeah... (just not with ray tracing enabled)

https://www.youtube.com/watch?v=MjvCnDNh8Wc

This is probably a better video of the performance, with everything maxed out (including ray tracing enabled) on an RTX 4090. It averages around 89 FPS. With optimized game settings, it would be considerably more.

You can get better graphics with better performance due to AI frame generation.

If you consider real-time AI frame generation "fake", then disable real-time ray tracing and use the actual "fake" prebaked shader lighting, which you ironically don't consider faked in games... you will still get double the FPS performance of older generations.
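To make the frame-generation claim concrete, here is a rough sketch of rendered versus presented frame rates with DLSS 3-style frame generation, under the simplifying assumption that roughly one generated frame is inserted per rendered frame; the numbers are illustrative, not measurements:

```python
# Rough model of DLSS 3-style frame generation: about one generated frame is
# inserted between each pair of rendered frames, so the presented frame rate
# is roughly double the rendered rate. Illustrative numbers only.
def presented_fps(rendered_fps: float, generation_enabled: bool) -> float:
    return rendered_fps * 2 if generation_enabled else rendered_fps

for rendered in (45, 60, 89):
    print(f"rendered {rendered} fps -> presented ~{presented_fps(rendered, True):.0f} fps")
```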
Last edited by Azza ☠; Jan 22, 2023 @ 11:21am
Hardware and software develop independently of each other. Also, the floor is made of floor.
Originally posted by Azza ☠:
If you consider real-time AI frame generation "fake", then disable real-time ray tracing and use the actual "fake" prebaked shader lighting, which you ironically don't consider faked in games...
Hold on here...

Path-traced lighting versus "pre-configured" lighting via shaders and shadow maps have different hardware requirements for their different visual results. It's a balancing act. Most people, if everything else were equal, would probably take the better visuals, but in some situations (namely, for performance reasons), many will be "fine" with no ray tracing methods.

DLSS 3.0 (not DLSS broadly, but DLSS 3 specifically) and similar features have aspects that warrant criticism, IMO. Namely, to my understanding, it can have some issues with input when the "actual" frame rates are relatively low. But regardless, not everyone is a fan, and that's fine.

Also, you don't even need DLSS 3 to use path tracing/RTX anyway, so I'm not sure why you're saying it's only one or the other. Sure, DLSS 3 obviously brings higher frame rates, but that doesn't make it mandatory to use for RTX.

If you're trying to posit that being okay with no path tracing at all but not being okay with DLSS 3 (and similar) is a hypocritical position, it's not.
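To put that input concern in rough numbers: responsiveness roughly follows the rendered frame interval rather than the presented one. A simplified sketch, assuming an idealized 2x presented rate and ignoring the extra latency frame generation itself adds:

```python
# Simplified frame-time arithmetic: presented FPS can double with frame
# generation, but the rendered frame interval (which input responsiveness
# roughly follows) does not shrink. A simplified model, not a measurement.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for rendered_fps in (30, 60, 120):
    presented = rendered_fps * 2  # assume ~2x with frame generation
    print(f"rendered {rendered_fps} fps ({frame_time_ms(rendered_fps):.1f} ms/frame) "
          f"-> presented ~{presented} fps, input still paced at "
          f"~{frame_time_ms(rendered_fps):.1f} ms")
```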
Last edited by Illusion of Progress; Jan 22, 2023 @ 12:17pm