This topic has been locked
RTX 4090 is not for Native 4K gaming with the latest game engine
With a huge price tag of $3,000 (if you're lucky, $2,000, or roughly $2,400 with tax), the card is only barely able to touch 4K 60 FPS on the latest game engine, Unreal Engine 5.1. So is it safe to assume ALL GPUs are designed to run games at 1440p, and that we should all be OK paying $3,000 just to be satisfied at 1080p and 1440p? It was not too long ago that a GTX 1080 Ti could run games at 4K 60. Yes, that was with games from its generation, but shouldn't a GPU released now, if it's the top-end GPU of the current generation, be able to run all games at at least 4K 120 FPS on the latest game engine?

Sure, if you use an older-generation engine it does reach that FPS, but what about future games using the latest engine? Are we expected to pay $3,000 every year for the next GPU?

Here is the benchmark, and keep in mind I am only referring to NATIVE 4K, not DLSS: https://www.youtube.com/watch?v=dr7LpP7Dm4E


Update, January 27, 2023: The newly released game Forspoken runs at 43 FPS at native 4K. It hasn't even been a year since the 4090's release, and this is its performance: https://www.youtube.com/watch?v=U0u9l4Wkh9s

Update, February 7, 2023

The newly released game as of this date, Hogwarts Legacy, doesn't even get 60 FPS: https://www.youtube.com/watch?v=5dKUpcMckBg

Let's make sure to blame every single game developer from all the different companies and pretend the RTX 4090 actually is a 4K card, and that the developers from all the studios and all the companies are doing everything wrong. It's DEFINITELY NOT NVIDIA pretending and lying publicly about the RTX 4090 being a true "Native 4K" GPU.

Let's all keep pretending.
Last edited by 8========~D; 7 Feb 2023 @ 3:16pm
Showing 91-105 of 616 comments
SNAKE 23 Jan 2023 @ 2:10pm 
thanks for that
Azza ☠ 23 Jan 2023 @ 2:19pm 
Originally posted by 8==========D:
Originally posted by Azza ☠:

They are complaining about the AI frame-generation technology that is used to run 4K at 120Hz smoothly, without any stutter or issues, even with ray tracing enabled, and it does it all in real-time.

Some would rather stick to the faked, old-school shadows and shaders than real path and ray tracing. They should probably stay away from 4K resolution for the moment, too.

ps: For CS:GO, I would likely get 560 FPS up to 1000+ FPS on the RTX 4090 at 4K resolution (if not capping the FPS to a max limit)... Rainbow Six Siege on Ultra Quality 4K gets between 336 minimum and an average of 383 FPS. Why you would want to play crappy old games like that at 4K, I have no idea.
Don't take that person's comment seriously; my current system already generates 400+ FPS in CS:GO at 1440p. My concern was never CS:GO, that game doesn't need anything to run at high FPS. My real concern is ARK 2, which is supposed to come out running on Unreal Engine 5, and after watching RTX 4090 gameplay of Fortnite on Unreal Engine 5, I'm convinced this new generation of build will NOT get me 120 FPS at native 4K, despite me going for that exact build. No one builds a PC for one game. Ark is something I struggle to play at a good high FPS with high settings, and that was my main goal. The reality is that with a new game engine, even high-end systems can NOT run the game at native 4K 120 FPS.

Yeah alright.

I will tell you what I know...

RTX 4090 is babe
Unreal 5.1 Engine is babe
Quixel Ultra-Realistic Megascans is babe

https://www.youtube.com/watch?v=U7FupEUw5q0

https://www.youtube.com/watch?v=Wj48Yqnm0ZM

https://www.youtube.com/watch?v=ca2ME4Wy0eM

DLSS 3 gives you the support needed to run 4K + HDR + Ray Tracing extremely smoothly and I have yet to find a problem or graphical glitch with it.

I don't have any issues with the RTX 4090 being able to drive 50-60 FPS Native and 90-120 FPS or higher with DLSS 3, when the graphics quality is all that eye candy.

Plus, the development of all those games would also be much faster, or could be given the time to be well polished.

For Ark 2...

https://store.steampowered.com/app/2050420/ARK_2/

Notice how the video trailer says "real-time footage" *hint* *hint*

You will have photorealistic quality at 4K with real-time ambient lighting and shadows. Do you want to have your cake and eat it too?
Last edited by Azza ☠; 23 Jan 2023 @ 2:33pm
8========~D 23 Jan 2023 @ 2:48pm 
Originally posted by Azza ☠:
Originally posted by 8==========D:
Don't take that person's comment seriously; my current system already generates 400+ FPS in CS:GO at 1440p. My concern was never CS:GO, that game doesn't need anything to run at high FPS. My real concern is ARK 2, which is supposed to come out running on Unreal Engine 5, and after watching RTX 4090 gameplay of Fortnite on Unreal Engine 5, I'm convinced this new generation of build will NOT get me 120 FPS at native 4K, despite me going for that exact build. No one builds a PC for one game. Ark is something I struggle to play at a good high FPS with high settings, and that was my main goal. The reality is that with a new game engine, even high-end systems can NOT run the game at native 4K 120 FPS.

Yeah alright.

I will tell you what I know...

RTX 4090 is babe
Unreal 5.1 Engine is babe
Quixel Ultra-Realistic Megascans is babe

https://www.youtube.com/watch?v=U7FupEUw5q0

https://www.youtube.com/watch?v=Wj48Yqnm0ZM

https://www.youtube.com/watch?v=ca2ME4Wy0eM

DLSS 3 gives you the support needed to run 4K + HDR + Ray Tracing extremely smoothly and I have yet to find a problem or graphical glitch with it.

I don't have any issues with the RTX 4090 being able to drive 50-60 FPS Native and 90-120 FPS or higher with DLSS 3, when the graphics quality is all that eye candy.

Plus, the development of all those games would also be much faster, or could be given the time to be well polished.

For Ark 2...

https://store.steampowered.com/app/2050420/ARK_2/

Notice how the video trailer says "real-time footage" *hint* *hint*

You will have photorealistic quality at 4K with real-time ambient lighting and shadows. Do you want to have your cake and eat it too?
The blank, empty biome with no anti-aliasing, no creatures spawning, and no other people or bases, plus the water and grass quality, is always good for showing off the card and pretending that's what it will be like in ARK 2. The reality is 50 FPS if you're lucky and there's no one else on the server. If you have people, particles, and creature spawns all at once while people's bases pop up, it drops to like 1 or 2 FPS at 4K. I'm buying a card for $2,500 that is legitimately designed for 1440p while "pretending" it is actually designed for 4K. That's the end of the story. I have no problem with the card. I have a problem with manufacturers lying publicly. That's a 1440p card at optimal performance.
Not all games are made the same, render the same, or are optimized the same; games use various rendering APIs, from DX11/12 to OpenGL to Vulkan, which can make minor or major differences in FPS.

I don't know a ton about Unreal Engine 5.1, and it wouldn't be the first time another dev or studio did a better job than the creator of a game engine. But if you're blaming the engine, you might want to think more about what could cause FPS to be lower or higher after changes, and what exactly makes a game more or less hardware-demanding. For all you know, the devs could have added effects or changes that push FPS up or down, and they might not have fully optimized the game yet either; studios sometimes swing back to optimization later and improve hardware utilization and efficiency, greatly or slightly.


Now, just to point out: if your goal is to always have the latest bleeding-edge tech, it's a given you're burning money every time something greater comes around, because you "wanted the best" or whatever the reason.

Ray tracing is super hardware-intensive and tanks your FPS pretty hard; that's where DLSS/FSR comes in to help boost FPS and mitigate how demanding RT is. If you want the best-looking game possible on new, hardware-intensive engines, think about what to dial back to reach your target FPS, as you may have to compromise somewhere.

Also, I don't believe they ever targeted only Unreal Engine 5.1, so it's kind of a moot point to expect the card to do something only for that engine when there are other versions and other engines. To put it nicely: don't get your hopes up. When we start pushing for more graphical demands, it's common sense that newer and more powerful GPUs are the only way to meet them. Yes, this is an endless loop: wait for a more powerful GPU to release for X reasons, or want it for Y reasons, then Z happens and you're back at X or Y. Just look at what has been going on for the last 30 years and you'll see what I mean.
Originally posted by Azza ☠:
Some would rather stick to the faked, old-school shadows and shaders than real path and ray tracing. They should probably stay away from 4K resolution for the moment, too.
I don't understand why you're so vested in telling people they're wrong for having a preference you don't share.

Let's be real for a moment. The only "real" lighting coming from our displays is the backlight itself. The rest is just a frame composed of a given color arrangement based on how something processed and told it to arrange those colors. So can we just stop with the pretentious "real" stuff? All it is, is different methods to render something that imitates light in a way that behaves MORE realistically. That's it. It's progress. And that's great, but there's ALWAYS been something heralded as "real time". First it was Doom with "real time lighting". Then I recall Crysis making claims it was doing something, some sort of lighting or shadows (I forget), in "real time". Now ray tracing is "real time", and if you don't like the frame-generation tech it sometimes needs as a crutch, you're wrong because you accept "some fake stuff but not others"? Give me a break.

I'm not saying ray tracing isn't better. Nor am I saying it's wrong to enjoy ray tracing (that's silly). I'm not saying it's wrong to appreciate DLSS 3 either. I'm just saying, for some reason, you seem to have a VERY vested interest in telling people they should like it the way you do, to the extent that you're making strawman arguments to posit contradictions that aren't there, and it's dishonest. You don't even need to do that to justify your appreciation for the things you like. Just... naturally appreciate them. I'm guessing a lot of this comes from the fact that you apparently have an RTX 4090 and enjoy it. Well, enjoy it (not meant sarcastically, but genuinely). I'm sure both ray tracing and maybe even DLSS 3 will bring good things to many. I'm just not sure why you feel the need to push this. Just enjoy it and let others do things their way. Or more simply, "live and let live".
Last edited by Illusion of Progress; 23 Jan 2023 @ 3:36pm
Azza ☠ 23 Jan 2023 @ 3:48pm 
Originally posted by Illusion of Progress:
Originally posted by Azza ☠:
Some would rather stick to the faked, old-school shadows and shaders than real path and ray tracing. They should probably stay away from 4K resolution for the moment, too.
I don't understand why you're so vested in telling people they're wrong for having a preference you don't share.

Let's be real for a moment. The only "real" lighting coming from our displays is the backlight itself. The rest is just a frame composed of a given color arrangement based on how something processed and told it to arrange those colors. So can we just stop with the pretentious "real" stuff? All it is, is different methods to render something that imitates light in a way that behaves MORE realistically. That's it. It's progress. And that's great, but there's ALWAYS been something heralded as "real time". First it was Doom with "real time lighting". Then I recall Crysis making claims it was doing something, some sort of lighting or shadows (I forget), in "real time". Now ray tracing is "real time", and if you don't like the frame-generation tech it sometimes needs as a crutch, you're wrong because you accept "some fake stuff but not others"? Give me a break.

I'm not saying ray tracing isn't better. Nor am I saying it's wrong to enjoy ray tracing (that's silly). I'm not saying it's wrong to appreciate DLSS 3 either. I'm just saying, for some reason, you seem to have a VERY vested interest in telling people they should like it the way you do, to the extent that you're making strawman arguments to posit contradictions that aren't there, and it's dishonest. You don't even need to do that to justify your appreciation for the things you like. Just... naturally appreciate them. I'm guessing a lot of this comes from the fact that you apparently have an RTX 4090 and enjoy it. Well, enjoy it (not meant sarcastically, but genuinely). I'm sure both ray tracing and maybe even DLSS 3 will bring good things to many. I'm just not sure why you feel the need to push this. Just enjoy it and let others do things their way. Or more simply, "live and let live".

I'm not vested in that, rather more in the game development side and what's required to be able to juice something like that. Honestly, I want to push the card towards developers more than just gamers. Gamers can choose to stick at the 1440p sweet spot, upgrading later as desired. 4K is just the new norm to come for next-gen PC gaming, and there's a clear path to get there and speed up that development too. DLSS 3 is like a free performance upgrade for both new and old games that wish to patch it in, doubling the FPS without quality loss; rather, a quality gain in most cases.

There's a lot more involved, which people seem to either take for granted or have never even seen and discard.

For example: 4K resolution is four times the pixels of 1080p resolution, so you can expect it to take at least four times the graphics card juice to drive it. Previously you would have needed multiple graphics cards (SLI) just to drive it. Throw ray tracing and everything else on top of that, and in the past you would have expected the FPS to drop accordingly, likely to unplayable levels.
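The pixel math above is easy to verify; here's a quick sketch (plain Python, not from the thread, with a hypothetical `pixel_count` helper):

```python
# 4K UHD (3840x2160) vs. 1080p (1920x1080): 4K has exactly four times the pixels.
def pixel_count(width: int, height: int) -> int:
    """Total pixels for a given resolution."""
    return width * height

uhd_4k = pixel_count(3840, 2160)   # 8,294,400 pixels
full_hd = pixel_count(1920, 1080)  # 2,073,600 pixels
print(uhd_4k / full_hd)            # 4.0
```

Whether the GPU cost scales exactly linearly with pixel count depends on the workload, but the 4x pixel figure itself is exact.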

Now in Doom Eternal at 4K on Ultra settings, for example, the average rate for the RTX 4090 is estimated at a whopping 400 FPS. Optimized games run extremely sweet if on the latest Nvidia drivers.

It will all depend on what games you play and whether settings are maxed. The RTX 4090 has an average of 127 FPS, while the RTX 3090 Ti had an average of 88 FPS. That's roughly a 44% leap in raw performance. DLSS 3 is just something extra on top of that which doubles it. I don't think I've played a game yet that goes below 45 FPS.
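For what it's worth, the relative uplift implied by the two average-FPS figures quoted above works out as follows (plain Python; the figures are taken from the post, not independently verified):

```python
# Relative performance uplift between the two quoted average-FPS figures.
rtx_4090_avg_fps = 127
rtx_3090_ti_avg_fps = 88

uplift = (rtx_4090_avg_fps - rtx_3090_ti_avg_fps) / rtx_3090_ti_avg_fps
print(f"{uplift:.0%}")  # prints "44%"
```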

Even without DLSS, the RTX 4090 just about manages to deliver native 4K ultra ray-tracing performance of 60 FPS or more. It's smooth as butter if you have a 4K monitor with HDR and a G-SYNC module. If you are going to fork out for a high-end graphics card, make sure the monitor is one too.
Last edited by Azza ☠; 23 Jan 2023 @ 4:02pm
A&A 23 Jan 2023 @ 4:18pm 
Originally posted by 8==========D:
Originally posted by 8==========D:
Your question implies no games besides Fortnite use Unreal Engine 5.1, so "how many games use Unreal Engine 5.1?" is indeed a statement.
Also, you asked "CURRENTLY USING UE 5.1", so are you saying I should only worry about what's "CURRENT" with a $3,000 GPU, rather than expect it to run future titles with new engines at decent FPS, when it can't even handle the latest engine at 60 FPS?
My laptop with a Ryzen 5 5500U and Vega 6 doesn't have a problem running Mount and Blade: Warband at max settings, 4K, at 40 FPS, so does that mean I can run Cyberpunk at 4K, max settings, and still have 40 FPS? :D

Also, it doesn't matter which engine you are using, because engines are just tools; with them you can render graphic elements that run at 900 FPS or 3 FPS.
billboarded grass vs. geometry shader grass
Last edited by A&A; 23 Jan 2023 @ 4:22pm
Tiberius 23 Jan 2023 @ 5:11pm 
Originally posted by Azza ☠:
Originally posted by Tiberius:

I still don't. Which PvP game can't the RTX 4090 run at 120 FPS at 4K? Using an experimental feature like Lumen isn't really an indication of what a GPU is or isn't capable of.

Besides, the OP only plays CS:GO. He's acting like that game requires a NASA CPU or something.

They are complaining about the AI frame-generation technology that is used to run 4K at 120Hz smoothly, without any stutter or issues, even with ray tracing enabled, and it does it all in real-time, calling it fake and discarding the concept.

Some would rather stick to the faked, old-school shadows and shaders than real path and ray tracing. They should probably stay away from 4K resolution for the moment, too.

ps: For CS:GO, I would likely get 560 FPS up to 1000+ FPS on the RTX 4090 at 4K resolution (if not capping the FPS to a max limit)... Rainbow Six Siege on Ultra Quality 4K gets between 336 minimum and an average of 383 FPS. Why you would want to play crappy old games like that at 4K, I have no idea. Perhaps some people are just still stuck in the past, not wanting to progress or at least have a stepping stone on the way to a massive leap in progress?

It's a problem with ray tracing more than with frame generation, because no one is really using frame generation when the RT setting is off. The OP has no idea how demanding RT is (or Lumen, which is just less accurate RT with more options). It used to take hours to render ray tracing, but now we can do it in real time.

And tbh, no one should use RT in a PvP game. You basically impair your vision with darker shadows and no fake ambient lighting.
8========~D 23 Jan 2023 @ 5:13pm 
Originally posted by A&A ✠:
Originally posted by 8==========D:
Also, you asked "CURRENTLY USING UE 5.1", so are you saying I should only worry about what's "CURRENT" with a $3,000 GPU, rather than expect it to run future titles with new engines at decent FPS, when it can't even handle the latest engine at 60 FPS?
My laptop with a Ryzen 5 5500U and Vega 6 doesn't have a problem running Mount and Blade: Warband at max settings, 4K, at 40 FPS, so does that mean I can run Cyberpunk at 4K, max settings, and still have 40 FPS? :D

Also, it doesn't matter which engine you are using, because engines are just tools; with them you can render graphic elements that run at 900 FPS or 3 FPS.
billboarded grass vs. geometry shader grass
Seems like you missed the point. The point is that the newest game engines and an RTX 4090 with the latest CPU do NOT equal 4K 120 FPS at the highest settings. Sure, if you go back years and use old game engines it'll get 4K 120, but upcoming games are using NEW GAME ENGINES, which guarantees you not even 60 FPS...
Komarimaru 23 Jan 2023 @ 5:43pm 
Originally posted by 8==========D:
Originally posted by A&A ✠:
My laptop with a Ryzen 5 5500U and Vega 6 doesn't have a problem running Mount and Blade: Warband at max settings, 4K, at 40 FPS, so does that mean I can run Cyberpunk at 4K, max settings, and still have 40 FPS? :D

Also, it doesn't matter which engine you are using, because engines are just tools; with them you can render graphic elements that run at 900 FPS or 3 FPS.
billboarded grass vs. geometry shader grass
Seems like you missed the point. The point is that the newest game engines and an RTX 4090 with the latest CPU do NOT equal 4K 120 FPS at the highest settings. Sure, if you go back years and use old game engines it'll get 4K 120, but upcoming games are using NEW GAME ENGINES, which guarantees you not even 60 FPS...
You finally answered your own question. There's your answer. Technology progresses, engines offer more demanding options, hardware needs to rush to keep up.

Now you can quit crying.
Revelene 23 Jan 2023 @ 5:45pm 
Software and hardware are not co-developed. Software always pushes the boundaries of hardware.

4K 120FPS max settings is your own pipe dream. Never has this been a standard, and it never will be anytime soon.

You can change your settings to meet your performance expectations. This has always been the case with PC. You seriously talk like this is your first time with PC and came in with hyped expectations.
Azza ☠ 23 Jan 2023 @ 5:45pm 
Originally posted by 8==========D:
Originally posted by A&A ✠:
My laptop with a Ryzen 5 5500U and Vega 6 doesn't have a problem running Mount and Blade: Warband at max settings, 4K, at 40 FPS, so does that mean I can run Cyberpunk at 4K, max settings, and still have 40 FPS? :D

Also, it doesn't matter which engine you are using, because engines are just tools; with them you can render graphic elements that run at 900 FPS or 3 FPS.
billboarded grass vs. geometry shader grass
Seems like you missed the point. The point is that the newest game engines and an RTX 4090 with the latest CPU do NOT equal 4K 120 FPS at the highest settings. Sure, if you go back years and use old game engines it'll get 4K 120, but upcoming games are using NEW GAME ENGINES, which guarantees you not even 60 FPS...

https://www.youtube.com/watch?v=5nNTVOghjFs

Fixed the game engine:

https://youtu.be/NSr2QBjmQCM
8========~D 23 Jan 2023 @ 5:48pm 
Originally posted by Revelene:
Software and hardware are not co-developed. Software always pushes the boundaries of hardware.

4K 120FPS max settings is your own pipe dream. Never has this been a standard, and it never will be anytime soon.

You can change your settings to meet your performance expectations. This has always been the case with PC. You seriously talk like this is your first time with PC and came in with hyped expectations.
That's totally fine, but the fact that Nvidia themselves advertise this card on their own webpage as an optimal 4K card, even in GeForce Experience, shows how much they're lying. It's a native 1440p card at high FPS and high settings, and ONLY a 4K card if you use DLSS... why can't they just advertise it for what it truly is?
Komarimaru 23 Jan 2023 @ 5:55pm 
Originally posted by 8==========D:
Originally posted by Revelene:
Software and hardware are not co-developed. Software always pushes the boundaries of hardware.

4K 120FPS max settings is your own pipe dream. Never has this been a standard, and it never will be anytime soon.

You can change your settings to meet your performance expectations. This has always been the case with PC. You seriously talk like this is your first time with PC and came in with hyped expectations.
That's totally fine, but the fact that Nvidia themselves advertise this card on their own webpage as an optimal 4K card, even in GeForce Experience, shows how much they're lying. It's a native 1440p card at high FPS and high settings, and ONLY a 4K card if you use DLSS... why can't they just advertise it for what it truly is?
It's the most powerful 4K card on the market; they are not lying. You're the one who made up fake standards, crybaby.
Revelene 23 Jan 2023 @ 6:03pm 
Originally posted by 8==========D:
Originally posted by Revelene:
Software and hardware are not co-developed. Software always pushes the boundaries of hardware.

4K 120FPS max settings is your own pipe dream. Never has this been a standard, and it never will be anytime soon.

You can change your settings to meet your performance expectations. This has always been the case with PC. You seriously talk like this is your first time with PC and came in with hyped expectations.
That's totally fine, but the fact that Nvidia themselves advertise this card on their own webpage as an optimal 4K card, even in GeForce Experience, shows how much they're lying. It's a native 1440p card at high FPS and high settings, and ONLY a 4K card if you use DLSS... why can't they just advertise it for what it truly is?

Where exactly on the product page do you see this supposed claim?
Last edited by Revelene; 23 Jan 2023 @ 6:05pm

Date Posted: 20 Jan 2023 @ 3:33pm
Posts: 616