This topic has been locked
8=========D~ 20 Jan, 2023 @ 3:33pm
The RTX 4090 is not for native 4K gaming with the latest game engine
With a huge price tag of $3,000 (if you're lucky $2,000, plus tax roughly $2,400), the card is only barely able to touch 4K 60 FPS on the latest game engine, Unreal Engine 5.1. So is it safe to assume ALL GPUs are designed to run games at 1440p, and we should all be OK paying $3,000 just to be satisfied at 1080p and 1440p? It wasn't too long ago that a GTX 1080 Ti could run games at 4K 60. Yes, that was based on games from that generation, but shouldn't a GPU released now be able to run all games at at least 4K 120 FPS on the latest game engine, IF it's the top-end GPU of the current generation?

Sure, if you use older-generation engines it does reach those frame rates, but what about future games using the latest engine? Are we expected to pay $3,000 every year for the next GPU?

Here is the benchmark, and keep in mind I am only referring to NATIVE 4K, not DLSS: https://www.youtube.com/watch?v=dr7LpP7Dm4E


Update, January 27, 2023: the newly released game Forspoken runs at 43 FPS at native 4K. It's not even been a year since the 4090's release, and this is its performance: https://www.youtube.com/watch?v=U0u9l4Wkh9s

Update, February 7, 2023

The newly released (as of this date) Hogwarts Legacy doesn't even get 60 FPS: https://www.youtube.com/watch?v=5dKUpcMckBg

Let's make sure to blame every single game developer from all the different companies and pretend the RTX 4090 is actually a 4K card, and that all developers from all studios and all companies are doing everything wrong. It's DEFINITELY NOT NVIDIA publicly pretending and lying about the RTX 4090 being a true "native 4K" GPU.

Let's all keep pretending.
Last edited by 8=========D~; 7 Feb, 2023 @ 3:16pm
Showing 121-135 of 616 comments
A&A 24 Jan, 2023 @ 5:02am
Originally posted by 8==========D:
Originally posted by Komarimaru:
There are plenty of games throughout PC history where current-generation hardware could not max the game. The whole "Can it run Crysis?" meme was real. Cyberpunk 2077 is a more modern example.

Just admit the simple truth: you want enthusiast-level hardware but can't afford the hobby. That's why it's called enthusiast.

Just stick to mid-tier and adjust settings as needed. You don't need to get into the expensive side of the hobby to enjoy the games.
You changed the topic to affordability, with lots of assumptions and no evidence, even though I typed multiple times, including in my other 3 posts, that I was aiming for an i9-13900K + RTX 4090 with a Gigabyte M32U monitor: a full 4K 120 FPS build. Your assumptions failed, your logic failed, and on top of that you managed to deflect the topic in 3 different directions, unsuccessfully. You tried, but try harder next time.
Your logic is nonexistent or absurd!

How many games are using Unreal 5.1 right now?

Fortnite at 4K NATIVE, Epic settings: 60-70 FPS.

This Unreal 5.1 city sample averages 90 FPS:
https://youtu.be/E-KFjYu637k

Far Cry 6 on CryEngine (yes, the same ♥♥♥♥♥♥♥ engine that was used in Crysis): 90-100 FPS at 4K Native.
https://youtu.be/rHAwoKEDH54

Explain to me how this is not 4K ready.


Or explain this: an average of 145 FPS across 13 games...
https://youtu.be/aQklDR8nv8U

How many frames per second do you need? If I give you a 360Hz monitor, you will want all 360 of those frames, but do you even know how big the difference is between 100, 120 and 360?
Last edited by A&A; 24 Jan, 2023 @ 5:19am
Dr.Shadowds 🐉 24 Jan, 2023 @ 5:45am
Just in case OP somehow missed this post.
https://steamcommunity.com/discussions/forum/11/3761102779889269403/?ctp=8#c3761103203680789085


Originally posted by A&A ✠:
Originally posted by 8==========D:
You changed the topic to affordability, with lots of assumptions and no evidence, even though I typed multiple times, including in my other 3 posts, that I was aiming for an i9-13900K + RTX 4090 with a Gigabyte M32U monitor: a full 4K 120 FPS build. Your assumptions failed, your logic failed, and on top of that you managed to deflect the topic in 3 different directions, unsuccessfully. You tried, but try harder next time.
Your logic is nonexistent or absurd!

How many games are using Unreal 5.1 right now?

Fortnite at 4K NATIVE, Epic settings: 60-70 FPS.

This Unreal 5.1 city sample averages 90 FPS:
https://youtu.be/E-KFjYu637k

Far Cry 6 on CryEngine (yes, the same ♥♥♥♥♥♥♥ engine that was used in Crysis): 90-100 FPS at 4K Native.
https://youtu.be/rHAwoKEDH54

Explain to me how this is not 4K ready.


Or explain this: an average of 145 FPS across 13 games...
https://youtu.be/aQklDR8nv8U

How many frames per second do you need? If I give you a 360Hz monitor, you will want all 360 of those frames, but do you even know how big the difference is between 100, 120 and 360?
And this ↑↑↑ post is also spot on. Either OP has unrealistic expectations, or he misunderstands how things have worked for the last 30 years of PC hardware releases and marketing. Truth be told, if a product such as a GPU gets marketed against a particular game, it's not wise to ignore the minor details: the game settings, the game version, and the current state of the game before future changes, since those are the conditions games and GPUs are normally marketed under. For example, Fortnite ALWAYS keeps getting updates and new content, and it's a given that it won't stay the same.
A&A 24 Jan, 2023 @ 6:17am
@Dr.Shadows
I agree. That's why I'm not giving a Fortnite link: the content, possible optimizations, APIs, and engine changes are all very dynamic. These results are just my point of view from the tests I saw with Unreal 5.1.
Revelene 24 Jan, 2023 @ 7:02am
I will ask again, OP: where exactly on the product page do you see this supposed claim?
Illusion of Progress 24 Jan, 2023 @ 8:09am
Originally posted by Azza ☠:
I'm not vested into that...
Then I guess I'm missing your overall intention with what you're saying.

As I said, enjoy what you want. I'm not trying to tell you're wrong for having enthusiasm for whatever you do.

My contention was more with how you called it hypocritical to enjoy one thing (basically, enjoying things without ray tracing) while not being a raving fan of another (DLSS 3 and the like), and it rubbed me the wrong way because there's nothing hypocritical about that. Many of us are just fine appreciating what we can get. Heck, I'm still in awe at Minecraft of all things with shaders (not RTX, but shaders). It's inefficient, adding a shader pipeline to an already inefficiently performing game that lacks one, and you get those "fake results" as you say, but I'm still just in awe at the results sometimes.

Now if you're in a position like OP is where "I want to play at 4K, get ray tracing, and maintain hundreds of FPS in everything" then yeah maybe one needs to consider DLSS 3 as performance growth hasn't kept pace with demands. I would agree there as I've made that same point to OP earlier in the thread.
Originally posted by Azza ☠:
For example: 4K resolution is four times the pixels of 1080p resolution, so you can expect it to take at least four times the graphic card juice to drive it.
Things aren't always linear or dependent on just one factor, though I'm sure you know this.

If ONLY pixel fill rate (I think that's the right term?) is the limitation, and it remains the limitation all the way up to quadrupling the pixel count, then sure, you might need four times the performance to quadruple the pixel count and keep the same frame rate. But that's not always the case: often you need less than four times the processing speed to drive 4K over 1080p (and maybe other times, more).

And the inverse can be true: sometimes you need more than you might think if you only look at how one factor changes. In Minecraft, people often underestimate what doubling the render distance costs; it does not merely double the demands, it can increase them by more than double, since you're actually asking for four times the chunks to be handled (which might be more or even less in reality due to things like culling, but still).
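The back-of-the-envelope math in the two paragraphs above can be checked directly. A minimal sketch (the helper functions here are invented for illustration, not from any game's API):

```python
# Rough arithmetic behind the scaling claims above (illustrative only).

def pixel_count(width, height):
    """Total pixels for a given resolution."""
    return width * height

# 4K (3840x2160) is exactly four times the pixels of 1080p (1920x1080).
print(pixel_count(3840, 2160) / pixel_count(1920, 1080))  # 4.0

def chunks_loaded(render_distance):
    """Chunks in a square loaded area extending `render_distance`
    chunks in every direction from the player (a simplification;
    real clients cull and cache, so actual load differs)."""
    side = 2 * render_distance + 1  # chunks per side of the square
    return side * side

# Doubling the render distance roughly quadruples the chunk count.
print(chunks_loaded(8))   # 289
print(chunks_loaded(16))  # 1089, about 3.8x as many
```

The square growth is why render distance is such an expensive setting: the cost tracks the area, not the distance.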
8=========D~ 24 Jan, 2023 @ 8:51am
Originally posted by A&A ✠:
Originally posted by 8==========D:
You changed the topic to affordability, with lots of assumptions and no evidence, even though I typed multiple times, including in my other 3 posts, that I was aiming for an i9-13900K + RTX 4090 with a Gigabyte M32U monitor: a full 4K 120 FPS build. Your assumptions failed, your logic failed, and on top of that you managed to deflect the topic in 3 different directions, unsuccessfully. You tried, but try harder next time.
Your logic is nonexistent or absurd!

How many games are using Unreal 5.1 right now?

Fortnite at 4K NATIVE, Epic settings: 60-70 FPS.

This Unreal 5.1 city sample averages 90 FPS:
https://youtu.be/E-KFjYu637k

Far Cry 6 on CryEngine (yes, the same ♥♥♥♥♥♥♥ engine that was used in Crysis): 90-100 FPS at 4K Native.
https://youtu.be/rHAwoKEDH54

Explain to me how this is not 4K ready.


Or explain this: an average of 145 FPS across 13 games...
https://youtu.be/aQklDR8nv8U

How many frames per second do you need? If I give you a 360Hz monitor, you will want all 360 of those frames, but do you even know how big the difference is between 100, 120 and 360?
The whole point of the conversation is: can it hold high FPS in new releases on new game engines for at least 1-2 years, or are we hoping all games released within the next 1-2 years will use older engines instead so we can get 4K 120 FPS? And what about the year after?
Komarimaru 24 Jan, 2023 @ 10:59am
Originally posted by 8==========D:
Originally posted by A&A ✠:
Your logic is nonexistent or absurd!

How many games are using Unreal 5.1 right now?

Fortnite at 4K NATIVE, Epic settings: 60-70 FPS.

This Unreal 5.1 city sample averages 90 FPS:
https://youtu.be/E-KFjYu637k

Far Cry 6 on CryEngine (yes, the same ♥♥♥♥♥♥♥ engine that was used in Crysis): 90-100 FPS at 4K Native.
https://youtu.be/rHAwoKEDH54

Explain to me how this is not 4K ready.


Or explain this: an average of 145 FPS across 13 games...
https://youtu.be/aQklDR8nv8U

How many frames per second do you need? If I give you a 360Hz monitor, you will want all 360 of those frames, but do you even know how big the difference is between 100, 120 and 360?
The whole point of the conversation is: can it hold high FPS in new releases on new game engines for at least 1-2 years, or are we hoping all games released within the next 1-2 years will use older engines instead so we can get 4K 120 FPS? And what about the year after?
Honestly, if you're that worried, wait. Each generation leap has been massive the past few years. Wait until the 5090 launches; if it's as big a boost as the 4090 was, you'll be safe for a few years without worrying about money.
emoticorpse 24 Jan, 2023 @ 11:19am
I don't get it either, but I don't just blame GPU manufacturers. If developers of great-looking games years ago could do it with worse hardware and worse game engines, I don't see why it can't be done today. I guess it's because graphics come first. They come before first. Graphics are the entire list, and gameplay/story/originality/mechanics come last, if they come at all. Sure, I'm exaggerating a bit, but I do think that if they balanced out the gameplay/story/everything else I was implying, they wouldn't need to worry about saving a game by introducing ray tracing or other graphical "extras" (heavy extras at that, given the toll they put on your hardware). I can't be mad at ray tracing either; I'm pretty sure games started losing my interest before that. My interest in Quake II was re-ignited when it was redone to support ray tracing, but I think that's because Quake II's graphics were so poor. Any game conceived in the last few years should have healthy enough graphics not to look like crap on modern PCs, but some developers choose to pile on the graphics like an influencer piles on makeup and filters.

But it's also partly on gamers themselves. Because some games are out and ready to do 4K/ray tracing/whatever, some gamers have a mindset of "it's my civil right to play this game on max settings, but I don't have the hardware to do it, WTF! Heads will roll!". GPU prices are outrageous, but mid-tier cards should still be able to run games looking halfway decent. If it takes ray tracing at 4K to visually please you, you should look internally to see if there's an issue there while you're at it.

Edit: Dang, I just watched some of that 4K Fortnite video and it does look impressive. That much detail would need a strong system; I can't be mad at that requiring a 4090. It looks like every single blade of grass and leaf has its own dynamically rendered shadow. I don't think that's really happening, but the fact that it appears that way at first glance is cool. The only other thing that comes to mind that looks like that is The Witcher 3. Man, those colors are vibrant too.

These games are going to do to me what 4K video did to DVD and regular Blu-ray. I can't appreciate them like I used to.
Last edited by emoticorpse; 24 Jan, 2023 @ 11:50am
Ghost Robertson 24 Jan, 2023 @ 12:46pm
I don't mean to offend, but some people seem addicted to eye candy. I recently upgraded two of my monitors, and the two differences are just a higher Hz and a better picture, but both are still 1080p.
DeadBeat 24 Jan, 2023 @ 1:15pm
Originally posted by PopinFRESH:
trying to farm Jester awards.

If so, it doesn't seem to be working. :steammocking:
DeadBeat 24 Jan, 2023 @ 1:22pm
Originally posted by Ghost Robertson:
I don't mean to offend, but some people seem addicted to eye candy. I recently upgraded two of my monitors, and the two differences are just a higher Hz and a better picture, but both are still 1080p.

What would be the point of buying a 4090 to game on a 1080p monitor though? If you're satisfied with 1080p that's fine but I guarantee you if you were to game on a 4k monitor for six months and switched back to 1080p you'd notice the difference right away and probably wouldn't like it.
A&A 24 Jan, 2023 @ 1:29pm
1,000 FPS in CS:GO.
That's the point of a 4090 at 1080p :)
Last edited by A&A; 24 Jan, 2023 @ 1:31pm
Dr.Shadowds 🐉 24 Jan, 2023 @ 1:57pm
Originally posted by DeadBeat:
Originally posted by Ghost Robertson:
I don't mean to offend, but some people seem addicted to eye candy. I recently upgraded two of my monitors, and the two differences are just a higher Hz and a better picture, but both are still 1080p.

What would be the point of buying a 4090 to game on a 1080p monitor though? If you're satisfied with 1080p that's fine but I guarantee you if you were to game on a 4k monitor for six months and switched back to 1080p you'd notice the difference right away and probably wouldn't like it.
IMO, if someone is going to buy a new high-refresh-rate monitor with such a high-end GPU, they might as well go for 1440p, unless they're after better visuals over a high refresh rate. Also, how far you sit from the monitor and how big it is can factor into which looks better: at 24", 1440p vs. 4K might not be very noticeable depending on the person, but if you're going for a go-big-or-go-home monitor in the 30"-50" range, you might as well go 4K.

Now, if you're after a crazy refresh rate like 240Hz or higher, you'd be looking at 1080p or 1440p, since 4K isn't really an option at those refresh rates, at least for now. There are a few, but prices are way higher than your average 4K monitor: a 240Hz 1440p monitor costs about a third of a 240Hz 4K one, and 1080p 240Hz about a fifth, so prices scale pretty steeply.
DeadBeat 24 Jan, 2023 @ 2:09pm
Originally posted by Dr.Shadowds 🐉:
IMO, if someone is going to buy a new high-refresh-rate monitor with such a high-end GPU, they might as well go for 1440p, unless they're after better visuals over a high refresh rate.

I prefer the better visuals and if I can get higher frame rates I consider that a bonus. My first 4k monitor is 4 yrs old and when it went down I used a 1440p monitor temporarily and the difference in visual quality to me was quite noticeable.
Dr.Shadowds 🐉 24 Jan, 2023 @ 2:17pm
Originally posted by DeadBeat:
Originally posted by Dr.Shadowds 🐉:
IMO, if someone is going to buy a new high-refresh-rate monitor with such a high-end GPU, they might as well go for 1440p, unless they're after better visuals over a high refresh rate.

I prefer the better visuals and if I can get higher frame rates I consider that a bonus. My first 4k monitor is 4 yrs old and when it went down I used a 1440p monitor temporarily and the difference in visual quality to me was quite noticeable.
It can be noticeable over a long period of use, but the more you use it, the more your brain adjusts, as long as you don't go back to 4K. The only issue, like I said, is how far or close you sit to the display and how big it is, since DPI matters there.
Last edited by Dr.Shadowds 🐉; 24 Jan, 2023 @ 2:17pm

Date Posted: 20 Jan, 2023 @ 3:33pm
Posts: 616