This topic has been locked.
RTX 4090 is not for Native 4K gaming with the latest game engine
With a huge price tag of $3,000 (if you're lucky, $2,000, which with tax is approximately $2,400), the card is only able to barely touch 4K 60 FPS on the latest game engine, Unreal Engine 5.1? So is it safe to assume ALL GPUs are designed to run games at 1440p, and we should all be OK paying $3,000 just to be satisfied at 1080p and 1440p?? It wasn't too long ago that a GTX 1080 Ti could run games at 4K 60; yes, that was with games from that generation, but shouldn't a GPU released now, IF it's the top-end GPU of the current generation, be able to run all games at at least 4K 120 FPS on the latest game engine??

Sure, if you use older-generation engines it does reach that FPS, but what about future games using the latest game engine?? Are we expected to pay $3,000 every year for the next GPU??

Here is the benchmark, and keep in mind I am only referring to NATIVE 4K, not DLSS: https://www.youtube.com/watch?v=dr7LpP7Dm4E


Update, January 27, 2023: The newly released game Forspoken runs at 43 FPS in native 4K. It hasn't even been a year since the 4090's release, and this is its performance: https://www.youtube.com/watch?v=U0u9l4Wkh9s

Update, February 7, 2023

The newly released (as of this date) Hogwarts Legacy doesn't even get 60 FPS: https://www.youtube.com/watch?v=5dKUpcMckBg

Let's make sure to blame every single game developer from all the different companies and pretend the RTX 4090 is actually a 4K card, and that all developers at all studios and all companies are doing everything wrong. It's DEFINITELY NOT Nvidia pretending and lying publicly about the RTX 4090 being a true "native 4K" GPU.

Let's all keep pretending.
Last edited by 8========~D; Feb 7, 2023 @ 15:16
Originally posted by 8==========D:
https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+RTX+4090&id=4606

"Video card Value" Price to performance ratio all the way to bottom. As expected.
Oh no, it's not as efficient in price as a GTX 1650! Or a GTX 1070! Or a 2060 or 3060! Whatever will someone do?!

Your argument keeps killing itself: every time you post something to try to look as if you have an edge, it turns out to be self-destructive.
Last edited by Komarimaru; Feb 1, 2023 @ 9:07
Originally posted by 8==========D:
https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+RTX+4090&id=4606

"Video card Value" Price to performance ratio all the way to bottom. As expected.
Even if it were priced at half of what it sold for in Canada, you'd still rant about it either way, since you've been arguing with people and demanding they agree with your opinion that it can't do 4K. You were calling Nvidia liars when they never promised or promoted anything about giving you 4K 120 FPS in Fortnite's new game-engine replacement update, and you've been dismissing every single flaw in your argument just because people didn't agree with your opinion.

If you want bleeding-edge flagship tech, it obviously isn't going to be cheap, nor will it have the best dollar-to-performance ratio compared to other cards on the market. I bet you'll ignore this too, or say something even dumber either way, like you have been doing whenever someone points out a flaw in your bad arguments; the proof is in the last 490 posts of this discussion :littleclown:
https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+RTX+3090&id=4284

A side-by-side comparison of the problem, and of why you shouldn't throw money at the best cards as soon as they release, like a clown, and expect unrealistic things that were never promised in the first place.
https://imgur.com/OTttX77


https://youtu.be/zjedLeVGcfE
Last edited by Dr.Shadowds 🐉; Feb 1, 2023 @ 11:04
Benchmarks for Nvidia seem skewed to me at this point, and the entire benchmark system needs to be revamped for clarity.

I mean, at this point, before I even look at any benchmark I have to wonder: is this using DLSS 3.0/fake frames (to help Nvidia look good), and does it have ray tracing disabled (to help Nvidia look good)? Because with ray tracing LITERALLY being in the name of Nvidia's line of GPUs, they imply ray tracing is practical. So pitching any kind of performance benchmarks with "ray tracing disabled, bros!" makes zero sense to me.
And it seems OP just got banned lol.


Originally posted by emoticorpse:
Benchmarks for Nvidia seem skewed to me at this point, and the entire benchmark system needs to be revamped for clarity.

I mean, at this point, before I even look at any benchmark I have to wonder: is this using DLSS 3.0/fake frames (to help Nvidia look good), and does it have ray tracing disabled (to help Nvidia look good)? Because with ray tracing LITERALLY being in the name of Nvidia's line of GPUs, they imply ray tracing is practical. So pitching any kind of performance benchmarks with "ray tracing disabled, bros!" makes zero sense to me.
My advice: take benchmarks with a grain of salt and compare them yourself, since people will very likely post benchmark videos with these cards, and you can even ask owners whether they're willing to test things with said settings.

IMO, I do think some benchmarks might have DLSS enabled.
Last edited by Dr.Shadowds 🐉; Feb 1, 2023 @ 11:39
So how do you calculate frames per dollar if the frames are fake -_- ?

They don't expect us to pay hundreds of dollars for fake frames now do they? I have to admit, I'm highly surprised that people are actually buying these things and falling for this tech.

Nice job for Nvidia though, getting people to actually buy into this. It's not going to work on me though.
Originally posted by Hardʬare Hero:
So how do you calculate frames per dollar if the frames are fake -_- ?

They don't expect us to pay hundreds of dollars for fake frames now do they? I have to admit, I'm highly surprised that people are actually buying these things and falling for this tech.

Nice job for Nvidia though, getting people to actually buy into this. It's not going to work on me though.
You use a ratio to compare them, just like you normally would: bench all cards using the same settings and, if possible, the same driver, which is the most ideal thing to do when they're from the same brand. If comparing Nvidia to AMD, use each vendor's newest driver at the time of the test, with the same settings. This can be done with any benchmarking tool or game you please.

You may need to run the test multiple times to rule out errors and to get an average, so you know what's going on in case something is a mishap.

AMD's answer to DLSS is FSR, which is open to all cards. Intel has its own version called XeSS.
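
To make that ratio concrete, here is a minimal Python sketch of the comparison described above: average several runs per card, sanity-check the spread, then divide by price. All card prices and FPS numbers below are invented placeholders for illustration, not real benchmark results.

```python
# Hedged sketch: compare cards by average FPS per dollar.
# Prices and per-run FPS figures are made-up placeholders.
from statistics import mean, stdev

runs = {
    "RTX 4090 (native 4K)": {"price_usd": 1599, "fps_runs": [88, 90, 87, 89]},
    "RTX 3090 (native 4K)": {"price_usd": 999,  "fps_runs": [55, 54, 56, 55]},
}

for card, data in runs.items():
    avg_fps = mean(data["fps_runs"])
    spread = stdev(data["fps_runs"])  # a large spread means rerun: something misbehaved
    per_dollar = avg_fps / data["price_usd"]
    print(f"{card}: {avg_fps:.1f} fps (+/- {spread:.1f}), "
          f"{per_dollar * 100:.2f} fps per $100")
```

The same-settings, same-driver rule matters because the ratio is only meaningful when the numerator was measured under identical conditions for every card.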
Originally posted by emoticorpse:
Benchmarks for Nvidia seem skewed to me at this point, and the entire benchmark system needs to be revamped for clarity.

I mean, at this point, before I even look at any benchmark I have to wonder: is this using DLSS 3.0/fake frames (to help Nvidia look good), and does it have ray tracing disabled (to help Nvidia look good)? Because with ray tracing LITERALLY being in the name of Nvidia's line of GPUs, they imply ray tracing is practical. So pitching any kind of performance benchmarks with "ray tracing disabled, bros!" makes zero sense to me.

The official Nvidia benchmarks, however, showed both: DLSS 3 enabled and disabled, in different colours, green and lime.

If they were trying to be tricky and hide that fact, then sure, you might have a case. However, it's the other way around: the post owner is enabling ray tracing and complaining the card can't do 4K at a whopping 120 FPS (when 60 FPS is considered the standard for PC anyway) with advanced settings that tank any other graphics card as well, while simply not enabling DLSS 3 with it. You could claim that without ray tracing, the standard preloaded shaders and shadows are all faked too, you know? What is the true definition of being faked?

https://www.youtube.com/watch?v=2VGwHoSrIEU

Ray tracing started off quite rough, but these days it looks quite realistic and gives you much better immersion. You can seriously tell the difference between "fake" shadows and real-time ray tracing, even with DLSS 3 enabled on top to "fake" additional frames. I much prefer ray tracing in certain games, such as adventure or horror games with lots of HDR-quality darkness and light rays, fire, torches, etc. It makes a world of difference. That said, I would consider disabling it in competitive fast-paced games, where you don't need to sit around watching so much eye candy. It has its pros and cons, much like anything else.
Last edited by Azza ☠; Feb 1, 2023 @ 13:03
Originally posted by Azza ☠:
Originally posted by emoticorpse:
Benchmarks for Nvidia seem skewed to me at this point, and the entire benchmark system needs to be revamped for clarity.

I mean, at this point, before I even look at any benchmark I have to wonder: is this using DLSS 3.0/fake frames (to help Nvidia look good), and does it have ray tracing disabled (to help Nvidia look good)? Because with ray tracing LITERALLY being in the name of Nvidia's line of GPUs, they imply ray tracing is practical. So pitching any kind of performance benchmarks with "ray tracing disabled, bros!" makes zero sense to me.

The official Nvidia benchmarks, however, showed both: DLSS 3 enabled and disabled, in different colours, green and lime.

If they were trying to be tricky and hide that fact, then sure, you might have a case. However, it's the other way around: the post owner is enabling ray tracing and complaining the card can't do 4K at a whopping 120 FPS (when 60 FPS is considered the standard for PC anyway) with advanced settings that tank any other graphics card as well, while simply not enabling DLSS 3 with it. You could claim that without ray tracing, the standard preloaded shaders and shadows are all faked too, you know? What is the true definition of being faked?

https://www.youtube.com/watch?v=2VGwHoSrIEU

Ray tracing started off quite rough, but these days it looks quite realistic and gives you much better immersion. You can seriously tell the difference between "fake" shadows and real-time ray tracing, even with DLSS 3 enabled on top to "fake" additional frames. I much prefer ray tracing in certain games, such as adventure or horror games with lots of HDR-quality darkness and light rays, fire, torches, etc. It makes a world of difference. That said, I would consider disabling it in competitive fast-paced games, where you don't need to sit around watching so much eye candy. It has its pros and cons, much like anything else.

In the case of OP, I do think his reasons for the thread are primarily angst toward Nvidia, based purely on the price and lack of performance. I don't really mind that so much, although it probably isn't good. And "lack of performance" isn't really saying they suck, because the cards are good; it's just that, yeah, they really can't handle 4K at 120 FPS with ray tracing and max settings in titles that actually matter, like Cyberpunk 2077, Witcher 3, and the like.

So my bottom line is that I do perceive trickery and deception on Nvidia's part, and they get away with it. Examples are sprinkled throughout this thread, I think mostly through pages 4-8, if I'm not mistaken.

If you don't see it (like others haven't), that's fine. I really don't see a firm promise of 4K 120 Hz, but if I did, this thread would have been over long ago.

So I'm not really on anybody's side, but if I had to choose between being with Nvidia right now or against them in the context of this thread, I would be against them. I am also re-researching 7900 XT performance, because I am still deciding which GPU, along with other system components, I am going to get (I might be doing this for the next five years straight lol).
Last edited by emoticorpse; Feb 1, 2023 @ 13:41
Originally posted by emoticorpse:
...it's just that, yeah, they really can't handle 4K at 120 FPS with ray tracing and max settings in titles that actually matter, like Cyberpunk 2077...

https://www.youtube.com/watch?v=glz5B-4IlKE

^ At 21:50: not just RT, but the maxed-out Psycho RT version of Cyberpunk 2077 at 4K, getting 133 FPS.

Again, why are people hellbent on discounting DLSS but insist on including RT? "They are advertising ray tracing, it's in the name"... as if they aren't also advertising DLSS as a competitive advantage and feature of the RTX cards. There are definitely areas of improvement for DLSS 3, but it is a massive step toward accelerating the move to fully path-traced renderers, and it will only improve over time with newer hardware advancements as well as improvements in the DL model.

Even without DLSS 3, it's getting 88 FPS on the Psycho RT version of Cyberpunk 2077 at 4K. The regular max RT settings at 4K also get over 100 FPS without DLSS 3.
Guys, guys, guys. OP got banned. It's over. You can finally breathe.
If DLSS gets better performance but looks like trash, then I'm not going to use it. If DLSS gives me better performance and I really can't tell the difference, then hell yeah I'm going to use it. It's not like you get a Steam award for saying "I didn't use fake frames today."
Originally posted by DeadBeat:
If DLSS gets better performance but looks like trash, then I'm not going to use it. If DLSS gives me better performance and I really can't tell the difference, then hell yeah I'm going to use it. It's not like you get a Steam award for saying "I didn't use fake frames today."

It doesn't look like trash; in most cases it looks better than the original, or the same. I haven't even seen it glitch or look off yet. DLSS 3 would likely only ever glitch with multiple lighting sources and fast-paced, multi-directional movement, where the AI frame generation would have to get confused about the path direction of pixels and place them off position. Plus, if you are getting 144 FPS out of it and one of those frames happens to be off, you won't even notice it anyway. Remember this is DLSS 3, not 1 or 2, which had far more of those artifacts; it works on motion vectors.
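
To make the motion-vector idea concrete, here is a toy Python sketch that shifts pixels halfway along per-pixel motion vectors to synthesize an in-between frame. The frame data and vectors are invented, and this is only the geometric intuition: real DLSS 3 frame generation uses a hardware optical-flow accelerator plus a neural network, not anything this simple.

```python
# Toy "frame generation": warp pixels half a step along their motion vectors.
import numpy as np

def warp_half_step(frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Move each pixel halfway along its (dy, dx) motion vector."""
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    for y in range(h):
        for x in range(w):
            # A misestimated vector lands the pixel "off position" --
            # exactly the kind of artifact described above.
            ny = int(round(y + 0.5 * motion[y, x, 0])) % h
            nx = int(round(x + 0.5 * motion[y, x, 1])) % w
            out[ny, nx] = frame[y, x]
    return out

# A 4x4 grayscale frame drifting two pixels right per frame, so the
# synthesized in-between frame shifts everything one pixel right.
frame = np.arange(16, dtype=float).reshape(4, 4)
motion = np.zeros((4, 4, 2))
motion[..., 1] = 2.0
print(warp_half_step(frame, motion))
```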

https://www.youtube.com/watch?v=W6kCoonOpFI

Also, any artifacting in games can be fixed through DLSS 3's continuous updates. Behind the scenes, NVIDIA's AI supercomputer runs in training mode to learn how best to generate frames for various games. Currently Cyberpunk 2077, The Witcher 3: Wild Hunt, and Portal with RTX are being run through it and improved.

For example, the NVIDIA DLSS 3 "Fine Wine" update: improved image quality, reduced ghosting, and better edge smoothness across multiple games.

https://www.youtube.com/watch?v=vMxJ2ZcMf2s

https://wccftech.com/nvidia-fps-increasing-dlss-3-tech-is-about-to-get-even-better-major-improvements-image-quality-in-games/

https://www.nvidia.com/en-us/geforce/news/ces-2023-rtx-dlss-game-updates/
Last edited by Azza ☠; Feb 1, 2023 @ 18:04
Originally posted by DeadBeat:
If DLSS gets better performance but looks like trash, then I'm not going to use it. If DLSS gives me better performance and I really can't tell the difference, then hell yeah I'm going to use it. It's not like you get a Steam award for saying "I didn't use fake frames today."
It depends what you consider "trash". Compared to DLSS 1.0, 2.0 wasn't the greatest, but it was a leap in the right direction: 1.0 was awful, you'd never want to use it, while 2.0 made a major improvement in quality, and the 2.0+ versions have kept getting better. The blur effect is something you notice when you try to upscale from, say, 360p and up; but if you scale 720p to a 1080p display, 1080p to a 1440p display, or 1440p to a 4K display, the flaws become far less noticeable, because the upscaler isn't trying to fill in the blanks from a resolution so low it can barely make anything out; it knows the pixels it's working with. Now, DLSS 3.0 is sadly exclusive to the 4000 series, and it's a large improvement, as it doesn't replace the whole frame, it replaces only parts of the frame.

FSR 1.0 was god-awful, just as bad as DLSS 1.0 or worse; FSR 2.0 was a big step up, but wasn't as good as DLSS 2.0; and the later FSR 2.1 got so much better that it's at least on par with DLSS 2.1, though it still has some flaws.

Native vs FSR 2.1 vs DLSS 2.5
https://www.youtube.com/watch?v=CJW8VHtNUeU

Yes, DLSS 2 is going to keep getting updates and support as newer versions release. I'm not sure when the last DLSS 2 version will be, but it will be dropped at some point in favor of DLSS 3.

Intel's XeSS came out good, not winning any cookies, but it did better than FSR.
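
A quick back-of-the-envelope way to see why upscaling from a higher base resolution hides flaws better, as described above: the fewer output pixels the upscaler has to invent, the less it has to guess. A small Python sketch using the standard resolutions mentioned:

```python
# Fraction of the output image an upscaler must synthesize for common
# render-resolution -> display-resolution pairs.
RES = {"360p": (640, 360), "720p": (1280, 720), "1080p": (1920, 1080),
       "1440p": (2560, 1440), "4K": (3840, 2160)}

def invented_share(render: str, display: str) -> float:
    """Fraction of output pixels not covered by rendered pixels."""
    rw, rh = RES[render]
    dw, dh = RES[display]
    return 1 - (rw * rh) / (dw * dh)

for pair in [("360p", "1080p"), ("720p", "1080p"),
             ("1080p", "1440p"), ("1440p", "4K")]:
    print(f"{pair[0]} -> {pair[1]}: {invented_share(*pair):.0%} synthesized")
```

Going from 360p to 1080p, the upscaler has to invent roughly 89% of the image, versus about 44% for 1080p to 1440p, which is why the low-res case looks so much worse.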
Last edited by Dr.Shadowds 🐉; Feb 1, 2023 @ 18:38
Let me rephrase: if the image quality isn't as good with it on, then I won't use it. Also, if I don't need the frames, then I don't use it either.
Last edited by DeadBeat; Feb 1, 2023 @ 19:36
Originally posted by DeadBeat:
Let me rephrase: if the image quality isn't as good with it on, then I won't use it. Also, if I don't need the frames, then I don't use it either.
We're just explaining the stuff, not saying you need to use it; it's still up to you in the end, regardless of what you want to do for yourself. Some people like the frame boost, some don't. It's not the end of the world, since it's just an option you can toggle on or off.

Date Posted: Jan 20, 2023 @ 15:33
Posts: 616