Your argument keeps killing itself. Every time you post something to try and look as if you've got an edge, it ends up being self-destructive.
If you want bleeding-edge flagship tech, obviously it's not going to be cheap, nor will it be the best dollar-to-performance compared to other cards on the market. I bet you'll ignore this too, or say something even dumber either way, like you have been doing whenever someone points out the flaws in your bad arguments; the proof of that is in the last 490 posts of this discussion.
https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+RTX+3090&id=4284
A side-by-side comparison of the problem: why you shouldn't throw money at the best cards as soon as they release like a clown, and then expect unrealistic things that were never promised in the first place.
https://imgur.com/OTttX77
https://youtu.be/zjedLeVGcfE
I mean, at this point, before I even look at any benchmark I have to wonder: is this using DLSS 3.0/fake frames (to help Nvidia look good), and does this have ray tracing disabled (to help Nvidia look good)? Because with ray tracing LITERALLY being in the name of Nvidia's line of GPUs, they imply ray tracing is practical. So for them to pitch any kind of performance benchmarks with a "ray tracing disabled, bros!" makes zero sense to me.
My advice: take benchmarks with a grain of salt and compare them yourself, as people will very likely do benchmark videos with the cards, and you can even ask people who have them if they're willing to test things with said settings.
IMO, I do think some benchmarks might have DLSS enabled.
They don't expect us to pay hundreds of dollars for fake frames now, do they? I have to admit, I'm highly surprised that people are actually buying these things and falling for this tech.
Nice job by Nvidia though, getting people to actually buy into this. It's not going to work on me though.
You may need to run a test multiple times to rule out any errors and to get an average, so you know what's going on in case one run is a mishap.
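For what it's worth, here's a minimal sketch (Python, with made-up FPS numbers and a hypothetical summarize_runs helper, not anyone's actual tooling) of what that averaging looks like in practice: log each pass's average FPS, flag any pass that lands far from the rest, and average the remainder.

from statistics import mean, median

def summarize_runs(fps_runs, tolerance=0.10):
    """fps_runs: the average FPS reported by each benchmark pass."""
    med = median(fps_runs)
    # Treat any pass more than 10% away from the median as a suspect run worth redoing.
    suspects = [fps for fps in fps_runs if abs(fps - med) > tolerance * med]
    clean = [fps for fps in fps_runs if fps not in suspects]
    return mean(clean), suspects

# Made-up FPS numbers for illustration only.
average_fps, suspects = summarize_runs([118.2, 121.5, 119.8, 97.4, 120.9])
print(f"average: {average_fps:.1f} FPS, suspect passes: {suspects}")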
AMD's version of DLSS is FSR, which is open source and available to all cards. Intel has its own version called XeSS.
The official Nvidia benchmarks, however, had shown both: DLSS 3 enabled and disabled, in different colours, green and lime.
If they were trying to be tricky and hide that fact, then sure, you might have a case. However, it's the other way around: the post owner is enabling Ray Tracing and complaining the card can't do 4K at a whopping 120 FPS (when 60 FPS is considered the standard for PC anyway) with advanced settings that tank any other graphics card as well, when you just don't enable DLSS 3 with it. You could claim that without Ray Tracing, the standard pre-baked shaders and shadows are all faked too, you know? What is the true definition of being fake?
https://www.youtube.com/watch?v=2VGwHoSrIEU
Ray Tracing started off quite rough, but these days it looks quite realistic and gives you much better immersion. You can seriously tell the difference between "fake" shadows and real-time ray tracing, even with DLSS 3 enabled on top of that to "fake" additional frames. I much prefer ray tracing in certain games, such as an adventure or horror game with lots of HDR-quality darkness and light rays, fire or torches, etc. It makes a world of difference. That said, I would consider disabling it in competitive fast-paced games, where you don't need to sit around watching so much eye candy. It has its pros and cons, much like anything else.
In the case of the OP, I do think his reasons for the thread are primarily out of angst towards Nvidia, purely based on the price/lack of performance. I don't really mind that so much, although it probably isn't good. And to be clear, "lack of performance" isn't really saying they suck, because the cards are good; it's just that, yeah, they really can't handle 4K at 120 FPS with ray tracing and max settings in titles that actually matter, like Cyberpunk 2077, Witcher 3 and stuff like that.
So my bottom line is that I do perceive trickery and deception on Nvidia's part, and they get away with it. Examples of that are sprinkled throughout this thread, mostly through pages 4-8 if I'm not mistaken.
If you don't see it (like others haven't), that's fine. I really don't see a firm promise of 4K 120 Hz, but if I did, this thread would have been over long ago.
So I'm not really on anybody's side but if I had to choose to be with Nvidia right now or against in the context of this thread I would be against them. I am re-researching 7900 XT performance also because I am still considering which GPU along with other system components I am going to get (I might be doing this for the next five years straight lol).
https://www.youtube.com/watch?v=glz5B-4IlKE
^ 21:50: Not just doing RT, but a maxed-out Psycho RT version of Cyberpunk 2077 at 4K, getting 133 FPS.
Again, why are people hellbent on discounting DLSS but insist on including RT? "They are advertising Ray Tracing, it's in the name"... like they aren't also advertising DLSS as a competitive advantage or feature of the RTX cards? There are definitely areas of improvement for DLSS 3, but it is a massive step toward accelerating the move to fully path-traced renderers, and it will only improve over time with newer hardware advancements as well as improvements in the DL model.
Even without DLSS 3 it's getting 88 FPS on the Psycho RT version of Cyberpunk 2077 at 4K. The regular RT max settings at 4K also get over 100 FPS without DLSS 3.
It doesn't look like trash; in most cases it looks better than the original, or the same. I haven't even seen it glitch or look off yet. DLSS 3 would likely only ever glitch with multiple lighting sources and fast-paced multi-directional movement, where the AI frame generation would have to get confused about the path direction of pixels and place pixels off-position. Plus, if you are getting 144 FPS out of it and one of those frames happens to be off, you won't even notice it anyway. Remember this is DLSS 3, not 1 or 2, which had much more of those artifacts; it works on motion vectors.
https://www.youtube.com/watch?v=W6kCoonOpFI
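To picture what "it works on motion vectors" means, here's a toy sketch in Python/NumPy. This is only an illustration of the general idea under my own assumptions, not Nvidia's actual frame-generation algorithm: it pushes each pixel of the previous frame halfway along its motion vector to guess the in-between frame, and the holes it leaves are exactly where the kind of confusion described above would show up.

# Toy illustration only, NOT the DLSS 3 pipeline: guess an in-between frame by
# pushing each pixel of the previous frame halfway along its motion vector.
import numpy as np

def interpolate_frame(prev_frame, motion_vectors):
    """prev_frame: (H, W, 3) image; motion_vectors: (H, W, 2) per-pixel (dy, dx)."""
    h, w, _ = prev_frame.shape
    mid = np.zeros_like(prev_frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # Move each pixel half of its full-frame motion to approximate the midpoint in time.
    ty = np.clip(np.round(ys + 0.5 * motion_vectors[..., 0]).astype(int), 0, h - 1)
    tx = np.clip(np.round(xs + 0.5 * motion_vectors[..., 1]).astype(int), 0, w - 1)
    mid[ty, tx] = prev_frame[ys, xs]
    # Holes and overlaps left here are where wrong or fast multi-directional
    # motion vectors would put pixels off-position in a naive approach.
    return mid

# Hypothetical 4x4 frame moving 2 pixels to the right, for illustration only.
frame = np.random.rand(4, 4, 3)
vectors = np.zeros((4, 4, 2))
vectors[..., 1] = 2.0
print(interpolate_frame(frame, vectors).shape)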
Also, any artifacting in games can be fixed with continuous DLSS 3 updates. Behind the scenes, NVIDIA's AI supercomputer is running in training mode to learn how best to generate frames for various games. Currently Cyberpunk 2077, The Witcher 3: Wild Hunt, and Portal with RTX are being run and improved upon.
For example, see the NVIDIA DLSS 3 "Fine Wine" update: improved image quality, reduced ghosting, and better edge smoothness across multiple games.
https://www.youtube.com/watch?v=vMxJ2ZcMf2s
https://wccftech.com/nvidia-fps-increasing-dlss-3-tech-is-about-to-get-even-better-major-improvements-image-quality-in-games/
https://www.nvidia.com/en-us/geforce/news/ces-2023-rtx-dlss-game-updates/
FSR 1.0 was god-awful, just as bad as DLSS 1.0 or worse. FSR 2.0 was a big step up, but wasn't as good as DLSS 2.0; the later FSR 2.1, though, got so much better that it's at least good enough to compare to DLSS 2.1, but it still has some flaws.
Native vs FSR 2.1 vs DLSS 2.5:
https://www.youtube.com/watch?v=CJW8VHtNUeU
Yes, DLSS 2 is going to keep getting updates and support as newer versions release. I'm not sure when the last DLSS 2 version will be, but it will be dropped at some point in favor of DLSS 3.
Intel's XeSS came out good; it's not winning any cookies, but it did better than FSR.