There is a slim chance that RDNA 5 releases in 2025 but I wouldn't get your hopes up
Absolutely.
If you look at the Steam forums, the majority of posters have never been the brightest bulbs: they complain about performance or crash problems on their new rigs with RTX 4000-series cards, which are marketed mostly on the "ray/path tracing" argument.
And one card like that costs as much as a whole PC...
So while they complain, the smart people who never boarded the hype train are playing on 3060/3080/3090 cards and having essentially the same excellent experience as 4000-series owners. The interesting part is that some game mods can imitate the path-tracing effect on 3000-series cards, and voilà: no need for an RTX 4000 anymore.
It's like shouting "I only get 30 FPS!!! WHY????" Dude, you play in 4K; I play in HD, and that's enough for me. I don't need a huge screen, and I get my 60+ FPS at max graphics.
This is still subjective. There are always going to be people who buy expensive graphics cards that cost an arm and a leg, just as there are always going to be people who buy expensive watches, phones, clothes, cars and houses. Just because someone buys a 4090 doesn't mean they're not smart. Reading that, it just seems like you're a little bit jealous of people who own them. I know I am.
It is a little bit funny, though, when users with very high-end graphics cards complain about poor performance. Probably because some of these users (not all of them) insist on max settings for everything and won't even consider optimisation. It's amazing how changing a few settings that only slightly degrade image quality can massively increase performance. That's what 'SMART' people do.
lol show us on the doll where Jensen touched you.
This is what AMD fanboys have been saying since GCN... It still hasn't materialized. AMD GPUs aren't terrible by any means, and some models are good value if you don't care about ray tracing at all. However, in RT quality and performance, AMD and Intel are both significantly behind NVIDIA; objectively so, as has been continually demonstrated by every reputable reviewer using third-party benchmarks. AMD's bigger concern right now should be Intel, because Intel is making significant strides in driver maturation, and once they get over the hump of game-breaking driver issues, Arc will very likely undermine the bulk of AMD's good-value options.
They can make the die bigger, add faster memory, a wider memory bus, a higher power target, and maybe a few other things to make the GPU faster.
It’s a halo product but its performance will affect the perception of the whole series even if lower tier cards won’t be as impressive.
https://www.techpowerup.com/319253/12v-2x6-h-standard-touted-to-safely-deliver-675-w
According to this article, "AMD has indicated that it is considering an adoption of Gen 5 H++ in the future." I couldn't find any more info on it, but it would make things very interesting at the very top end of the GPU market if and when they do.
Given the power connector comes from the PCI-SIG, it should be obvious that AMD would eventually adopt it. I'm not sure what they mean by "Gen5 H++", since that connector is the refreshed design in the base PCIe Gen 6 specification. Obviously they can still implement it on a card that only uses a PCIe Gen 5 bus, just like NVIDIA did for the 40 Super series.
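For scale, the 675 W figure works out like this. A rough sketch only: the six-power-pin count is an assumption matching the 12VHPWR/12V-2x6 layout, and real connector ratings come from per-pin derating in the spec, not simple division.

```python
# Back-of-envelope current math for a 675 W rating on a 12V-2x6-style connector.
# Assumes six 12 V supply pins (plus grounds), as in the 12VHPWR/12V-2x6 layout.

RATED_WATTS = 675
RAIL_VOLTS = 12.0
POWER_PINS = 6  # assumption: six 12 V pins carry the load

total_amps = RATED_WATTS / RAIL_VOLTS   # current across the whole connector
amps_per_pin = total_amps / POWER_PINS  # average current per 12 V pin

print(f"Total current: {total_amps:.2f} A")
print(f"Per-pin current: {amps_per_pin:.2f} A")
```

That lands around 9.4 A per pin, which is why moving from 600 W to 675 W on the same pin count is a connector-rating question rather than a board-design one.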
I also don't think NVIDIA will need to push TBP up to 675 W to achieve a gen-on-gen improvement in the range you're speculating about. While the speculative rumor is that NVIDIA will use TSMC N3 for the 50-series, TSMC's roadmap and production-ramp expectations for N2 line up with the now-expected launch window for the 50-series. N2 would be a significantly more efficient process than the N4 the current 40-series uses, with substantially denser transistors and backside power delivery, and it would allow that level of performance increase (~70%) at the same power level.
Even if they move to the TSMC N3P process, that's still a 5% performance increase at a 5-10% power decrease over N3E, which itself is an 18% performance increase at 32% lower power than their N5 process; the N4 process the 40-series uses is about an 11% increase over N5. However, the real gain for NVIDIA is the ~1.64x transistor density: they'd still be able to add a significant number of transistors, in the same die area, at a similar power level.
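Compounding the quoted node-to-node figures gives a rough sense of the iso-power speed gap. Illustrative only: these are the foundry's marketing numbers at fixed test conditions, and they don't compose cleanly in a real GPU design.

```python
# Chain the quoted relative-performance claims to a common N5 baseline.
# All percentages are the marketing figures cited above, not measured silicon.

perf_vs_n5 = {
    "N5": 1.00,
    "N4": 1.11,   # ~+11% over N5
    "N3E": 1.18,  # ~+18% over N5
}
perf_vs_n5["N3P"] = perf_vs_n5["N3E"] * 1.05  # ~+5% over N3E

uplift = perf_vs_n5["N3P"] / perf_vs_n5["N4"] - 1
print(f"N3P vs N4, speed at iso-power: ~{uplift:.1%}")

# Density is the bigger win: ~1.64x transistors in the same die area.
density_gain = 1.64
print(f"Transistor budget in the same area: ~{density_gain:.2f}x")
```

So the raw clock-speed gap between N4 and N3P is only around 11-12%; the ~1.64x transistor budget is what would actually carry a large gen-on-gen jump.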