It just comes down to pushing electrons. Let's say you have a length of string with 5 transistors (the actual things that do stuff) along it; it will cost a certain amount of power to push electrons across the length of that string. If you can suddenly have the same number of transistors packed into half that length of string, well, it's going to cost you half the amount of power.
And as to your point about die size: yes, if you keep the same die size (length of string) but are now packing 10 transistors instead of 5, it's essentially using the same power to push the electrons, but giving you 'twice' the performance (twice the number of things that do stuff).
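If it helps to put rough numbers on that, here's a minimal back-of-the-envelope sketch assuming the textbook dynamic-power approximation P ≈ C·V²·f and treating a node shrink as simply halving per-transistor capacitance; the capacitance, voltage and frequency values below are placeholders, not measurements:

```python
# Rough sketch of the "shrink the string" argument above.
# Assumes the simplified dynamic-power model P ≈ C * V^2 * f and that a node
# shrink roughly halves per-transistor capacitance. Real scaling is messier
# (leakage, voltage floors), so treat the numbers as illustrative only.

def dynamic_power(transistors, cap_per_transistor, voltage, frequency):
    """Total switching power for a block of identical transistors, in watts."""
    return transistors * cap_per_transistor * voltage**2 * frequency

V, F = 1.0, 3.0e9          # volts, hertz (placeholder values)
C_OLD = 1.0e-15            # farads per transistor on the old node (placeholder)
C_NEW = C_OLD / 2          # shrunk node: roughly half the capacitance

print(dynamic_power(5, C_OLD, V, F))    # 5 transistors, old node
print(dynamic_power(5, C_NEW, V, F))    # same 5 transistors, new node: ~half the power
print(dynamic_power(10, C_NEW, V, F))   # same die "length", twice the transistors:
                                        # ~same power, twice the things that do stuff
```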
It's not just the new connector. It's an adapter, likely for a modular power supply. Look at the other side. It's 8x2 to 12VHPWR.
If 8x2 can't carry the same amount of current as 12VHPWR, how can an 8x2 to 12VHPWR adapter exist? One end would bottleneck the other, wouldn't it? That was my line of reasoning.
It’s not just an isolated marketing stunt. It’s an actual product they sell, and others sell it too: Corsair[www.corsair.com], Superflower[www.super-flower.com.tw], Be Quiet[www.bequiet.com] and Seasonic[seasonic.com] all make 600-watt 2x8-pin to 12VHPWR adapters for their high-end modular power supplies. 600 watts is the maximum that connector is presently rated to carry[videocardz.com]. If those cables weren’t capable of carrying 600 watts, they would be rated for 450, 300 or 150 watts instead.
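To put some numbers on it, here’s a quick back-of-the-envelope sketch; it assumes the commonly cited layout of six +12V conductors on a 12VHPWR plug and three per 8-pin PCIe plug (so six across a 2x8-pin adapter), and it ignores derating and safety margins:

```python
# Back-of-the-envelope current per +12V conductor for a 600 W load.
# Assumes six +12V pins on a 12VHPWR connector and three per 8-pin PCIe
# connector (six across a 2x8-pin adapter); ignores derating/safety margins.

def amps_per_pin(watts, volts, twelve_volt_pins):
    """Total current split evenly across the +12V conductors."""
    return (watts / volts) / twelve_volt_pins

LOAD_W, RAIL_V = 600, 12    # 600 W on a 12 V rail is 50 A total

print(amps_per_pin(LOAD_W, RAIL_V, 6))  # 12VHPWR side: ~8.3 A per pin
print(amps_per_pin(LOAD_W, RAIL_V, 6))  # 2x8-pin side: same six conductors, same ~8.3 A
```

Both ends of such an adapter move the same total current over the same number of +12V conductors, which is why the adapter can exist at all.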
The power supply companies don’t have much at stake regarding team red vs team green; they’re about as close to a neutral party as we’re going to get. If they are biased, where they do have skin in the game is persuading you that you need to buy a new power supply with a new connector, so if anything you’d expect them to make the old standard connectors look as unappealing as possible compared to the 12VHPWR connector.
Also, according to AMD the reference-spec 7900 XTX is a 355 watt card, and that uses the standard 2x8-pin setup[www.techpowerup.com]. According to Nvidia, the 4080 and 4080 Super are only rated as 320 watt cards, yet Nvidia is telling us we need a new connector with thinner wires and the same number of pins, and somehow that’s going to give us the current we need to carry more power than the 2x8-pin solution did. I’m usually not looking to buy these high-end cards, so I didn’t know there were some with three connectors, but I very much doubt this is necessary from an engineering perspective.
I’m also kind of doubting it’s a space thing. The fact of the matter is that a 7900 XTX is smaller than a 4080[www.dexerto.com] for a comparable, if not marginally weaker, card[www.digitaltrends.com], so if anybody needed to cram more power into less space it’s team red, but they didn’t.
If anything is a gimmicky marketing stunt, it is the 12VHPWR connector itself, and I do not know why anybody would lend Nvidia their trust on this particular matter over what we have been shown by other brands, given that they are the only ones with melty connectors.
The only legitimate reason I can think of for this is if they were trying to obsolete something like 20-year-old underbuilt power supplies for safety reasons, but that massively backfires when your brand new product is shown to be a literal fire hazard and leaves an indelible bad first impression of the connector on your marketplace.
I don’t understand why anybody would legitimately trust Nvidia over anybody else on this nonsense. If anybody’s pulling a marketing stunt it’s them, as proven by the melted cables on their side that AMD never had to worry about. Even if I were willing to give Nvidia the benefit of the doubt regarding the potential needs of the 4080, claiming a 4070 needs three when the 7900 XTX needs two is nothin’ but puffin’ smoke.
Also, just so what I say isn’t misconstrued: I don’t normally buy graphics cards in so high a price bracket. I do see there are some high-end factory-overclocked cards with three connectors on them, and I don’t mean to suggest you should try running off just two if you have three. They might be running more current through thinner conductors or something.
You would need a case that's about 0.45 m³.
So the next generations are expected to be refurbished/upgraded 3000 & 6000 series: claiming to do 2x the stuff on paper, but doing only like 10% more in the real world.
I think we've hit the wall of what computers are expected to do in general, so new improvements feel unnecessary. But as always with the end of miniaturization, you have to go big to go small again. The next step in computing will be much larger PCs working slightly differently than current 0/1 gates: embedded full physics & biology chips and other complex formulas, object-oriented rendering, no more frames, no more skin on frames, etc...
It is nice to be able to enjoy games like Red Dead 2 in 4K/60+ with modern stuff, though. And with console folks mostly using 4K TVs now, 4K gaming has definitely come a long way. But there's definitely still much room for improvement, sure, especially as future games and game engines become more demanding.
I think 4K is brilliant even today. Native for older or lighter games and with DLSS for everything else.
DLSS is close to useless at 1080p and meh at 1440p, but it's like magic at 4K. At 4K all the options are viable, since you can upscale from 1080p or 1440p, whereas on a 1440p monitor the best you can do is DLSS Quality, which upscales from 960p.
4K can also use dynamic resolution, whereas it's useless on 1080p and 1440p monitors, which can't really handle non-native resolutions.
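For anyone wondering where those numbers come from, here's a tiny sketch using the commonly cited per-axis DLSS scale factors (roughly 2/3 for Quality, 0.58 for Balanced, 1/2 for Performance, 1/3 for Ultra Performance); the exact factors can vary by game and driver:

```python
# Approximate internal render resolution for the usual DLSS presets, using
# commonly cited per-axis scale factors. Exact factors can vary per game and
# driver, so treat these as rough figures.

PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, scale):
    """Approximate internal render resolution before upscaling."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in ((2560, 1440), (3840, 2160)):
    for name, scale in PRESETS.items():
        print(f"{out_w}x{out_h} {name}: {internal_resolution(out_w, out_h, scale)}")

# 1440p Quality -> roughly 1707 x 960 (the "960p" mentioned above)
# 4K Quality    -> 2560 x 1440, 4K Performance -> 1920 x 1080
```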
I don't seem to need it that much, but it's definitely a game changer when I do.
It can be worth it even on a top-end GPU if it can significantly improve FPS, response time and motion blur. And upscaling works best at 4K. Actually, I think the strongest GPUs benefit the most, as they can do 4K + DLSS Quality and reach 120-144 fps, while slower GPUs are stuck at lower resolutions where DLSS doesn’t look that good anyway.
I personally think that DLSS is the biggest advantage of going Nvidia over AMD, more important for me than ray tracing performance. At least for now.
Agreed. I'm using a 4070 Ti and it surprised me that I still felt the need for DLSS with some games. I'm already keeping an eye out for my next GPU, but it's definitely going to be Nvidia, just in case I feel like I still need DLSS.
I'm not sure if FSR works better on AMD cards than on Nvidia ones, but where both DLSS and FSR are available in a game, I always feel like DLSS looks and performs better.
It's most likely driver related, but who knows if the people who ran these benchmarks even updated their AMD drivers first.
I mean, look at the RTX 4000 series: it's no better than the RTX 3000, plus there's their naming mess. You have the 8 GB 4060 and 4060 Ti, and then the 4060 Ti 16 GB version.