Guldo 20 Feb 2024 @ 14:25
The RTX 5000 series is going to be a flop, and here's why.
If the leaks are true, Nvidia is coming out with a 16-pin power connector, which is devastating news for anyone who prefers Nvidia over AMD, because it means these cards will be too big to fit in anyone's case, and overheating will also become an issue.

Now I can understand why AMD is making no high-end cards next gen, because why even bother.

This definitely has to do with the diminishing returns of shrinking the node: they aren't getting the same uplift they used to, so they have to make up the slack by making the cards bigger.
Originally posted by PopinFRESH:
You are absolutely clueless. Smaller process nodes don't just equate to "more power". Generally, smaller process nodes result in less loss / power waste for a number of reasons. GPU power changes are a design choice balancing power:performance:die size. Hence why the same die on a smaller process node (e.g. a die shrink) generally uses less power.

The actual graphics card for the 30 and 40 series is significantly smaller than earlier cards going back to the GTX 200 generation. My liquid cooled 3090 is tiny in comparison to my 2080, 1070, 970, 780 Ti etc. There are still a plethora of options across the product segments that don’t have large heatsinks on them.

"Is there a way to ditch the big heatsinks…" yeah, it's called a custom water-cooled loop.
Showing 196-210 of 234 comments
Blightor 1 Jul 2024 @ 20:16
Originally posted by PopinFRESH:
You are absolutely clueless. Smaller process nodes don't just equate to "more power". Generally, smaller process nodes result in less loss / power waste for a number of reasons. GPU power changes are a design choice balancing power:performance:die size. Hence why the same die on a smaller process node (e.g. a die shrink) generally uses less power.

The actual graphics card for the 30 and 40 series is significantly smaller than earlier cards going back to the GTX 200 generation. My liquid cooled 3090 is tiny in comparison to my 2080, 1070, 970, 780 Ti etc. There are still a plethora of options across the product segments that don’t have large heatsinks on them.

"Is there a way to ditch the big heatsinks…" yeah, it's called a custom water-cooled loop.
It's a great post, but it's probably worth explaining why it's more efficient / more powerful.

It just comes down to pushing electrons. Let's say you have a length of string with 5 transistors (the actual things that do stuff) along it; it will cost a certain amount of power to push electrons across the length of that string. If you can suddenly fit the same number of transistors into half that length of string, it's going to cost you half the amount of power.

And as to your point about die size: if you keep the same die size (length of string) but are now packing in 10 transistors instead of 5, it's essentially using the same power to push the electrons, but giving you 'twice' the performance (twice the number of things that do stuff).
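The usual textbook way to put numbers on that intuition is the CMOS dynamic power relation P ≈ N·C·V²·f: a shrink lowers capacitance per transistor and usually voltage. A quick back-of-envelope sketch in Python (all figures here are illustrative assumptions, not real node data):

```python
# Back-of-envelope CMOS dynamic power: P ~ N * C * V^2 * f.
# The capacitance/voltage numbers below are made up for illustration.

def dynamic_power(n_transistors, cap_per_transistor_f, voltage_v, freq_hz):
    """Approximate total switching power in watts."""
    return n_transistors * cap_per_transistor_f * voltage_v**2 * freq_hz

# A hypothetical die on an older node:
old = dynamic_power(10e9, 1.0e-17, 1.0, 2e9)
# The same die after a shrink: lower effective capacitance and voltage.
new = dynamic_power(10e9, 0.7e-17, 0.9, 2e9)

print(f"{old:.0f} W -> {new:.0f} W ({new / old:.0%} of original)")
```

The same formula also shows the design choice PopinFRESH mentions: instead of pocketing the savings, a vendor can spend them on more transistors or higher clocks at the same power budget.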
suzu 1 Jul 2024 @ 23:52
123
Tonepoet 2 Jul 2024 @ 0:21
Originally posted by _I_:
Originally posted by Tonepoet:

Eh? This looks like 2x8 to me.[www.pcinvasion.com]
that's the new problematic connector
other GPUs that need around 600 W can use 3x PCIe 6+2 connectors

It's not just the new connector. It's an adapter, likely for a modular power supply. Look at the other side: it's 8x2 to 12VHPWR.

If 8x2 can't carry the same amount of current as 12VHPWR, how can an 8x2-to-12VHPWR adapter exist? One end would bottleneck the other, wouldn't it? That was my line of reasoning.
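For anyone following the arithmetic, the published ratings can be tallied directly. A quick sketch (the wattages are the spec figures commonly cited for PCIe power delivery; the helper function is just for illustration):

```python
# Published PCIe power ratings (watts). These are spec numbers,
# not physical limits; real connectors carry safety margin beyond them.
RATINGS = {
    "pcie_slot": 75,   # power delivered through the slot itself
    "6pin": 75,
    "8pin": 150,
    "12vhpwr": 600,    # maximum sense-pin configuration
}

def board_power(*connectors):
    """Total rated power: slot plus each listed cable connector."""
    return RATINGS["pcie_slot"] + sum(RATINGS[c] for c in connectors)

print(board_power("8pin", "8pin"))          # 2x8-pin card: 375 W rated
print(board_power("8pin", "8pin", "8pin"))  # 3x8-pin card: 525 W rated
print(board_power("12vhpwr"))               # 12VHPWR card: up to 675 W
```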

Originally posted by PopinFRESH:
Originally posted by Tonepoet:
...Eh? This looks like 2x8 to me

"Image credit: Corsair"

https://www.nvidia.com/en-us/geforce/graphics-cards/compare/

Originally posted by NVIDIA:
Required Power Connectors
[…]

I'll take the manufacturer of the thing's word over Corsair's marketing department making stupid pictures...

It's not just an isolated marketing stunt. It's an actual product they sell, and others sell it too: Corsair[www.corsair.com], Superflower[www.super-flower.com.tw], Be Quiet[www.bequiet.com] and Seasonic[seasonic.com] all make 8x2-to-12VHPWR 600 watt adapters for their high-end modular power supplies. 600 watts is the maximum that connector is presently rated to carry[videocardz.com]. If those cables weren't capable of carrying 600 watts, they would be rated for 450, 300 or 150 watts instead.

The power supply companies don't have much at stake in team red vs. team green; as such, they're as close to a neutral party as we're going to get. Where they do have skin in the game is in persuading you that you need to buy a new power supply with a new connector, so if anything, you'd expect them to make the old standard connectors look as unappealing as possible compared to the 12VHPWR connector.

Also, according to AMD, the reference-spec 7900 XTX is a 355 watt card, and that uses the standard 2x8-pin[www.techpowerup.com]. According to Nvidia, the 4080 and 4080 Super are only rated as 320 watt cards, and yet Nvidia is telling us we need a new connector with thinner wires and the same number of pins, and somehow that's going to carry more power than the 8x2 solution did. I'm usually not looking to buy these high-end cards, so I didn't know there were some with 3 connectors, but I very much doubt this is necessary from an engineering perspective.
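To put rough numbers on that doubt: dividing each connector's rated wattage across its live 12 V pins shows how much harder each 12VHPWR contact is driven (pin counts per the connector pinouts; the per-pin figures are derived for illustration, not measured):

```python
# Rough amps per 12 V pin implied by each connector's rated wattage.
# 8-pin PCIe carries 12 V on 3 pins; 12VHPWR carries it on 6.

def amps_per_pin(rated_watts, live_pins, volts=12.0):
    return rated_watts / volts / live_pins

print(f"8-pin:   {amps_per_pin(150, 3):.2f} A per pin")   # ~4.17 A
print(f"12VHPWR: {amps_per_pin(600, 6):.2f} A per pin")   # ~8.33 A
```

At its full rating, each 12VHPWR pin carries roughly twice the current of an 8-pin run at spec, which leaves far less headroom for a poorly seated contact.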

I'm also kind of doubting it's a space thing. The fact of the matter is that a 7900 XTX is smaller than a 4080[www.dexerto.com] while being a comparable, if marginally weaker, card[www.digitaltrends.com], so if anybody needed to cram more power into less space, it's team red, but they didn't.

If anything is a gimmicky marketing stunt, it's the 12VHPWR connector itself, and I don't know why anybody would lend Nvidia their trust on this particular matter over what we've been shown by other brands, given that they're the only ones with melty connectors.

The only legitimate reason I can think of is if they were trying to obsolete something like 20-year-old underbuilt power supplies for safety reasons, but that massively backfires when the brand new product is shown to be a literal fire hazard and leaves an indelible bad first impression of the connector on your marketplace.

I don't understand why anybody would legitimately trust Nvidia over anybody else on this nonsense. If anybody's pulling a marketing stunt it's them, as shown by the melted cables on their side that AMD never had to worry about. Even if I were willing to give Nvidia the benefit of the doubt regarding the potential needs of the 4080, saying a 4070 needs 3 when the 7900 XTX needs two is nothin' but puffin' smoke.

Also, just so what I say isn't misconstrued: I don't normally buy graphics cards in so high a price bracket. I do see there are some high-end stock-overclocked cards with three connectors on them, and I don't mean to suggest you should try running off just two if you have three. They might be running more current through thinner conductors or something.
Last edited by Tonepoet; 2 Jul 2024 @ 2:10
skOsH♥ 2 Jul 2024 @ 16:39
What if GPUs get so big that they become larger than the mobo?

You would need a case that's about 0.45 m³.
Last edited by skOsH♥; 2 Jul 2024 @ 16:40
Iron Knights 2 Jul 2024 @ 16:49
All tech stalls, and we've hit a big wall now with the breach of 4K by the 4000 & 7000 series cards from both makers. With 10-32 GB of VRAM there is no simulator they cannot handle in high res.
So expect the next generations to be refurbished/upgraded 3000 & 6000 series: claiming to do 2x the stuff on paper, but doing only like 10% of it in the real world.
I think we've hit the wall of what computers are expected to do in general, making new improvements unnecessary. But as always with the end of miniaturization, you have to go big to go small again. The next step in computing will be much larger PCs working slightly differently than current 0-1 gates: embedded full physics & biology chips and other complex formulas, object-oriented rendering, no more frames, no more skin on frames, etc.
Skkooomer Lord 2 Jul 2024 @ 17:27
Originally posted by Iron Knights:
All tech stalls, and we've hit a big wall now with the breach of 4K by the 4000 & 7000 series cards from both makers. With 10-32 GB of VRAM there is no simulator they cannot handle in high res.
So expect the next generations to be refurbished/upgraded 3000 & 6000 series: claiming to do 2x the stuff on paper, but doing only like 10% of it in the real world.
I think we've hit the wall of what computers are expected to do in general, making new improvements unnecessary. But as always with the end of miniaturization, you have to go big to go small again. The next step in computing will be much larger PCs working slightly differently than current 0-1 gates: embedded full physics & biology chips and other complex formulas, object-oriented rendering, no more frames, no more skin on frames, etc.
YAHHHH, we have enough for 9K right now.
Bad 💀 Motha 3 Jul 2024 @ 3:25
People act like 4K is new. I was playing some games at 4K as far back as 2016. Sure, the FPS wasn't 60+, but for some games it was fine.

It is nice to be able to enjoy games like Red Dead 2 in 4K/60+ with modern stuff, though. And with console folks mostly using 4K TVs now, 4K gaming has definitely come a long way. But there's definitely much room for improvement, especially as future games and game engines become more demanding.
Last edited by Bad 💀 Motha; 3 Jul 2024 @ 3:25
C1REX 3 Jul 2024 @ 3:34
Originally posted by Bad 💀 Motha:
It is nice to be able to enjoy games like Red Dead 2 in 4K/60+ with modern stuff, though. And with console folks mostly using 4K TVs now, 4K gaming has definitely come a long way. But there's definitely much room for improvement, especially as future games and game engines become more demanding.



I think 4K is brilliant even today: native for older or lighter games, and with DLSS for everything else.
DLSS is close to useless at 1080p and meh at 1440p, but it's like magic at 4K. At 4K all options are viable, since you can upscale from 1080p or 1440p, whereas on a 1440p monitor the best you can do is DLSS Quality, which upscales from 960p.
4K can also use dynamic resolution, which is useless on 1080p and 1440p monitors since they can't really handle non-native resolutions.
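Those resolution figures follow from the per-axis scale factors commonly cited for the DLSS modes. A small sketch (treat the mode table as an assumption; exact ratios can vary per game):

```python
# DLSS renders internally at a fraction of the output resolution per axis,
# then upscales. Commonly cited per-axis ratios for the standard modes.
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))  # 4K Quality: (2560, 1440)
print(internal_res(2560, 1440, "quality"))  # 1440p Quality: (1707, 960)
```

This is why 4K has more viable options: even Performance mode at 4K starts from a full 1080p image, while 1440p Quality is already down at 960p.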
metamec 3 Jul 2024 @ 3:58
I'm surprised by the pushback DLSS sometimes gets. Yeah, often it *should* be unnecessary because games could be better optimised, but in the meantime it makes a world of difference. I'd sooner play with DLSS enabled (if needed) on a large 4K screen than natively on a lower-res screen.

I don't seem to need it that much, but it's definitely a game changer when I do.
C1REX 3 Jul 2024 @ 4:50
Originally posted by metamec:
I'm surprised by the pushback DLSS sometimes gets. Yeah, often it *should* be unnecessary because games could be better optimised, but in the meantime it makes a world of difference. I'd sooner play with DLSS enabled (if needed) on a large 4K screen than natively on a lower-res screen.

I don't seem to need it that much, but it's definitely a game changer when I do.

It can be worth it even on a top-end GPU if it can significantly improve FPS, response time and motion blur. And upscaling works best at 4K. Actually, I think the strongest GPUs benefit the most, as they can do 4K + DLSS Quality and reach 120-144 fps, while slower GPUs are stuck at lower resolutions where DLSS doesn't look that good anyway.
I personally think DLSS is the biggest advantage of going Nvidia over AMD, more important for me than ray tracing performance. At least for now.
metamec 3 Jul 2024 @ 5:11
Originally posted by C1REX:

It can be worth it even on a top-end GPU if it can significantly improve FPS, response time and motion blur. And upscaling works best at 4K. Actually, I think the strongest GPUs benefit the most, as they can do 4K + DLSS Quality and reach 120-144 fps, while slower GPUs are stuck at lower resolutions where DLSS doesn't look that good anyway.
I personally think DLSS is the biggest advantage of going Nvidia over AMD, more important for me than ray tracing performance. At least for now.

Agreed. I'm using a 4070 Ti, and it surprised me that I still felt the need for DLSS in some games. I'm already keeping an eye out for my next GPU, but it's definitely going to be Nvidia, just in case I feel like I still need DLSS.

I'm not sure if FSR works better on AMD cards than on Nvidia, but where both DLSS and FSR are available in a game, DLSS always seems to look and perform better to me.
Last edited by metamec; 3 Jul 2024 @ 5:12
Guldo 3 Jul 2024 @ 9:35
Originally posted by metamec:
Originally posted by C1REX:

It can be worth it even on a top-end GPU if it can significantly improve FPS, response time and motion blur. And upscaling works best at 4K. Actually, I think the strongest GPUs benefit the most, as they can do 4K + DLSS Quality and reach 120-144 fps, while slower GPUs are stuck at lower resolutions where DLSS doesn't look that good anyway.
I personally think DLSS is the biggest advantage of going Nvidia over AMD, more important for me than ray tracing performance. At least for now.

Agreed. I'm using a 4070 Ti, and it surprised me that I still felt the need for DLSS in some games. I'm already keeping an eye out for my next GPU, but it's definitely going to be Nvidia, just in case I feel like I still need DLSS.

I'm not sure if FSR works better on AMD cards than on Nvidia, but where both DLSS and FSR are available in a game, DLSS always seems to look and perform better to me.
I have a 7800 XT, and from the benchmarks I've looked at it's hit and miss: sometimes the 7800 XT is faster than a 4070 Ti by like 10 frames, and sometimes it's slower by a good bit.

It's most likely driver related, but who knows if the people who ran these benchmarks even updated their AMD drivers first.
Mr White 3 Jul 2024 @ 9:43
I would like to see the sales figures for high-end cards from both firms; I suspect most sales come from the mid-range. As for the RTX 5000 series being a flop: they'll only be a flop if they're no better than the current GPUs.

I mean, look at the RTX 4000 series: they're no better than the RTX 3000, plus there's the naming mess. You have the 8 GB 4060 and 4060 Ti, and then the 4060 Ti 16 GB version.
chosendarksider 11 Aug 2024 @ 6:10
He is a 4090 owner!!! 💥🤣🤣 Don't worry, I will understand.
C1REX 11 Aug 2024 @ 9:38
Originally posted by Little Moon:
I would like to see the sales figures of High End cards from both firms. I suspect most sales come from mid-range.
Sales numbers are likely worse than for the 4070, but those numbers matter less than profit. Shareholders also love high-margin products.

Date posted: 20 Feb 2024 @ 14:25
Posts: 234