Guldo Feb 20, 2024 @ 2:25pm
The RTX 5000 series is going to be a flop, and here's why.
If the leaks are true, NVIDIA is coming out with a 16-pin power connector, which is devastating news for anyone who prefers NVIDIA over AMD, because it means these cards are going to be too big to fit in anyone's case, and overheating will also become an issue.

Now I can understand why AMD is making no high-end cards next gen, because why even bother.

This definitely has to do with diminishing returns from shrinking the node: they aren't getting the same uplift they used to, so they have to make up the slack by making the cards bigger.
Originally posted by PopinFRESH:
You are absolutely clueless. Smaller process nodes don't just equate to "more power". Generally, smaller process nodes result in less loss / power waste for a number of reasons. GPU power changes are a design choice balancing power, performance, and die size. Hence why the same die on a smaller process node (e.g. a die shrink) generally uses less power.

The actual graphics card for the 30 and 40 series is significantly smaller than earlier cards going back to the GTX 200 generation. My liquid cooled 3090 is tiny in comparison to my 2080, 1070, 970, 780 Ti etc. There are still a plethora of options across the product segments that don’t have large heatsinks on them.

“Is there a way to ditch the big heatsinks…” yeah, it is called a custom water cooled loop.
Showing comments 76–90 of 234
potato Feb 22, 2024 @ 5:22pm
i'm waiting for the rtxxx 6990
Originally posted by kitty:
Originally posted by waffleciocc:
RTX 4060 is a low end card. Uses AD107 silicon. What you wrote doesn't really disprove anything.
Perhaps you might want to re-read the words that I quoted there.
It uses a small die. It does not NEED a big cooler. Perhaps you might want to re-read the words that I quoted there.
Originally posted by kitty:
Originally posted by waffleciocc:
It uses a small die. It does not NEED a big cooler. Perhaps you might want to re-read the words that I quoted there.
I did read your comments. Even where you wrote your words, you still aren't speaking of any specific video card model. Kenshiro isn't either. And neither is the OP. So far almost everyone has just generalized, stating "video cards are getting bigger". No one said which video cards are getting bigger. If you want to nitpick at someone, then pick a model, mention it, and discuss it. Otherwise people are free to pick anything they want.
Hahahahahhaahahahhaa

It's clearly obvious which one I was referring to. Apparently not obvious to you.
Guldo Feb 22, 2024 @ 6:41pm
Originally posted by waffleciocc:
Originally posted by kitty:
I did read your comments. Even where you wrote your words, you still aren't speaking of any specific video card model. Kenshiro isn't either. And neither is the OP. So far almost everyone has just generalized, stating "video cards are getting bigger". No one said which video cards are getting bigger. If you want to nitpick at someone, then pick a model, mention it, and discuss it. Otherwise people are free to pick anything they want.
Hahahahahhaahahahhaa

It's clearly obvious which one I was referring to. Apparently not obvious to you.
They totally ignore the fact that flagships are getting bigger every generation. I think maybe it's because the die on the flagship is getting bigger every generation, since the die on the 4090 is huge.

OK, I looked it up and I'm wrong, because the die is slightly smaller than the 3090's. So if it's not an increase in die size making cards hotter and needing bigger heatsinks, is it clock speed?
Last edited by Guldo; Feb 22, 2024 @ 6:43pm
Originally posted by Kenshiro:
Originally posted by waffleciocc:
Hahahahahhaahahahhaa

It's clearly obvious which one I was referring to. Apparently not obvious to you.
They totally ignore the fact that flagships are getting bigger every generation. I think maybe it's because the die on the flagship is getting bigger every generation, since the die on the 4090 is huge.
Not quite true.

780 Ti - 561 mm²
980 Ti - 601 mm²
1080 Ti - 471 mm²
2080 Ti - 754 mm²
3080 Ti [gaming flagship] / 3090 Ti [productivity/gaming] - 628 mm²
4090 - 609 mm²

Quite the opposite actually.
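For what it's worth, Kenshiro's list can be sanity-checked in a few lines. This is a throwaway Python sketch using only the die sizes quoted above; it just confirms the sequence is not growing generation over generation:

```python
# Flagship die sizes (mm^2), taken from the list in the post above.
die_sizes = {
    "780 Ti": 561,
    "980 Ti": 601,
    "1080 Ti": 471,
    "2080 Ti": 754,
    "3090 Ti": 628,
    "4090": 609,
}

sizes = list(die_sizes.values())
# Is the sequence strictly increasing from generation to generation?
monotonic = all(a < b for a, b in zip(sizes, sizes[1:]))
print(monotonic)  # False: flagship die size is not growing every generation

largest = max(die_sizes, key=die_sizes.get)
print(largest)  # 2080 Ti had the biggest die of the bunch
```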
Guldo Feb 22, 2024 @ 6:50pm
Originally posted by waffleciocc:
Originally posted by Kenshiro:
They totally ignore the fact that flagships are getting bigger every generation. I think maybe it's because the die on the flagship is getting bigger every generation, since the die on the 4090 is huge.
Not quite true.

780 Ti - 561 mm²
980 Ti - 601 mm²
1080 Ti - 471 mm²
2080 Ti - 754 mm²
3080 Ti [gaming flagship] / 3090 Ti [productivity/gaming] - 628 mm²
4090 - 609 mm²

Quite the opposite actually.
Yeah, I realized my mistake, so I edited my comment.
Originally posted by Kenshiro:
OK, I looked it up and I'm wrong, because the die is slightly smaller than the 3090's.

So if it's not an increase in die size making the cards hotter, and needing bigger heatsinks, is it clock speed?
They can't shrink the silicon enough, like in the past, to make it so you can have an efficient die that's significantly faster.

The 4090 draws 450W+. You can manually reduce the clock and make it draw 300W, but initially Ada was specced to go to 600W.
Last edited by spicy little monkey; Feb 22, 2024 @ 7:03pm
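On real hardware, that cap can be set without touching clocks at all via the driver's power limit (e.g. `nvidia-smi -pl 300` on cards that allow it). As a rough sketch of what a 450W-to-300W cap saves on the power bill, the hours-per-day and EUR/kWh figures below are assumptions, not measurements:

```python
# Rough yearly energy-cost comparison: stock 450W vs a 300W power cap.
# 4 h/day of full load and 0.40 EUR/kWh are assumed values; plug in your own.
hours_per_day = 4
price_per_kwh = 0.40  # EUR

def yearly_cost(watts: float) -> float:
    """Electricity cost in EUR for one year at a constant draw."""
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * price_per_kwh

saving = yearly_cost(450) - yearly_cost(300)
print(f"{saving:.0f} EUR/year saved")  # ~88 EUR/year under these assumptions
```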
Guldo Feb 22, 2024 @ 6:58pm
Originally posted by waffleciocc:
Originally posted by Kenshiro:
OK, I looked it up and I'm wrong, because the die is slightly smaller than the 3090's.

So if it's not an increase in die size making the cards hotter, and needing bigger heatsinks, is it clock speed?
They can't shrink the silicon enough, like in the past, to make it so you can have an efficient die that's significantly faster.

The 4090 draws 450W+. You can manually reduce the clock and make it draw 300W, but initially Ada was specced to go to 600W.
So how does it get hotter if it's supposed to be more efficient? Was that other guy feeding me a bunch of baloney?
Originally posted by Kenshiro:
So how does it get hotter if it's supposed to be more efficient?
It's drawing 450-500 watts.

It is more efficient. That doesn't mean you can't push the hardware to make it inefficient.

An RTX 4080 typically draws 250-300W. You can make it draw 400W if you want it to be 5-10% faster.
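The point above is really about performance-per-watt collapsing as you raise the power limit. A toy calculation with the numbers from that post (300W baseline, +100W for roughly +7% performance, taking the midpoint of the quoted "5-10%" range; purely illustrative):

```python
# Illustrative perf-per-watt at stock vs pushed power limits for a
# hypothetical RTX 4080. The +7% gain figure is an assumption drawn
# from the "5-10% faster" range quoted in the post above.
stock_watts, pushed_watts = 300, 400
stock_perf = 100.0                 # arbitrary baseline score
pushed_perf = stock_perf * 1.07    # ~7% faster for +100W

stock_eff = stock_perf / stock_watts     # ~0.333 perf/W
pushed_eff = pushed_perf / pushed_watts  # ~0.268 perf/W

# A 7% performance gain costs ~20% of the card's efficiency.
print(f"efficiency drop: {1 - pushed_eff / stock_eff:.0%}")
```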
Guldo Feb 22, 2024 @ 7:04pm
Originally posted by waffleciocc:
Originally posted by Kenshiro:
So how does it get hotter if it's supposed to be more efficient?
It's drawing 450-500 watts.

It is more efficient. That doesn't mean you can't push the hardware to make it inefficient.

An RTX 4080 typically draws 250-300W. You can make it draw 400W if you want it to be 5-10% faster.
It's just that PopinFRESH said the 4090 only draws 450 watts and was implying it doesn't get hotter than the 3090.
Originally posted by Kenshiro:
Originally posted by waffleciocc:
It's drawing 450-500 watts.

It is more efficient. That doesn't mean you can't push the hardware to make it inefficient.

An RTX 4080 typically draws 250-300W. You can make it draw 400W if you want it to be 5-10% faster.
It's just that PopinFRESH said the 4090 only draws 450 watts and was implying it doesn't get hotter than the 3090.
The 3090's GDDR6X definitely runs hotter than the 4090's.

The 3090 was typically not cooled properly.
Guldo Feb 22, 2024 @ 7:09pm
Originally posted by waffleciocc:
Originally posted by Kenshiro:
It's just that PopinFRESH said the 4090 only draws 450 watts and was implying it doesn't get hotter than the 3090.
The 3090's GDDR6X definitely runs hotter than the 4090's.

The 3090 was typically not cooled properly.
To be honest, I don't really know why bigger heatsinks are needed every generation. I was just assuming they drew more power, lol.
Heat isn't just about power consumption. Ryzen can run much hotter at 50W than a much older chip because of the manufacturing process: the transistors are more densely packed on the die, so heat density is higher and the hotspot is just inherently hotter.

It's basically the same principle for GPU dies.
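That density point can be made concrete with a rough power-density figure, using the die sizes quoted earlier in the thread. Note that board power includes VRAM and VRMs, not just the die, so these are illustrative upper bounds, not measurements:

```python
# Rough board-power density (W per mm^2 of die) for two flagships.
# Board power is not die power, so treat these as trend illustrations only.
cards = {
    "780 Ti": (250, 561),  # (board watts, die area in mm^2)
    "4090":   (450, 609),
}

density = {name: watts / mm2 for name, (watts, mm2) in cards.items()}
for name, d in density.items():
    print(f"{name}: {d:.2f} W/mm^2")
# The 4090 packs roughly two-thirds more power into each mm^2 of die,
# which is why it needs a much beefier heatsink despite a similar die size.
```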
Last edited by r.linder; Feb 22, 2024 @ 8:15pm
Originally posted by Karumati:
Originally posted by Dutchgamer1982:
What we need is no gain in performance but a massive drop in power use.
And a correction in price would be nice too.

A 5090 consuming 250W, costing 1400 euro including 21% VAT, and performing 10% above a 4090.
A 5080 Ti consuming 250W, costing 900 euro including 21% VAT, and performing about the same as a 4090.
A 5080 consuming 200W, costing 650 euro including 21% VAT, and performing about the same as a 4080 Super.
A 5070 consuming 180W, costing 450 euro including 21% VAT, and performing about the same as a 4080.
A 5060 consuming 120W, costing 250 euro including 21% VAT, and performing about the same as a 4070.
A 5050 consuming 80W, costing 190 euro including 21% VAT, and performing about the same as a 4060.


THAT'S the lineup I want to see.

Those wattages are closer to what cards from high end to low end used to use for 15 years,
and those prices are again the same as Titans, x80 Ti, x80, etc. used to cost for over 15 years, with inflation correction added.

So that once more...
the low-end gamers could get that 5060, the mid-range ones that 5070 or 5080, the high end that 5080 Ti or 5090,
and there would be a nice card that could be passively cooled as a successor to the 1650 Ti.
10% above a 4090 is nothing. We need 30% more at least.

450W is idiotic. Top models always used 250W.
And prices, even when corrected for inflation, are insane.

As stated, there used to be a normal price point for the x60, x70, x80, x80 Ti, and x80/Titan,
and the best of those cards always used 220-250W... only the last 2 gens suddenly went insane with 450W.

Likewise, prices used to be more like:
150 euro x50
200 euro x60
350 euro x70
500 euro x80
700 euro x80 Ti
1100 euro Titan/x90

When I add the inflation since prices were last like that, I get the numbers above.

What good is a card that costs 3000 euro and uses 7000 euro of electricity in 2 years?

A smaller performance gain, but with prices back to what they historically should be (corrected for inflation) and with a power draw to match, will give users WAY more for their money.
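The "add the inflation" step above can be sketched like this. The ~30% cumulative-inflation factor is an assumption (the post doesn't state which years or rate it used); the base prices are the historical ones listed in the post. The results land close to the wishlist lineup:

```python
# Inflation-adjusting the historical price points from the post.
# Base prices are the pre-RTX-era EUR figures listed above; the
# cumulative inflation factor of ~30% is an assumed value.
base_prices = {
    "x50": 150,
    "x60": 200,
    "x70": 350,
    "x80": 500,
    "x80 Ti": 700,
    "Titan/x90": 1100,
}
inflation_factor = 1.30  # assumed cumulative inflation

adjusted = {tier: round(p * inflation_factor) for tier, p in base_prices.items()}
for tier, price in adjusted.items():
    print(f"{tier}: ~{price} EUR")
```

Under that assumption the x80 comes out at 650 euro and the x80 Ti at 910 euro, which roughly matches the 650/900 euro figures in the proposed lineup.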
Originally posted by waffleciocc:
Originally posted by Kenshiro:
So how does it get hotter if it's supposed to be more efficient?
It's drawing 450-500 watts.

It is more efficient. That doesn't mean you can't push the hardware to make it inefficient.

An RTX 4080 typically draws 250-300W. You can make it draw 400W if you want it to be 5-10% faster.

And that's insane...

Look, right up to the 2000 series, every top GPU used about 250W... since like 2006...

And yet every 2 generations you got double the performance, at the SAME wattage.

Yanking wattage up to 450W is no gain.

So a smaller performance gain, or even a decrease, paired with a price drop back to what it always was up to the 1000 series, and a wattage as it was up to the 2000 series for over 15 years, is what we need.

What good is a 4000 euro 5090 that uses 600W, or something insane like that...
Last edited by Dutchgamer1982; Feb 22, 2024 @ 9:41pm

Posted on Feb 20, 2024 @ 2:25pm
Messages: 234