De Hollandse Ezel (Banned) 18 Jun 2024 @ 3:43am
Will the 5000 series continue the insane TDP march?
Those who have read some of my posts before (on my main Steam account, most likely) will by now know my mantra.

Up to the 2xxx series of Nvidia cards, the following had always been true:

x80 Ti and x90/Titan used 230-250 W
x80 used 180-200 W
x70 used 120-130 W
x60 used 100 W
x50 used 80 W

This held true for over 20(!) years.

It was the 3xxx series that broke this trend with its insane 350 W card, and the 4xxx series made it much worse.

This high power draw came just as energy prices in Europe became insane, and it is a MAJOR reason people like me have not upgraded yet.
Unless the next Nvidia series brings those TDP levels back to normal, that won't change.

You see, I generally pay 4-5 times over for a video card, even at 250 W: only once for purchasing it, all the other times for using it.
If Nvidia effectively doubles the TDP for their entire lineup (and they effectively have), then they have doubled that cost, and that's on TOP of the doubling and tripling of the sticker prices they also did.
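To put a rough number on that (a minimal sketch; the hours, years and energy price below are my own assumptions, not measurements):

```python
# Rough sketch: lifetime electricity cost of a GPU vs. its purchase price.
# All figures are illustrative assumptions, not measurements.
price_eur = 700          # assumed purchase price of an x80 Ti class card
draw_w = 250             # assumed average draw while gaming, in watts
hours_per_day = 4        # assumed gaming hours per day
years = 6                # how long the card is kept
eur_per_kwh = 0.60       # assumed energy price incl. taxes

kwh_total = draw_w / 1000 * hours_per_day * 365 * years
energy_cost = kwh_total * eur_per_kwh
print(f"{kwh_total:.0f} kWh -> {energy_cost:.0f} EUR in electricity "
      f"vs {price_eur} EUR purchase price")
# With these assumptions: 2190 kWh -> ~1314 EUR, roughly 2x the purchase
# price over the card's life; doubling the TDP doubles that figure again.
```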

I personally don't mind the new power cable. Sure it melts, sure it needs a new custom power supply because adapter cables are finicky like mad, but they will surely work out those kinks eventually.
I'm far more worried about the reason they NEED it: it likely means the 5xxx series will draw even crazier amounts of power, instead of Nvidia hearing users like me demand: sell us a product like the 4090 but with a 250 W TDP or less.

Now, there are more historical truths that held for 20+ years.
In every Nvidia lineup the x70 was the base model, and if we wave away all the nonsensical Super and Ti models except the x80 Ti, which was always the best non-Titan card (in some lineups the top card is named x90 instead of Titan), the scaling looked roughly like this:

x50: 0.2x the fps, 0.4x the price, 0.6x the TDP
x60: 0.4x the fps, 0.6x the price, 0.8x the TDP
x70: 1x (the base model)
x80: 1.33x the fps, 1x the price, 1.5x the TDP
x80 Ti: 1.66x the fps, 2x the price, 2x the TDP
x90: 2x the fps, 3x the price, 2.5x the TDP
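Put into a quick script (just restating the ratios above, nothing measured), the implied value per tier would come out like this:

```python
# Value ratios implied by the list above, normalised so the x70 = 1.0
# in fps, price and TDP. The numbers are the claimed ratios, not benchmarks.
tiers = {
    #         fps   price  tdp
    "x50":   (0.20, 0.40, 0.60),
    "x60":   (0.40, 0.60, 0.80),
    "x70":   (1.00, 1.00, 1.00),
    "x80":   (1.33, 1.00, 1.50),
    "x80 Ti":(1.66, 2.00, 2.00),
    "x90":   (2.00, 3.00, 2.50),
}
for name, (fps, price, tdp) in tiers.items():
    print(f"{name:7s} fps/price = {fps / price:.2f}   fps/watt = {fps / tdp:.2f}")
```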

And also that with each next generation there was about a 40% gain in performance for ALL the cards, while the price and TDP stayed the same.

This was why you upgraded your PC every 3 years:
1.4 x 1.4 = 1.96, or roughly every 2 Nvidia generations performance doubled while the price and, more importantly, the TDP stayed the same.
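The same compounding written out, assuming that 40% per-generation figure holds:

```python
# Compounding an assumed 40% per-generation performance gain at constant TDP.
perf = 1.0
for gen in range(1, 5):
    perf *= 1.4
    print(f"after {gen} generation(s): {perf:.2f}x performance, same TDP and price")
# After 2 generations: 1.96x -> performance roughly doubles every 2 generations.
```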

But this has been broken.
Compare my current 2080 Ti to the 4xxx lineup:

A 4090 does indeed perform twice as fast as a 2080 Ti, but its insane power draw of 450 W vs the 250 W of my current card means the performance-per-watt gain is barely noticeable.

There was also this rule where an x70 of the current generation would perform about equal to the x80 Ti from 2 generations back, at half the TDP. Again broken:
yes, a 4070 does perform about the same as a 2080 Ti, but at 220 W vs 250 W it also has nearly the same power draw, so again barely any performance-per-watt gain.
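For what it's worth, here is that performance-per-watt math spelled out, using the rough figures above and normalising performance to the 2080 Ti purely as an assumption:

```python
# Rough performance-per-watt comparison using the figures quoted above.
# Relative performance is normalised to the 2080 Ti = 1.0 (an assumption).
cards = {
    "2080 Ti": (1.0, 250),   # (relative performance, TDP in watts)
    "4070":    (1.0, 220),   # "performs about the same as a 2080 Ti"
    "4090":    (2.0, 450),   # "twice a 2080 Ti"
}
base_perf, base_w = cards["2080 Ti"]
for name, (perf, watts) in cards.items():
    gain = (perf / watts) / (base_perf / base_w)
    print(f"{name}: {gain:.2f}x the perf/watt of a 2080 Ti")
# 4070 -> ~1.14x, 4090 -> ~1.11x: small per-watt gains compared with the
# ~2x per two generations that the older cadence would have delivered.
```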

Hence why, unlike in the past 20 years, I have not upgraded yet.

And with rumours that the flagship of the 5000 series might have a 600 W TDP, and all models using the new connector, things might only get worse.

And of course prices have been messed up too. Prices for over 20 years were:

x50 = 100 euro
x60 = 200 euro
x70 = 350 euro
x80 = 500 euro
x80 Ti = 700 euro
x90/Titan = 1100 euro

But this was already broken with the 2xxx series: I paid 1600 euro for my 2080 Ti, while my 980 Ti was still 700 euro, an insane price spike. And with the 4090 at 2000 euro, it only gets worse.
You now pay former high-end card prices just to own an x70.

But that's totally irrelevant next to the TDP creep,
for as already stated, MOST of the money you pay for a GPU is NOT upfront; it is in your utility bill.
And while it may not be an issue for kids living with their parents or students who rent dorms with utilities included, for any adult who pays their own bills it most definitely is.
Last edited by De Hollandse Ezel; 18 Jun 2024 @ 3:48am
De Hollandse Ezel (Banned) 18 Jun 2024 @ 3:50am
So... if this creep continues, and Nvidia keeps cranking up the wattages by 30% each generation while performance still only increases by 40% per generation,

it will be a LONG time before "twice the performance at the same TDP" ever becomes true again.
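A quick illustration of just how long, assuming those two rates were to hold:

```python
# If performance grows 40% per generation but power grows 30% as well,
# perf-per-watt improves only ~7.7% per generation (1.4 / 1.3).
# Sketch: how many generations until performance doubles at the SAME TDP?
ratio_per_gen = 1.4 / 1.3
gens, gain = 0, 1.0
while gain < 2.0:
    gens += 1
    gain *= ratio_per_gen
print(f"~{gens} generations until 2x performance per watt ({gain:.2f}x)")
# With these assumed rates it takes roughly 10 generations.
```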
Personally I won't be switching my 4090 for a 5090 because it won't be worth it for just 40% more fps.
In my opinion it's only worth upgrading for some big gains, not some mid-tier bump. I get your point on one part, but not the other, where you talk about the 4xxx 350 W part:
my 4090 does not really go above 250 W, depending on the game.
And maybe you should lock your fps?
Last edited by Kei; 18 Jun 2024 @ 3:54am
Andrius227 18 Jun 2024 @ 4:18am
I will buy an RTX 5080, and if it uses more power than my 4080, I will simply limit the power to ~90% or so. I did that when I had a 3090.
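On Nvidia cards that kind of cap can be set through the nvidia-smi power-limit option. A minimal sketch, where the stock wattage is just an assumed example figure:

```python
import subprocess

# Minimal sketch: cap the board power limit to ~90% of stock.
# Assumes an Nvidia GPU, nvidia-smi on the PATH and admin rights;
# the 320 W stock limit is an assumed example figure, not a spec.
STOCK_LIMIT_W = 320
target_w = int(STOCK_LIMIT_W * 0.90)

# "nvidia-smi -i <index> -pl <watts>" sets the software power limit for that GPU.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(target_w)], check=True)
```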
_I_ 18 Jun 2024 @ 8:17am
They are pretty much running into a performance-per-watt wall;
the only way past it is to increase power and cooling to get better performance from each core.
rawWwRrr 18 Jun 2024 @ 9:29am
Soon you'll need a dedicated PSU just for the card, along with a gas generator to offset the power usage.
Illusion of Progress 18 Jun 2024 @ 10:23am
There is no rule that past norms must remain the same.

It's absolutely fun to discuss the changes in trends for discussion sake on how the market is changing, sure. I do that myself.

But there is no "standard" for how things will advance, and as it is, it does seem like chips (CPUs and GPUs) have been pushing power draw up in order to achieve their gains lately.

What will happen for that next generation? Time will tell.
Originally posted by De Hollandse Ezel:
Those who have read some of my posts before (on my main Steam account, most likely) will by now know my mantra.
I'm familiar with them, and they often seem revisionist of what really happened.

Case in point....
Originally posted by De Hollandse Ezel:
x50: 0.2x the fps, 0.4x the price, 0.6x the TDP
x60: 0.4x the fps, 0.6x the price, 0.8x the TDP
x70: 1x (the base model)
x80: 1.33x the fps, 1x the price, 1.5x the TDP
x80 Ti: 1.66x the fps, 2x the price, 2x the TDP
x90: 2x the fps, 3x the price, 2.5x the TDP
These performance spreads were definitely not typical. This is way off.

In what world was the x60 typically less than half the performance of the x70, but over half the cost? That would suggest the x60 has a worse value than the x70 when the reality was the opposite; the x60 has almost always, if not always, been the best value in a generation.

Using TechPowerUp's relative performance and comparing the various tiers from within generations to see what trends emerge, it seems it was almost always more like this...

x80 is to be considered 100% of the performance of a given generation because it was traditionally the flagship and released first. So if there's a "baseline" for performance of a generation, this is often it.

x60 was typically somewhere around two-thirds (~66%) the performance of the generation baseline. For some examples, the GTX 1060 was 62% of the GTX 1080, and going back to both Kepler generations and Fermi, they were even closer. The GTX 660 and GTX 760 were 70%+ (!) of the respective x80 performance, and the GTX 560 Ti was almost three quarters of the GTX 580!

x50 is the biggest wild card as it varies a lot. Sometimes it's as low as half of the x60 (Kepler and Pascal), so it would therefore be around one-third (33%) of the performance of the generation baseline. Other times it's not so much slower than the x60, reaching up to 80% of the x60's performance (Maxwell generation).

Then the Ti models would typically come later as refreshes and be a bit faster than their base model.

So if the x80 isn't even typically double the x60, then what alternate reality are you living in where the x70 could be?

I'm also not sure why you listed the x90 for historical reference because other than the dual GPU models of the past, it was only introduced last generation. And since the RTX 3090 was little more than an RTX 3080 Ti with more VRAM, I'm not sure why you're putting such a performance spread between those tiers too. It was certainly not twice the performance of the x70 either.
De Hollandse Ezel (Banned) 18 Jun 2024 @ 10:43am
Originally posted by Illusion of Progress:
There is no rule that past norms must remain the same.

It's absolutely fun to discuss the changes in trends for discussion sake on how the market is changing, sure. I do that myself.

But there is no "standard" for how things will advance, and as it is, it does seem like chips (CPUs and GPUs) have been pushing power draw up in order to achieve their gains lately.

What will happen for that next generation? Time will tell.
Originally posted by De Hollandse Ezel:
Those who have read some of my posts before (on my main Steam account, most likely) will by now know my mantra.
I'm familiar with them, and they often seem revisionist of what really happened.

Case in point....
Originally posted by De Hollandse Ezel:
x50: 0.2x the fps, 0.4x the price, 0.6x the TDP
x60: 0.4x the fps, 0.6x the price, 0.8x the TDP
x70: 1x (the base model)
x80: 1.33x the fps, 1x the price, 1.5x the TDP
x80 Ti: 1.66x the fps, 2x the price, 2x the TDP
x90: 2x the fps, 3x the price, 2.5x the TDP
These performance spreads were definitely not typical. This is way off.

In what world was the x60 typically less than half the performance of the x70, but over half the cost? That would suggest the x60 has a worse value than the x70 when the reality was the opposite; the x60 has almost always, if not always, been the best value in a generation.

Using TechPowerUp's relative performance and comparing the various tiers from within generations to see what trends emerge, it seems it was almost always more like this...

x80 is to be considered 100% of the performance of a given generation because it was traditionally the flagship and released first. So if there's a "baseline" for performance of a generation, this is often it.

x60 was typically somewhere around two-thirds (~66%) the performance of the generation baseline. For some examples, the GTX 1060 was 62% of the GTX 1080, and going back to both Kepler generations and Fermi, they were even closer. The GTX 660 and GTX 760 were 70%+ (!) of the respective x80 performance, and the GTX 560 Ti was almost three quarters of the GTX 580!

x50 is the biggest wild card as it varies a lot. Sometimes it's as low as half of the x60 (Kepler and Pascal), so it would therefore be around one-third (33%) of the performance of the generation baseline. Other times it's not so much slower than the x60, reaching up to 80% of the x60's performance (Maxwell generation).

Then the Ti models would typically come later as refreshes and be a bit faster than their base model.

So if the x80 isn't even typically double the x60, then what alternate reality are you living in where the x70 could be?

I'm also not sure why you listed the x90 for historical reference because other than the dual GPU models of the past, it was only introduced last generation. And since the RTX 3090 was little more than an RTX 3080 Ti with more VRAM, I'm not sure why you're putting such a performance spread between those tiers too. It was certainly not twice the performance of the x70 either.

We can disagree, and well, I might be doing the x60 of the very old series a bit dirty, because I basically never owned anything below an x70, nor did any gamer around me. x60 cards were looked down on as "budget crap", x70s were the staple for everyone, and even the non-Ti x80 was often seen as "just high mid-range".
-> But we can look up the GFLOPS for each chip, and you'll see it matches my list pretty neatly.

And yeah, I left the x50 out; most generations did not have one, and when they did, it was generally a "for office use" card, i.e. for when you had a CPU without graphics but needed nothing fancy.

As for the x90, it did exist in the older generations, before Titans became a thing (not every generation had one, but some did), and I tend to see Titans as basically rebranded x90s, or the reverse, x90s as rebranded Titans: the card above the x80 Ti, which otherwise was always the top card.

Those Titans were usually much more expensive (back when 1000-1100 euro for a GPU seemed like crazy money), and thus more people stuck to the x80 Ti, which was the best of the normally numbered cards in each series.

Starting with the 3xxx series they stopped launching Titans but reintroduced the x90, so I see those as Titans.
Some series had 2 separate Titan launches, which would be what the x90 Ti is.

What I miss in the 4xxx series is an x80 Ti. Classically that's one of the cards a series launches with: it launches with the x70 and x80 Ti, then 6 months later an x80 appears, and 6 months after that an x60 and a Titan (or x90) are added.

But OK, the fact remains that ALL the best cards up to the 2xxx series, including the Titans, never exceeded 250 W (OK, one Titan had 270 W, but then I have to point at the many that used 230 W, so let's call it 250 W +/- 20 W for all the top models of both the regular lineup and the Titans).

Which still makes today's TDPs insane.
De Hollandse Ezel (Banned) 18 Jun 2024 @ 10:49am
Originally posted by rawWwRrr:
Soon you'll need a dedicated PSU just for the card, along with a gas generator to offset the power usage.

Given how expensive gas is, that's not an option either, but a bunch of solar panels might do.
-> Given how companies now ask you to PAY for any surplus energy you feed back...

Yeah, that's right.
Let's say on an average day
your solar panels produce 10 kWh over 8 hours of daylight; you're at work, so you only use 2 kWh of it,
and in the morning/evening, outside sun hours, you use 9 kWh.

In the past they looked at the total: 10 - 2 - 9 = -1, so you got charged for 1 kWh.
Now they do it differently:
each kWh you feed back COSTS you a little money, say 6 cents, so that's 8 x 6 = 48 cents.
Then in the evening/night you pay for what you use, so that's 9 kWh at about 60 cents a pop including taxes.

So since you cannot compensate high use at night with overproduction during the day, and you actually have to PAY to deliver energy back to the grid now, you'd better use it all yourself.

I.e. PC on during daylight, off at night.
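Putting that example day into a quick calculation (the same assumed figures as above, not real tariffs):

```python
# Sketch of the example day above: old net-metering vs. the new scheme.
# All figures are the assumed ones from the example, not real tariffs.
produced, used_day, used_night = 10.0, 2.0, 9.0   # kWh
feed_in_fee = 0.06                                # EUR paid per kWh fed back
price_kwh = 0.60                                  # EUR per kWh drawn, incl. taxes

# Old scheme: production and consumption netted over the whole day.
old_cost = max(used_day + used_night - produced, 0) * price_kwh

# New scheme: a fee on every kWh fed back, plus full price for night use.
fed_back = produced - used_day
new_cost = fed_back * feed_in_fee + used_night * price_kwh

print(f"old scheme: {old_cost:.2f} EUR/day, new scheme: {new_cost:.2f} EUR/day")
# Old: 1 kWh * 0.60 = 0.60 EUR; new: 8 * 0.06 + 9 * 0.60 = 5.88 EUR per day.
```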
De Hollandse Ezel (Banned) 18 Jun 2024 @ 10:51am
Originally posted by _I_:
They are pretty much running into a performance-per-watt wall;
the only way past it is to increase power and cooling to get better performance from each core.

That to me means no gains at all.
They should just pump a little more R&D into their graphics card division, now that they have raised prices like 2 or 3 times more than inflation alone would warrant.

Sure, they profit off AI, but we gamers deserve R&D too.
Just pumping more voltage into it and cooling it better is overclocking; we can do that ourselves.
What I need them to do is actually produce silicon with better performance per watt.
_I_ 18 Jun 2024 @ 10:55am
All GPU cores of the same gen are from the same design:

high ones go to the x90, and lower ones down to the x50 or mobile versions.
Mr White 18 Jun 2024 @ 11:15am
Yes, in a way. Think about it: as performance and features increase and are added each gen, you will soon need at least a 100w power supply.
De Hollandse Ezel (Banned) 18 Jun 2024 @ 11:26am
Originally posted by Little Moon:
Yes, in a way. Think about it: as performance and features increase and are added each gen, you will soon need at least a 100w power supply.
1000/10000w you meant.
A&A 18 Jun 2024 @ 11:59am
I don't see a problem with the xx60.
The RTX 4060 has the same performance as the RTX 3060 while its TDP is basically the same as the RTX 3050's. A 40% power efficiency improvement.

The only explanation I can see for the TDP increase at the high end is that they removed SLI support. It is interesting that the RTX 6000 Ada is better than the RTX 4090 while being a 300 W card. Obviously they have higher-binned chips, but isn't this difference just too big?

Will the march continue? It depends on AMD.
Last edited by A&A; 18 Jun 2024 @ 12:00pm
PopinFRESH 18 Jun 2024 @ 1:25pm
How many times are you going to post this same drivel? You have like 3 of this same thread in your post history.
PopinFRESH 18 Jun 2024 @ 2:16pm
Originally posted by De Hollandse Ezel:
Originally posted by _I_:
They are pretty much running into a performance-per-watt wall;
the only way past it is to increase power and cooling to get better performance from each core.

That to me means no gains at all.
They should just pump a little more R&D into their graphics card division, now that they have raised prices like 2 or 3 times more than inflation alone would warrant.

Sure, they profit off AI, but we gamers deserve R&D too.
Just pumping more voltage into it and cooling it better is overclocking; we can do that ourselves.
What I need them to do is actually produce silicon with better performance per watt.

Sounds super easy. Have you considered starting your own GPU manufacturing company? Surely you'll be able to produce a GPU with better performance per watt than these power-hungry Nvidia chips :steamfacepalm:

Originally posted by A&A:
I don't see a problem with the xx60.
The RTX 4060 has the same performance as the RTX 3060 while its TDP is basically the same as the RTX 3050's. A 40% power efficiency improvement.

The only explanation I can see for the TDP increase at the high end is that they removed SLI support. It is interesting that the RTX 6000 Ada is better than the RTX 4090 while being a 300 W card. Obviously they have higher-binned chips, but isn't this difference just too big?

Will the march continue? It depends on AMD.

I'd say it depends more on Intel at this point. AMD is getting squeezed and they are still way behind on RT performance/capability. If Intel delivers what Battlemage looks like it will, and they can finish cutting their teeth on driver support for solid day-one playability on modern games, then AMD is going to be in a really bad spot: likely driven to target the low-to-mid end and likely struggling to be price competitive with Intel in that range. At least until Intel starts gaining market share and decides to start pricing their GPUs at the price points they likely should be at.

Date Posted: 18 Jun 2024 @ 3:43am
Posts: 19