Alxndr Feb 6, 2025 @ 4:18pm
Will high end GPUs be sold out forever?
The 4090 was released 2 years ago and it's still impossible to get one without buying second hand or from scalpers at a 400% markup. Forget the 5080/90; there's absolutely no chance of ever owning one of those.

Even the 4080s are still sold out everywhere... I just don't understand it. When Apple releases a new iPhone, they make tens of millions of them so everyone can easily get one, and they make so much money from it.

But Nvidia only makes a few hundred thousand 50 series cards. Why? What's the benefit? Why not just make 10 million of them and make billions more in profit?

To me it's simple maths.

250,000 GPUs @ $2,000 each = $500 million.
10 million GPUs @ $2,000 each = $20 billion.
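
A quick sanity check of that maths in Python (the unit counts and the $2,000 price are my guesses, not official Nvidia figures):

# Sanity check of the revenue maths above. Unit counts and price
# are assumptions from this post, not actual Nvidia production figures.
price = 2_000  # USD per card (assumed)
for units in (250_000, 10_000_000):
    print(f"{units:>10,} GPUs x ${price:,} = ${units * price:,}")
#    250,000 GPUs x $2,000 = $500,000,000     ($500 million)
# 10,000,000 GPUs x $2,000 = $20,000,000,000  ($20 billion)

(That's revenue, of course, not profit.)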

Maybe Nvidia just isn't capable of producing that many GPUs, I dunno, but they sure as hell have no problem making millions of data centre GPUs.

I guess in about 6 years, when the 50 series has been discontinued and is no longer receiving hardware support or valid warranty coverage, I could probably scoop one up from a company clearing out the last of its stock. But by then what's the point, when all the games will require a newer 80 series card, which again Nvidia will only make a few hundred thousand of... and the cycle repeats.

I wouldn't mind if I had to wait a few weeks, or even a couple of months, for production to pick up again, but we're literally talking years, like 2+ years of no 5080s or 5090s in circulation. I've signed up with every major retailer in the GPU market to be notified when they come back in stock, but I suspect that email will never come... Or if it does, they'll be bought by bots before I can even click the link in my inbox.

What a strange world we live in.
Showing 76-86 of 86 comments
Dutchgamer1982 Feb 11, 2025 @ 6:07am
Originally posted by C1REX:
Originally posted by _I_:
GN has fps/watt comparisons for the GPUs

https://gamersnexus.net/gpus/nvidia-geforce-rtx-5090-founders-edition-review-benchmarks-gaming-thermals-power#5090-efficiency-benchmarks

depending on the task, the 4060-4090 are more efficient in fps/watt than the 5090,
but the 5090 is higher performing

Fair enough, but that's just one way of looking at it: max fps at max power draw.

I would like to see a similar comparison at the exact same performance (60fps, for example): how much power does each GPU need to get there?
The same way we compare cars by MPG (miles per gallon), not by top speed at maximum fuel burn.

For example:
4090 using 80W at 4K 120fps in less demanding games.
https://youtu.be/amwCqgfZHrs?si=GycnI3sPyLZ5HzpH

How much would a 2080Ti use? Or a 3090Ti? Or a 7900XTX?

DLSS on.. fake frames.. that invalidates the score, as I would always turn it off.

So: DLSS off, upscaling off, RTX off.
Scores only..
C1REX Feb 11, 2025 @ 6:45am
Originally posted by Dutchgamer1982:
DLSS on.. fake frames.. that invalidates the score, as I would always turn it off.

So: DLSS off, upscaling off, RTX off.
Scores only..
There's "DLSS" in the video title, but the game doesn't support any form of DLSS upscaling or frame gen, nor ray tracing. But that's not the point.

My point is to compare GPUs at the exact same performance.

If you want a GPU capable of native 4K 120fps + RT (or without), and you refuse to play at anything less, then I would like to see each GPU at the exact same settings and how many watts it draws. Or whether it can even run the game at those settings at all, or loses by default.
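
Something like this is all I'm asking for: cap each card at the same frame rate and log its draw. A minimal sketch in Python (every wattage below is a made-up placeholder, not a measurement):

# Iso-performance comparison sketch: cap every card at the same target
# fps, then compare the power each one needs to hold it. Watts divided
# by fps is just joules per frame. All wattages are placeholders.
target_fps = 60
watts_at_target = {     # card -> watts drawn while capped at target_fps
    "4090": 80,         # placeholder, not a measurement
    "3090Ti": 190,      # placeholder
    "7900XTX": 150,     # placeholder
}
for card, watts in sorted(watts_at_target.items(), key=lambda kv: kv[1]):
    print(f"{card:>8}: {watts:3d} W at {target_fps} fps"
          f" = {watts / target_fps:.2f} J/frame")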
DefinitelyNotMonk Feb 11, 2025 @ 6:54am
Originally posted by Dutchgamer1982:
Originally posted by C1REX:
That 250W is optional. You don't need to run all the transistors at full power all the time. The 5090 uses the most efficient transistors currently on the market; there are no better ones available.

If you want to use less power, just play less demanding games or lower your settings. Elden Ring uses about 250W at 4K with ray tracing on a 4090, and less than 200W at 1440p. Turning off ray tracing saves power as well, with almost no visual difference in this game. The 5090 should behave similarly. In some simpler games a 4090 can draw less than 100W, even at 4K 120fps.

The new GPUs are more efficient than ever before. They can also use more power if we ask them to, for more performance. 4K and more FPS need more energy.

yada yada

HIGH END -> best card in each series (I skip the basically-SLI-on-one-card models, but even those were <300W)

285 (2009): 205W
480 (2010): 250W
580 (2010): 244W
680 (2012): 195W
780ti (2013): 230W
Titan (2013): 230W
Titan Black (2013): 230W
980ti (2015): 250W
Titan X (2015): 250W
Titan XP (2016): 250W
1080ti (2017): 250W
Titan Pascal (2017): 250W
Titan V (2018): 250W
2080ti (2018): 250W
Titan RTX (2019): 280W

So far so good..... that's a decade of "the most powerful card uses 250W", with every generation increasing performance by about +40% without increasing TDP.

And then that got screwed (which is why I stopped buying GPUs, to send a signal: NOT OK!):

3090 (2020): 350W
3080ti (2021): 350W
3090ti (2022): 450W
4090 (2022): 450W
5090 (2025): 575W

see the issue here....!

And then there is the REAL performance gain (there is little).
I don't want DLSS or upscaling; that crap is turned off..
And when you turn it off, the actual performance gain per generation has also stagnated big time, with the 5000 series being truly abysmal.. offering hardly ANY performance gain..

You never monitored your cards' actual power draw or pushed them to their limits, did you?

Those are all base-model power limits, and virtually every one of them would pull more if allowed to. The 980 (Zotac AMP Extreme), let alone the 980ti, could pull 300 to 350 watts; the 1080ti (EVGA FTW3 Elite) could pull 350 to 400W happily all day long; the 3090 (Asus Strix OC) could pull close to 500W; and I've had the 4090 (Asus Strix OC) at 540W. I can't remember what my earlier cards actually pulled, I never checked, but I'm pretty certain even my 3050 will pull more than stated.

So, no, you have not been running cards at 250W max, unless you never pushed them, let alone overclocked them.

The new power draw limits are just more honest, and for once the power plugs are actually rated for them.
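
It's easy to check on your own card: nvidia-smi reports the live draw next to the enforced limit. A minimal sketch (assumes a single Nvidia GPU and nvidia-smi on the PATH):

# Sample actual power draw against the enforced limit via nvidia-smi.
# Assumes a single Nvidia GPU and nvidia-smi available on the PATH.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader,nounits"]

for _ in range(10):                      # ten one-second samples
    out = subprocess.check_output(QUERY, text=True).strip()
    draw, limit = (float(v) for v in out.split(","))
    print(f"drawing {draw:6.1f} W of a {limit:6.1f} W limit")
    time.sleep(1)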
_I_ Feb 11, 2025 @ 8:30am
They turned DLSS off, but frame gen can't be disabled.

They tested FFXIV at 4K (it does not use frame gen or DLSS at the lowest settings). The 5090 gave more fps, but needed more power per fps than the 4090, and it was way more efficient than the 3090ti.

Frame gen and DLSS do help with efficiency, which skews results when they're enabled.

The 20xx cards would be off the bottom of the list, since they have much lower fps and maybe ~20% lower power draw than the 4080.
Dutchgamer1982 2025년 2월 11일 오후 12시 41분 
Originally posted by _I_:
They turned DLSS off, but frame gen can't be disabled. [...] The 20xx cards would be off the bottom of the list, since they have much lower fps and maybe ~20% lower power draw than the 4080.

I want zero fake frames. (They don't count towards a real score... if you show 120fps but that crap is on, you actually have a lower resolution and only 20% of the shown framerate.)
Basically it's stuffing your bra... it doesn't count in a who-has-the-biggest competition, and it's utterly useless when you actually want to use the thing..

If the crap cannot be turned off, then that's a product you would never want to buy anyway.
And if it can, just not in this test, then don't use this test.
Last edited by Dutchgamer1982; Feb 11, 2025 @ 12:44pm
¤☣wing☢zeяo☣¤™ Feb 11, 2025 @ 2:05pm
Originally posted by DefinitelyNotMonk:
Originally posted by Dutchgamer1982:
[TDP history, quoted in full above]

You never monitored your cards' actual power draw or pushed them to their limits, did you? [...] The new power draw limits are just more honest, and for once the power plugs are actually rated for them.

The thing this guy didn't take into account is the node they were made on. The first GPU on that list was 55nm, the second was 40nm; the third, the 580, was also 40nm and pretty much the same GPU as the 480, so no gains (same as 4090/5090). We had good nm gains up to the 30 series, but now, until silicon is replaced, you only have brute force.

Now, brute force can't go on unless we're fine with 2000W GPUs that need their own air-con, and this is why rendering is changing and its final form will be full neural rendering.

The people saying it's Nvidia's fault there's little to no gain just have no clue how a GPU is made, or that the gains come primarily from a node shrink. They think the cards cost a lot now; imagine how much they would cost, with even less stock, if Nvidia went with the latest 3nm node.

Jensen knows all this, and it's why he's changing rendering again like he did last time. That's why he is where he is and we're just punks on a Steam forum shooting s#it.
_I_ Feb 11, 2025 @ 2:16pm
Originally posted by Dutchgamer1982:
Originally posted by _I_:
[quoted in full above]

I want zero fake frames. [...] If the crap cannot be turned off, then that's a product you would never want to buy anyway. And if it can, just not in this test, then don't use this test.
They tested in the best ways they could.

The FFXIV test showed what it could do: with frame gen off and DLSS off, it had much higher fps, which is what counts on a high-end GPU. But it needed more power; again, high-end GPUs do need a lot of power.

If you compare at different resolutions and settings, the results are useless:

the intel uhd igpu can use <40W doing 100+fps at 480p lowest settings,
while the 5090ti needs >400W to do <30fps at 16k max settings.
The uhd wins in fps per watt.
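
Put in numbers, using those deliberately absurd figures, the problem is obvious:

# Toy numbers showing why raw fps-per-watt is useless across
# mismatched workloads (the deliberately absurd figures from above).
cases = [
    ("uhd igpu, 480p lowest", 100, 40),    # (name, fps, watts)
    ("flagship, 16k max",      30, 400),
]
for name, fps, watts in cases:
    print(f"{name}: {fps / watts:.3f} fps per watt")
# 2.500 fps/W for the igpu vs 0.075 fps/W for the flagship,
# while the igpu does a tiny fraction of the work.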
Last edited by _I_; Feb 11, 2025 @ 2:19pm
Tonepoet Feb 12, 2025 @ 9:06am
I'm not going to promise everybody will get one before they're out of stock again, but for now, "forever" ended up being just six days.

F.E. 5090s @ Best Buy @ M.S.R.P.[www.bestbuy.com]

Edit: Sold out already.. I did see the yellow purchase button though! XP
Last edited by Tonepoet; Feb 12, 2025 @ 9:12am
The Thorne Feb 12, 2025 @ 11:22am
The solution is to forget about upgrading: buy a new system with the card included and sell your old PC second hand. That's how I got my 3080 on release day. I will do the same for the 6000 series.
DefinitelyNotMonk Feb 12, 2025 @ 12:34pm
Originally posted by The Thorne:
The solution is to forget about upgrading: buy a new system with the card included and sell your old PC second hand. That's how I got my 3080 on release day. I will do the same for the 6000 series.

But then you waste a bunch of cash and most of the time get stuck with a bad part selection, or pay way over the odds.

If you fail to grab one on release day, just wait until they're available; in 6 months there should be stock.
River Feb 12, 2025 @ 6:01pm
Originally posted by DefinitelyNotMonk:
[quoted above]

Yep.
