Will AMD ever release a 9090XT?
You know, to rival Nvidia's RTX 5090, which has proven to be a flop due to fire hazards, jacked-up prices, and limited availability.

AMD should capitalize on this and release the 9090 XT.
Originally posted by Bad 💀 Motha:
Originally posted by ¤☣wing☢zeяo☣¤™:
They can't; AMD uses more power to get the same result, so a 5090 rival would draw 700-800W.

They do have room to release a bigger-die GPU and peg it at 550W; it might be a 5080 killer. It's hard to know when just speculating. You could do the maths to get a rough idea, but power draw vs performance isn't linear, so it's still tricky to be accurate.

Not necessarily.

The 9070 XT does about what's expected at a TDP range of 275-320W.

The 7900 XTX is rated for 350-420W.

If AMD were serious and did it right, they could probably do a 4090-level GPU and stay around 500W. Even a 4090 can be undervolted to draw less peak power while still maintaining peak performance in most things.

I'm just making a rough guess from the Digital Foundry findings when they looked at power draw on the latest architecture. I don't think it's possible, especially at full tilt with INT8 going; my 5090 uses 450-480W as a GPU, and with all the RT/Tensor processing added into the mix it's more like 550-570W.

Example: in Black Myth: Wukong the 9070 XT and the 5070 Ti get about the same FPS. The 5070 Ti is a smidge ahead, but it does it using 100W less power, and other games benched came in about 60W less on average. Nvidia is far ahead in efficiency. And performance vs power draw isn't linear: a 9070 draws about 260W, but if you double the die and let it draw 520W you don't get double the performance.

The best they can do is compete with the 5080, but then Nvidia can bring out a 5080 Ti that sits in the middle between the 5080 and the 5090.
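
If you want to see roughly how that plays out, here's a toy Python model; the square-root perf-vs-power curve is just an assumption picked to illustrate the shape of the trade-off, not measured data:

# Toy model: why doubling board power doesn't double FPS.
# The square-root exponent is an assumed curve, not a measured one.

def relative_perf(power_w: float, base_power_w: float = 260.0) -> float:
    """Hypothetical performance relative to a 260W (9070-class) baseline."""
    return (power_w / base_power_w) ** 0.5

for watts in (260, 320, 420, 520):
    print(f"{watts}W -> {relative_perf(watts):.2f}x the 260W performance")

# Under this assumed curve, 520W buys only ~1.41x, well short of 2x.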
The 5090 is a joke, drawing much more power for very little gaming gain compared to a 4090.
Originally posted by ¤☣wing☢zeяo☣¤™:

I'm just making a rough guess from the Digital Foundry findings when they looked at power draw on the latest architecture. I don't think it's possible, especially at full tilt with INT8 going; my 5090 uses 450-480W as a GPU, and with all the RT/Tensor processing added into the mix it's more like 550-570W.

Example: in Black Myth: Wukong the 9070 XT and the 5070 Ti get about the same FPS. The 5070 Ti is a smidge ahead, but it does it using 100W less power, and other games benched came in about 60W less on average. Nvidia is far ahead in efficiency. And performance vs power draw isn't linear: a 9070 draws about 260W, but if you double the die and let it draw 520W you don't get double the performance.

The best they can do is compete with the 5080, but then Nvidia can bring out a 5080 Ti that sits in the middle between the 5080 and the 5090.

I wouldn't use an unoptimised game built on Nvidia's closed-source implementations, which no one can examine or scrutinise, as a point of reference for anything.
Originally posted by Bad 💀 Motha:
The 5090 is a joke, drawing much more power for very little gaming gain compared to a 4090.
35% extra performance
28% TDP increase
33% more VRAM
GDDR7
for 25% higher MSRP

That's the opposite of a joke to me, and were it not for the 12VHPWR fiasco I would have bought one (mind you, at MSRP), since both Ada and Blackwell are efficient architectures that let you achieve almost the same performance by undervolting and overclocking.
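
A quick Python sanity check using only the percentages above (nothing else assumed):

# Gen-on-gen ratios, 5090 vs 4090, from the quoted figures.
perf = 1.35   # +35% performance
tdp  = 1.28   # +28% TDP
msrp = 1.25   # +25% MSRP

print(f"perf per watt:   {perf / tdp:.3f}x")   # ~1.055x, a ~5% efficiency gain
print(f"perf per dollar: {perf / msrp:.3f}x")  # 1.080x, an 8% value gain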
Last edited by Corona Scurrae; 23 Mar at 0:03
I'm a 300-watt GPU guy.
Even if AMD made that and it was cheap, I would not buy it.
Because I am a 300-watt GPU guy. My GPU already heats my room when I game.
At 600 watts I would need to turn on an air conditioner.
I'm not adding that cost just to game.
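
For scale, nearly all of a GPU's board power ends up as heat in the room; here's the watts-to-BTU conversion (the ~5000 BTU/hr reference is a typical small window AC, just an assumption for scale):

# 1 W of electrical power dissipated = ~3.412 BTU/hr of heat.
W_TO_BTU_HR = 3.412

for gpu_watts in (300, 600):
    print(f"{gpu_watts}W GPU -> ~{gpu_watts * W_TO_BTU_HR:.0f} BTU/hr of heat")

# 300W is ~1024 BTU/hr; 600W is ~2047 BTU/hr, a real load next to a
# ~5000 BTU/hr window unit.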
Originally posted by AmaiAmai:
Originally posted by ¤☣wing☢zeяo☣¤™:

I'm just making a rough guess from the Digital Foundry findings when they looked at power draw on the latest architecture. I don't think it's possible, especially at full tilt with INT8 going; my 5090 uses 450-480W as a GPU, and with all the RT/Tensor processing added into the mix it's more like 550-570W.

Example: in Black Myth: Wukong the 9070 XT and the 5070 Ti get about the same FPS. The 5070 Ti is a smidge ahead, but it does it using 100W less power, and other games benched came in about 60W less on average. Nvidia is far ahead in efficiency. And performance vs power draw isn't linear: a 9070 draws about 260W, but if you double the die and let it draw 520W you don't get double the performance.

The best they can do is compete with the 5080, but then Nvidia can bring out a 5080 Ti that sits in the middle between the 5080 and the 5090.

I wouldn't use an unoptimised game built on Nvidia's closed-source implementations, which no one can examine or scrutinise, as a point of reference for anything.

That's why I said the other games came in at about a 60W difference on average.
Originally posted by Bad 💀 Motha:
The 5090 is a joke, drawing much more power for very little gaming gain compared to a 4090.

Yeah, but what else could they do having to use the same node? It's the same as when we went from the GTX 480 to the GTX 580 and both used 40nm; most gains come from transistor density.
