RTX 5090 70% faster vs RTX 4090 ?!?
750W cards incoming or are they just going to show us some slides of a very specific optimised use case in a single program?

https://www.pcgamesn.com/nvidia/geforce-rtx-5090-performance

Edit: 675W*
Last edited by Raoul; 4 Mar 2024 at 4:04
Showing 106-120 of 128 comments
Originally posted by De Hollandse Ezel:
Originally posted by waffleciocc:
This has almost never been true. The 2080 Ti was the only one in a long, long, long time.

You can change the official names all you want, it doesn't make you right, either.

partly true..

200 series nope
400 series nope
500 series nope
600 series nope
700 series yes
900 series yes
1000 series yes
2000 series yes
3000 series yes
4000 series nope..

before the 700 series the best card was named x85 or x80
(the best 1 or 2 cards in each series already used 250W)

with the 700 series we basically got the 80 Ti as the best card, with 1 or more Titans above that.

with the 3000 series those Titans were renamed x90 and x90 Ti

meaning the 4000 series is the first without an 80 Ti in a decade..
There was no 280 Ti.
There was no 480 Ti.
There was no 580 Ti.

I'm not bothering writing the rest. You believe what you want. It's pointless for me to correct you, because you'll just come up with other wrong info soon after.
Originally posted by waffleciocc:
Originally posted by De Hollandse Ezel:

partly true..

200 series nope
400 series nope
500 series nope
600 series nope
700 series yes
900 series yes
1000 series yes
2000 series yes
3000 series yes
4000 series nope..

before the 700 series the best card was named x85 or x80
(the best 1 or 2 cards in each series already used 250W)

with the 700 series we basically got the 80 Ti as the best card, with 1 or more Titans above that.

with the 3000 series those Titans were renamed x90 and x90 Ti

meaning the 4000 series is the first without an 80 Ti in a decade..
There was no 280 Ti.
There was no 480 Ti.
There was no 580 Ti.

I'm not bothering writing the rest. You believe what you want. It's pointless for me to correct you, because you'll just come up with other wrong info soon after.

what do you think NOPE means... and why I said partly true..

the 700 series was from 2013, that's over a decade ago.. so for a solid decade the 80 Ti was part of the lineup..

and for over a decade... the launch sequence used to be

release with x70 and x80,
then some months later an x60 and x80 Ti,
and some months after that a Titan or x90.

the 4000 series is an outlier here in that there is no 80 Ti in it.

if you look at the price of the best and second-best card per gen.. and the power draw of the best and second-best card per gen..
you do see something weird happening with the 3000 and 4000 series..
Last edited by De Hollandse Ezel; 5 Mar 2024 at 15:40
I use a 4090 to play at max 4K over 100 fps. I get really bad migraines, and the higher the fps and graphics, the less I get them. I am at the point where I can play even if I get a headache. The immersion factor is why you pay the big bucks. A 4090 with a G-Sync monitor does wonders. I am also waiting on my Omni One next month, which will have double the res on a VR headset vs anything else on the market. I will admit the 4090 heats my house, and I have a large rig with 17 fans. I also have a dual-hose window AC unit from my Ethereum mining days to help with that.
yeah imagine those poor souls with overclocked intel 14900K/13900K
they would need a 1200W psu
Good, now release the RTX 5090Ti (Titan x). :csd2smile:
Well, on paper in the leaks (lol) the 5090 seems to be about 40% faster, with higher SM count, core count, and all aspects of VRAM performance, while keeping to 600W... guess we will see about that though! Was planning an SFF project with 1440p in mind so will be looking at options soon. Wish they wouldn't delay the 5060, but hey, half the market is hype FOMO.

https://wccftech.com/nvidia-geforce-rtx-50-blackwell-launch-lineup-features-four-gaming-gpus-early-2025/
Originally posted by Bloodsorrow:
I use a 4090 to play at max 4K over 100 fps. I get really bad migraines, and the higher the fps and graphics, the less I get them. I am at the point where I can play even if I get a headache. The immersion factor is why you pay the big bucks. A 4090 with a G-Sync monitor does wonders. I am also waiting on my Omni One next month, which will have double the res on a VR headset vs anything else on the market. I will admit the 4090 heats my house, and I have a large rig with 17 fans. I also have a dual-hose window AC unit from my Ethereum mining days to help with that.

What a giant pile of BS, with all due respect.
But then again, it's the Steam forum and not a hardware/software forum.

Good for a laugh, though.
Originally posted by ChickenBalls:
yeah imagine those poor souls with overclocked intel 14900K/13900K
they would need a 1200W psu
Insanity levels of power draw.
Last edited by Vince ✟; 25 Nov 2024 at 7:43
5090.. 600W.. should be 250W
5080.. 400W.. should be 180W
5070.. 250W.. should be 130W

Nvidia, cut it out.. return to normal TDP values and increase performance between generations at the SAME TDP.. not this idiotic stuff.

cranking up the TDP means the real performance gain is again near 0.
no gains in performance per watt.
Last edited by De Hollandse Ezel; 25 Nov 2024 at 7:56
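The "performance per watt" point above can be sanity-checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming the rumored ~40% uplift at 600W mentioned earlier in the thread (illustrative numbers, not measurements):

```python
# Back-of-the-envelope perf/watt comparison.
# Performance figures are hypothetical assumptions, not benchmarks.
cards = {
    "RTX 4090": (1.00, 450),            # baseline performance, 450 W TDP
    "RTX 5090 (rumored)": (1.40, 600),  # ~40% uplift at 600 W, per the leak
}

for name, (perf, watts) in cards.items():
    print(f"{name}: {perf / watts * 1000:.2f} perf units per kW")

# 1.40 / 600 vs 1.00 / 450 works out to only ~5% better perf/watt,
# i.e. most of the uplift would come from extra power, not efficiency.
```

If those numbers held, the generational efficiency gain really would be close to zero, which is the complaint above.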
Everyone thought the 4090 was going to be a 600+ watt card on launch; it's 450 watts.

I'd leave speculation to just that, and wait and see what the power draw and power efficiency are. No predictions have ever been right about it.
Originally posted by Komarimaru:
Everyone thought the 4090 was going to be a 600+ watt card on launch; it's 450 watts.

I'd leave speculation to just that, and wait and see what the power draw and power efficiency are. No predictions have ever been right about it.
Though I agree with the "wait and see" approach, rumors tend to become less scattered as launch draws closer, and there's sound reasoning as to why we shouldn't expect a repeat of last time.

Namely, between the RTX 30 series and RTX 40 series, a "not insignificant" change to a more efficient process node occurred (Samsung 8nm to TSMC 5nm), and this is largely what allowed nVidia to advance in performance/watt with the RTX 40 series.

Such a shift is not going to happen this time. So any increases will instead have to come from either architecture changes (I don't think it's radically changing?) or increased power draw (which allows for things like adding more cores, running a higher frequency, etc.).

The majority of nVidia's lineup (namely, x80 and below) is basically just getting scraps from the AI sector, as nVidia can sandbag so much because AMD has given up. But nVidia knows the lower SKU buyers don't upgrade as often (and also that many of them only buy nVidia), so nVidia can peddle lower performance increases, lower VRAM amounts, etc. down there and people will still eat it up. But they're going to actually need a substantial enough uplift at the halo end if they want to lure the FOMO users into ditching an RTX 3090, 4080, and even 4090 for an RTX 5090. How much is "substantial enough" to encourage that, I don't know. But there's a good chance they might have to raise the power to do that. I'm not saying it will be 600W, but I am saying I definitely won't be surprised if it's higher than it is now, even if that is up to 600W. It really depends on how much of a performance improvement nVidia intends to make it over the RTX 4090, mainly.
Last edited by Illusion of Progress; 25 Nov 2024 at 12:53
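The node argument above implies that, with perf/watt roughly flat, performance scales almost linearly with power. A small sketch of that reasoning (all numbers are hypothetical, chosen to match the thread's rumored figures):

```python
# Sketch of the "same node => gains cost power" reasoning.
# All inputs are hypothetical assumptions for illustration.
def required_power(target_uplift, perf_per_watt_gain, base_watts=450):
    """Watts needed to hit a given uplift for a perf/watt improvement factor."""
    return base_watts * target_uplift / perf_per_watt_gain

# 40% faster with only 5% better perf/watt (no node shrink):
print(round(required_power(1.40, 1.05)))  # 600 W

# 40% faster with a node shrink worth 40% better perf/watt:
print(round(required_power(1.40, 1.40)))  # stays at 450 W
```

This is why a big node jump (like Samsung 8nm to TSMC 5nm) let the RTX 40 series gain performance without raising TDP, and why staying on a similar node makes a power increase more likely.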
Originally posted by 10PushUpsToRedemption:
Originally posted by C1REX:
Why not?
The 4090 is 64% faster than the 3090.
Going from Samsung's garbage 8nm to TSMC's 4nm. Ada Lovelace is crazy efficient.

Originally posted by Rumpelcrutchskin:
Not gonna afford this monster anyway. I'm WAY more interested in the RX 8800 XT; it looks pretty damn good so far.
I would wait on AMD cards. RDNA 3 was a bust and RDNA 4 is apparently not what they wanted. I think RDNA 5 or 6 is when they come back and can seriously compete with Nvidia.


A bust? Dude, they literally outperformed Nvidia at every price point. In fact, Nvidia doesn't even dominate in RT below the 4070 Ti line. What drugs are you on?
Originally posted by ChickenBalls:
yeah imagine those poor souls with overclocked intel 14900K/13900K
they would need a 1200W psu

No overclock needed; the 14900K can pull 450W right out of the box, which will place it nicely...... huh... a few FPS ahead in games already hitting hundreds, or more often decently below..... while using 3x+ the power..... Yeah, who's still buying those?
Originally posted by Illusion of Progress:
Originally posted by Komarimaru:
Everyone thought the 4090 was going to be a 600+ watt card on launch; it's 450 watts.

I'd leave speculation to just that, and wait and see what the power draw and power efficiency are. No predictions have ever been right about it.
Though I agree with the "wait and see" approach, rumors tend to become less scattered as launch draws closer, and there's sound reasoning as to why we shouldn't expect a repeat of last time.

Namely, between the RTX 30 series and RTX 40 series, a "not insignificant" change to a more efficient process node occurred (Samsung 8nm to TSMC 5nm), and this is largely what allowed nVidia to advance in performance/watt with the RTX 40 series.

Such a shift is not going to happen this time. So any increases will instead have to come from either architecture changes (I don't think it's radically changing?) or increased power draw (which allows for things like adding more cores, running a higher frequency, etc.).

The majority of nVidia's lineup (namely, x80 and below) is basically just getting scraps from the AI sector, as nVidia can sandbag so much because AMD has given up. But nVidia knows the lower SKU buyers don't upgrade as often (and also that many of them only buy nVidia), so nVidia can peddle lower performance increases, lower VRAM amounts, etc. down there and people will still eat it up. But they're going to actually need a substantial enough uplift at the halo end if they want to lure the FOMO users into ditching an RTX 3090, 4080, and even 4090 for an RTX 5090. How much is "substantial enough" to encourage that, I don't know. But there's a good chance they might have to raise the power to do that. I'm not saying it will be 600W, but I am saying I definitely won't be surprised if it's higher than it is now, even if that is up to 600W. It really depends on how much of a performance improvement nVidia intends to make it over the RTX 4090, mainly.


Ok, so I agree that a big change isn't likely, especially when they promised 4x the performance of a 3090 Ti and gave 1.6x, but the rest is nonsense.

First, what's with this delusion people have that AI cards for servers have ANYTHING to do with gaming cards? Where did this nonsense come from? What makes you think this?

It's like the clowns saying 4090s were hard to get because all the dies were being used for the H100. They aren't the same die, not even related.

AI accelerators DO NOT have the gaming components needed for gaming, because that's not what they are for. Gaming cards aren't "AI scraps". That makes ZERO SENSE.

Second, what does "AMD has given up" even mean?

They announce they aren't doing a flagship card, they release the 7900 XTX, everyone pretends it was supposed to go up against the 4090, then claims that earlier statement means no *900xtx, *900xt, *800xt, or *700xt tier cards. I even see people claiming their newer cards are magically going to be slower than their last-gen cards.

Again, what drugs are people taking?

Stop listening to teens on TikTok and randos on YouTube making things up and calling them "leaks".

If the 7000 line wasn't canceled, wasn't recalled, wasn't dropped after the 7900 XTX and 7900 XT, RDNA 3.5 wasn't skipped, RDNA 4 wasn't canceled, etc., then why believe the latest nonsense rumor?

I swear it's like watching QAnon followers.

Posted on: 1 Mar 2024 at 6:41
Posts: 128