RTX 5090 70% faster than the RTX 4090?!?
750W cards incoming, or are they just going to show us some slides of a very specific optimised use case in a single program?

https://www.pcgamesn.com/nvidia/geforce-rtx-5090-performance

Edit: 675W*
Last edited by Raoul; 4 Mar 2024 @ 4:04
In most games, my RTX 4080 doesn't even consume 200W... frame-limited to 144fps, the card runs 100% silent at only around 40-50% load.
These rumours are always started to get the enthusiasts eager to spend their money.
Originally posted by The_Abortionator:
Originally posted by RIP Heihachi Fk Kazuya:
over a year? doubt it

https://www.pcworld.com/article/1974281/nvidia-sets-2025-date-for-rtx-50-series-cards.html

Why are you guys like this?

You can't even follow Nvidia's OFFICIAL ANNOUNCEMENTS but take randos' rumors as gospel.

Do you feel smart? Because you shouldn't.

Weird that you didn't read the article you linked. That isn't an official NVIDIA announcement. NVIDIA hasn't made an official announcement of its next-gen consumer GPUs.
Originally posted by C1REX:
Originally posted by De Hollandse Ezel:

I agree with you, and as a self-employed person, I often calculate how much something will cost me per year.

However, the power draw cost is not as straightforward to calculate when some top-end, high-power-target models can use less power in some scenarios than lower-end models from the previous generation.

If power consumption were everything, then I would switch to laptop or handheld gaming. If being 100% financially responsible were my goal, then I would quit gaming. There is also an aspect of self-motivation: wanting to make more money so you can afford a 1000W GPU.

We agree that gaming is a luxury, but we do have income to spend on luxuries; the gain is enjoyment.
However, one euro more toward one type of luxury will not add as much enjoyment as toward another.

If you used to get x enjoyment for y spend, and now that same y gets you a lot less, it sucks.

Especially if those price hikes are way beyond your wage growth, GDP growth, or inflation.

Sure, changes in demand and supply can cause this, which sucks but can't be helped; caviar and lobster were poor people's food once, for example.

But not all price increases are due to a change in demand vs supply, or to the cost of production going up. Again, neither is the case here.

It can also be that someone gets greedy and asks whatever they want, usually due to a lack of competition.

That's what happens here. What do you do if you see a 20 euro cup of coffee? You can likely afford it, but you don't buy it out of principle.
As you stated, GPUs are not essential goods.
It saddens me that not enough people have the discipline to do the same and send Nvidia the signal that we do not accept these prices.

So now you are faced with 3 options:
1: flat-out refuse to spend until the gouging stops (no more luxury for you, but it lets you allocate these funds to other luxuries).
2: accept less luxury for the same spending, i.e. gaming on a mediocre laptop where you used to game on the top of the current gen.
3: allocate much more funds, drawing them away from other enjoyable luxuries in your life.

Logically, I enjoy none of those options.

As a socially responsible person I hate subsidised stuff, but I also find profit maximisation deplorable.

When, in my own ventures, my net profit margin rises above 10% (the biblical usury limit), I see it as my duty to:
A: raise my employees' wages
B: lower the price of my product
C: increase the quality of my product
If my margin dips below 5%, I do the reverse. (That rule is sketched as code below.)

You could say I believe in profit moderation: trying to keep it around a fair 7% that keeps the business healthy but also gives employees and customers a fair deal.
And thus I HATE what Nvidia is doing here.

It is not respectful to customers like me who kept buying their products for decades; it's a short-term, mindless money grab.

Just because they could reach 85% profit margins by tripling what they charge, while their R&D budget decreased and production costs stayed stable.
In other words, Nvidia, while already making a way too high 30% margin, has opted to:
- pay its employees the same or less
- jack up the price of its products
- lower the quality of its products
to push that margin to the maximum,
which is WRONG
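Taken at face value, that margin rule is easy to state as code. A minimal sketch in Python: the 5%/10% thresholds and the ~7% target come from the post above, while the revenue and cost figures in the example are purely hypothetical.

```python
def margin_action(revenue: float, costs: float) -> str:
    """Classify a net profit margin against the poster's 5%/10% bands."""
    margin = (revenue - costs) / revenue  # net margin as a fraction of revenue
    if margin > 0.10:
        # above the stated "usury limit": give the surplus back
        return "raise wages / lower prices / improve quality"
    if margin < 0.05:
        # margin too thin to keep the business healthy: do the reverse
        return "trim wages / raise prices / cut quality"
    return "hold course (near the ~7% target)"

# hypothetical example: 1,000,000 revenue against 880,000 costs -> 12% margin
print(margin_action(1_000_000, 880_000))  # raise wages / lower prices / improve quality
```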
Last edited by De Hollandse Ezel; 5 Mar 2024 @ 5:00
C1REX 5 Mar 2024 @ 6:15
Originally posted by De Hollandse Ezel:
Profit margin is not everything, as they need to make enough extra to keep developing new technologies.
If you don't like what NVIDIA is doing, then vote with your wallet and don't buy the incoming 5090, or Nvidia's GPUs at all.
Is that even possible?:wowshehe:
I'm so over graphics cards. I don't care how fast they are. I don't even care if they're fast enough to fly me around wearing them like shoes.

I want better games, not better cards. It's the games that are suffering: mediocre, soulless excrement.
So say I am willing to allocate 100 euro a month to my PC gaming.
That 100 euro will be split between:
- power to run it
- hardware cost

Power to run it:
While I have no influence on the price per kWh, I can lock it in for 3 years at a time, so for a while I know my rates, which helps with the bookkeeping.

To bring this segment down I can:
- opt to use the PC fewer hours (handing in quality of life),
OR
- buy more power-efficient parts (as I have done with many devices in my household: spending more on parts that are more power efficient makes the cost per month less).

-> This gives a number, derived from the 100 a month, that can be used for hardware purchases. (A back-of-envelope version of this split is sketched below.)

I have no control over the market yanking part prices up way beyond inflation, as they've done now; it just means I can no longer afford parts as good as I used to.

All I can do is use my budget to build with less capable parts at the same price,
which effectively means no progress at all.
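The budget split described above is simple arithmetic. A minimal sketch in Python: the 100 euro/month budget is from the post, while the wattage, gaming hours, and kWh price are assumptions for illustration only.

```python
# Split a fixed monthly gaming budget between electricity and hardware savings.
MONTHLY_BUDGET_EUR = 100.0   # from the post
AVG_DRAW_W = 250             # assumption: average whole-system draw while gaming
HOURS_PER_MONTH = 60         # assumption: roughly 2 hours per day
PRICE_PER_KWH_EUR = 0.30     # assumption: a locked-in electricity rate

energy_kwh = AVG_DRAW_W / 1000 * HOURS_PER_MONTH    # 15.0 kWh per month
power_cost = energy_kwh * PRICE_PER_KWH_EUR          # 4.50 EUR per month
hardware_budget = MONTHLY_BUDGET_EUR - power_cost    # 95.50 EUR per month

print(f"power: {power_cost:.2f} EUR/month, "
      f"left for hardware: {hardware_budget:.2f} EUR/month")
```

Lower draw or fewer hours shifts money from the power line to the hardware line, which is exactly the trade-off the post describes.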
Originally posted by C1REX:
Originally posted by De Hollandse Ezel:
Profit margin is not everything, as they need to make enough extra to keep developing new technologies.
If you don't like what NVIDIA is doing, then vote with your wallet and don't buy the incoming 5090, or Nvidia's GPUs at all.

I already kind of did...
I HAD to buy the overpriced 1600 euro 2080 Ti because my 980 Ti broke from bending; during the first crypto craze the 1080 Ti was not for sale anywhere at any price, and the 2080 Ti was the first 80 Ti that became available, after I had lived 15 months without ANY working GPU.

But while normally I would have bought a 4080 Ti, with today's insane prices and power draw I just said NO, and they had better fix that with the 5xxx series if they want my money back.
C1REX 5 Mar 2024 @ 9:41
Originally posted by De Hollandse Ezel:
But while normally I would have bought a 4080 Ti, with today's insane prices and power draw I just said NO, and they had better fix that with the 5xxx series if they want my money back.

Power draw is becoming more and more important, but not buying a 4080 just for this reason alone is odd to me when it’s the most power-efficient GPU ever. Sure, it can draw over 300W when pushed, but it can also draw below 200W when set up for efficiency and still offer better performance in this power budget than any other GPU ever made.
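For scale, the 300W vs sub-200W figures above translate into a fairly modest running-cost gap. A quick sketch in Python: the wattages are from the post, while the yearly hours and electricity price are assumptions.

```python
# Annual electricity cost of a card pushed to 300W vs tuned down to ~190W.
HOURS_PER_YEAR = 700        # assumption: roughly 2 hours of gaming per day
PRICE_PER_KWH_EUR = 0.30    # assumption

def annual_cost_eur(watts: float) -> float:
    """Energy cost per year for a given average power draw."""
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH_EUR

print(f"300W: {annual_cost_eur(300):.2f} EUR/yr")  # 63.00 EUR/yr
print(f"190W: {annual_cost_eur(190):.2f} EUR/yr")  # 39.90 EUR/yr
```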
Originally posted by C1REX:
Originally posted by upped_jetty_0o:
lol stop the hype train buddy it will be at best 30% faster in reality!
Why not?
The 4090 is 64% faster than the 3090.

Are these DLSS3 numbers or normal ones?

DLSS3, and I believe FSR3, are just stupid interpolation ♥♥♥♥. Not real frames.
Originally posted by C1REX:
Originally posted by De Hollandse Ezel:
But while normally I would have bought a 4080 Ti, with today's insane prices and power draw I just said NO, and they had better fix that with the 5xxx series if they want my money back.

Power draw is becoming more and more important, but not buying a 4080 just for this reason alone is odd to me when it’s the most power-efficient GPU ever. Sure, it can draw over 300W when pushed, but it can also draw below 200W when set up for efficiency and still offer better performance in this power budget than any other GPU ever made.

I said 80 Ti, which was always the same chip as the Titans (the current x90).

The 80 Ti was always a sizable chunk above the performance of an 80, much closer to that x90 than to an x80.

Basically, an 80 Ti IS an x90, but with a binned chip that is just BARELY not good enough to meet the x90 minimum, and it comes with less GDDR memory, while costing a lot less. So it is usually quite an effective price point: 95 to 98% of the performance of the best card out there for only two thirds of its price. (The value math is sketched below.)

And they used to draw 250W max; many even drew just 220 or 230W.

Sure, an x80 Ti generally cost 700 euro, double the 350 euro an x70 cost, while only giving 40% more performance, but it was not as bad a deal as today's prices and power draws.

The 4000 series is MISSING an x80 Ti, while normally it is one of the first cards launched in a lineup.
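The "95 to 98% of the performance for two thirds of the price" claim is easy to restate as performance per euro. A minimal sketch in Python: the 350/700 euro prices and the +40% figure are from the post, the x90 entry is an assumption chosen to match the 2/3-price, ~96% performance relationship described above, and the performance units are arbitrary.

```python
# Performance per euro for the historical tiers described in the post.
cards = {
    "x70":    {"price_eur": 350,  "perf": 100},  # baseline from the post
    "x80 Ti": {"price_eur": 700,  "perf": 140},  # +40% perf for 2x the price
    "x90":    {"price_eur": 1050, "perf": 145},  # assumption: 80 Ti ~= 96.5% of x90
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price_eur']:.3f} perf/EUR")
# x70: 0.286, x80 Ti: 0.200, x90: 0.138
# each step up the stack costs more per unit of performance
```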
Last edited by De Hollandse Ezel; 5 Mar 2024 @ 10:07
C1REX 5 Mar 2024 @ 11:34
Originally posted by De Hollandse Ezel:
I said 80 Ti, which was always the same chip as the Titans (the current x90).

The 80 Ti was always a sizable chunk above the performance of an 80, much closer to that x90 than to an x80.
I'm sorry, but I had no chance of guessing what you meant when you change Nvidia's own names.

The 4090 doesn't have a full die, as the Titans normally do. There was still space for Nvidia to release a 4090Ti or a Titan. They also had space for a 4080Ti, considering the huge gap in performance between the 4080 and the 4090.

4090 is still a very efficient GPU considering the performance and amount of VRAM. Not much worse than the 4080.
Last edited by C1REX; 5 Mar 2024 @ 11:34
Originally posted by De Hollandse Ezel:
The 4000 series is MISSING an x80 Ti, while normally it is one of the first cards launched in a lineup.
This has almost never been true. The 2080 Ti was the only one in a long, long, long time.

You can change the official names all you want; it doesn't make you right, either.
Originally posted by waffleciocc:
Originally posted by De Hollandse Ezel:
The 4000 series is MISSING an x80 Ti, while normally it is one of the first cards launched in a lineup.
This has almost never been true. The 2080 Ti was the only one in a long, long, long time.

You can change the official names all you want; it doesn't make you right, either.

Partly true:

200 series: nope
400 series: nope
500 series: nope
600 series: nope
700 series: yes
900 series: yes
1000 series: yes
2000 series: yes
3000 series: yes
4000 series: nope

Before the 700 series, the best card in a lineup was named x85 or x80 (and the best 1 or 2 cards in each series already drew 250W).

With the 700 series we basically got the 80 Ti as the best card, with 1 or more Titans above it.

With the 3000 series those Titans were renamed x90 and x90 Ti.

Meaning the 4000 series is the first without an 80 Ti in a decade.


The 780 Ti launched 6 months after the 770 and 780.
The 980 Ti launched 9 months after the 970 and 980.
The 1080 Ti launched 9 months after the 1070 and 1080.
The 2080 Ti launched alongside the 2080 (indeed, that was the exception).
The 3080 Ti launched 9 months after the 3070 and 3080.

I said "an 80 Ti that launches not that long AFTER the series launch."

If you look at those series over the last decade:

An 80 Ti that is the 2nd-best card in a series, launching about 9 months after the x70 and x80, with an x60 launching somewhere in between, is what's to be expected. A Titan or x90 card should launch about a year after the initial launch of a series, a bit after that 80 Ti.
Last edited by De Hollandse Ezel; 5 Mar 2024 @ 15:37

Date Posted: 1 Mar 2024 @ 6:41
Posts: 128