There was no 480 Ti.
There was no 580 Ti.
I'm not bothering to write the rest. You believe what you want. It's pointless for me to correct you, because you'll just come up with other wrong info soon after.
What do you think NOPE means... and why did I say partly true?
The 700 series was from 2013; that's over a decade ago, so for a solid decade an 80 Ti was part of the lineup.
And for over a decade, the launch sequence used to be:
release with the x70 and x80,
then some months later an x60 and an x80 Ti,
and some months after that a Titan or x90.
The 4000 series is the outlier here, in that there is no 80 Ti in it.
If you look at the price of the best and second-best card per generation, and the power draw of the best and second-best card per generation,
you do see something weird happening with the 3000 and 4000 series:
they would need a 1200 W PSU.
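A rough sketch of where a figure like 1200 W can come from, with assumed numbers: the GPU and CPU TDPs below are illustrative, and the 1.4x transient-headroom factor is a common rule of thumb, not an official spec.

```python
# PSU sizing sketch (illustrative numbers, not official specs):
# total draw = GPU TDP + CPU TDP + rest of system, plus headroom
# for transient spikes. The 1.4x headroom factor is a common rule
# of thumb, not a vendor requirement.
def recommended_psu_watts(gpu_tdp, cpu_tdp, rest=100, headroom=1.4):
    return (gpu_tdp + cpu_tdp + rest) * headroom

# e.g. a 450 W flagship GPU paired with a 250 W CPU:
print(recommended_psu_watts(450, 250))  # 1120.0 -> roughly a 1200 W PSU
```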
https://wccftech.com/nvidia-geforce-rtx-50-blackwell-launch-lineup-features-four-gaming-gpus-early-2025/
What a giant pile of BS, with all due respect.
But then again, it's the Steam forum and not a hardware/software forum.
Good for a laugh, though.
5080: 400 W, when it should be 180 W.
5070: 250 W, when it should be 130 W.
Nvidia, cut it out. Return to normal TDP values and increase performance between generations at the SAME TDP, not this idiotic stuff.
Yanking up the TDP means the real performance gain is again near zero:
no gains in performance per watt.
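A minimal sketch of that performance-per-watt point, using made-up illustrative numbers (a hypothetical 1.6x uplift at 400 W versus a 250 W baseline, not benchmarks):

```python
# Performance-per-watt sketch with illustrative numbers, not benchmarks:
# if a card is 60% faster but draws 60% more power, efficiency is flat.
def perf_per_watt(relative_perf, tdp_watts):
    return relative_perf / tdp_watts

old = perf_per_watt(1.0, 250)  # hypothetical previous-gen card at 250 W
new = perf_per_watt(1.6, 400)  # hypothetical next-gen card at 400 W
print(f"old: {old:.4f}, new: {new:.4f}")  # both 0.0040 -> no efficiency gain
```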
I'd leave speculation to just that, and wait and see what the power draw and power efficiency are; no predictions have ever been right about it.
Namely, between the RTX 30 series and RTX 40 series, a "not insignificant" change to a more efficient process node occurred (Samsung 8nm to TSMC 5nm), and this is largely what allowed nVidia to advance in performance/watt with the RTX 40 series.
Such a shift is not going to happen this time. So any increases will instead have to come from either architecture changes (I don't think it's radically changing?) or increased power draw (which allows for things like adding more cores, running a higher frequency, etc.).
The majority of nVidia's lineup (namely, x80 and below) is basically just getting scraps from the AI sector, as nVidia can sandbag so much because AMD has given up. But nVidia knows the lower SKU buyers don't upgrade as often (and also that many of them only buy nVidia), so nVidia can peddle lower performance increases, lower VRAM amounts, etc. down there and people will still eat it up.

But they're going to actually need a substantial enough uplift at the halo end if they want to lure the FOMO users into ditching an RTX 3090, 4080, and even 4090 for an RTX 5090. How much is "substantial enough" to encourage that, I don't know. But there's a good chance they might have to raise the power to do that. I'm not saying it will be 600W, but I am saying I definitely won't be surprised if it's higher than it is now, even if that is up to 600W. It really depends on how much of a performance improvement nVidia intends to make it over the RTX 4090, mainly.
A bust? Dude, they literally outperformed Nvidia at every price point. In fact, Nvidia doesn't even dominate in RT below the 4070 Ti line. What drugs are you on?
No overclock needed; the 14900K can pull 450 W right out of the box, which will place it nicely... huh... a few FPS above in games that already net hundreds, or more often decently below... while using 3x+ the power... Yeah, who's still buying those?
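A quick check of that "3x+ the power" arithmetic under assumed figures: the 450 W value is the out-of-box draw cited above, while the ~140 W comparison figure is hypothetical (the post implies a competing AMD gaming CPU), not a measurement.

```python
# Sanity check of the "3x+ the power" claim with assumed figures:
# 450 W is the out-of-box draw cited above; 140 W is a hypothetical
# gaming-load figure for a competing CPU, not a measurement.
intel_draw_watts = 450
competitor_draw_watts = 140
print(round(intel_draw_watts / competitor_draw_watts, 1))  # 3.2 -> "3x+ the power"
```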
Ok, so I agree that a big change isn't likely, especially when they promised 4x the performance of a 3090 Ti and gave 1.6x, but the rest is nonsense.
First, what's with this delusion people have that AI cards for servers have ANYTHING to do with gaming cards? Where did this nonsense come from? What makes you think this?
It's like the clowns saying 4090s were hard to get because all the dies were being used for the H100. They aren't the same die; they're not even related.
AI accelerators DO NOT have the gaming components needed for gaming, because that's not what they're for. Gaming cards aren't "AI scraps". That makes ZERO SENSE.
Second, what does "AMD has given up" even mean?
They announced they weren't doing a flagship card and released the 7900 XTX; everyone pretended it was supposed to go up against the 4090, then claimed that earlier statement meant no *900 XTX, *900 XT, *800 XT, or *700 XT tier cards. I even see people claiming their newer cards are magically going to be slower than their last-gen cards.
Again, what drugs are people taking?
Stop listening to teens on TikTok and randos on YouTube making things up and calling them "leaks".
If the 7000 line wasn't canceled, wasn't recalled, wasn't dropped after the 7900 XTX and 7900 XT, RDNA 3.5 wasn't skipped, RDNA 4 wasn't canceled, etc., then why believe the latest nonsense rumor?
I swear, it's like watching QAnon followers.