I know there are some people who would like to buy it for this price.
5070 Ti £899
https://www.cclonline.com/rtx-5070-ti-16g-ventus-3x-oc-msi-geforce-rtx-5070-ti-ventus-3x-oc-16gb-gddr7-graphics-card-483599/
9070 XT £799
https://www.cclonline.com/90yv0l71-m0na00-asus-radeon-rx-9070-xt-prime-oc-16gb-gddr6-graphics-card-477117/
9070 XT £899
https://www.cclonline.com/90yv0l70-m0na00-asus-radeon-rx-9070-xt-tuf-oc-16gb-gddr6-graphics-card-477116/
Still, only £150 over MSRP isn't too bad.
Then why is it that on a 9070 XT and 7900 XTX you can run RT + PT in Cyberpunk 2077 and still get like 100 FPS just fine?
The 9070 XT is far better as it finally has dedicated hardware, but compared to Nvidia's options it's way behind, with the 40 series stomping the 7900 XTX, and if you use MFG the 50 series runs away with it too.
Not using available technology because one side doesn't have it isn't a fair comparison if you're trying to compare stuff at a given price point. Nvidia costs more, but it has more features.
There's a used Aorus 5080 on Amazon :p
£1k
The new FSR and ray tracing are WAY better on the 9070 and 9070 XT than on previous generations.
Saw a Cyberpunk benchmark with ray tracing comparing the 9070 and the 4070 Super, and performance was identical. Without ray tracing it beat the 4070 Super by an average of 30 fps.
Gone
Recency bias has people painting the GTX 1080 Ti of all things as the greatest of all time, and I find that amusing. It was good... for a high-end product, but because of that it falls short of qualifying as a real contender for greatest of all time. Most of what makes it good extends to the Pascal lineup as a whole; it just had good lasting power because it was generous on VRAM and the generations since then have been mediocre price/performance improvements compared to it. Naturally, the fastest one in the lineup has the best staying power, but what made it good wasn't exclusive to it. I don't even think its price/performance was better than the GTX 1060, which launched a year earlier. It was good... but far from greatest of all time.
Meanwhile, the 8800 GT almost matched the 8800 GTX/Ultra in performance... while being less than half the price of the cheaper of those two. A similar thing could be said of the Ti 4200.
Those two are the real contenders for greatest of all time, at least among nVidia's stuff. These two cards in particular almost warped the market around them to the extent of making everything else irrelevant because their price/performance was just that much better than everything else (while also nearly matching the top performer so it made them almost irrelevant too). The GTX 1080 Ti didn't come close to doing any of that. People say nVidia's mistake they will never repeat was the GTX 1080 Ti, but no, their real mistake was those other two. nVidia has already repeated the GTX 1080 Ti anyway; it's called the RTX 4090 at MSRP around launch.
(ATI/AMD undoubtedly had some exceptional price/performance offerings back then but I'm less familiar with them, so my omission of anything ATI/AMD isn't saying they didn't exist, just that I'm unaware of what they were.)
The 1996 3dfx Voodoo, the dawn of 3D gaming, or maybe the 1999 GeForce 256, the first "GPU".
I'll give you that the 8800 GTX introduced the unified shader architecture, which was a leap, but not as big as the other two.
I never said dying. I know it's far from that. But they need some real serious changes nonetheless.
Agreed, not least because the 8800 GT benefitted from a node shrink -- whilst the news is teasing that the next shrink may actually INCREASE costs, thanks to TSMC. https://www.tomshardware.com/tech-industry/tsmcs-wafer-pricing-now-usd18-000-for-a-3nm-wafer-increased-by-over-3x-in-10-years-analyst
History of the 8800 GT: https://www.anandtech.com/show/2365
Re: 1080 Ti: One of the main reasons the 1080 Ti (sometimes even the 1060) is still in use isn't Pascal as such (good architecture, mind). It's consoles -- and game development cycles getting longer and longer. Pascal was released when the PS5 was still years away, and once the PS5 did release, it took years for cross-gen PS4/PS5 titles to disappear, let alone for games to appear that make RT mandatory (Indiana Jones, the next Doom). It's going to be interesting how long the RTX 4090/5090 class keeps being used, as AMD/Sony can't pull off magic for the PS6, considering the price explosions...