I went from a 1080 Ti to a 2080 Super. It doesn't matter if you have 11 GB of VRAM if none of the games you're playing are using it. In that case, you know what matters more? Computing power and hardware features.
Nothing against the 1080 Ti; it's a good card and it's still better than what the majority of people are running, which is pretty impressive for a 5.5-year-old card (released March 2017). But unless you're getting a good deal on it, a 3060 Ti is going to be a better choice, or maybe a 4060/4050 when they're released next year.
The GTX line's days are already numbered.
And yeah, more VRAM doesn't automatically mean more power or a better card. You also need to look at TGP and clock speeds (MHz) and compare CUDA core counts; the more, the better.
What also needs consideration is that it probably has a shorter lease on life in terms of driver support. Pascal driver support probably isn't going anywhere anytime real soon; too many of those cards are still in use for Nvidia to risk cutting them off without upsetting even more people (unless they continue their defiant stance in the face of backlash, which... you never know). But that applies more to people already using one. It's too late to be investing in one now unless you're getting it at a very worthwhile price.
Driver support becomes a concern with higher-tier cards especially. Lower-tier ones are usually slow by the time support ends anyway, but higher-tier ones still retain some performance when support is lost. And since people tend to sell and value cards based on current performance rather than "driver support maybes", it's a hidden factor you shouldn't overlook as a used GPU buyer. I recall a video (Gamers Nexus?) that compared a lot of older cards, and some faster ones with no driver support actually fared worse in games than some slower ones because of it. Some games just wouldn't start or run as well.
Also, Nvidia's GPUs, like Intel's older CPUs, tend to hold their value, and this makes them not as worthwhile to buy used IMO. Doubly so for the higher-end stuff. Used quad-core Core i7s (especially 6th and 7th generation, but even second through fourth generation) were going for nonsensical prices on the used market some time back. No idea if they still are.
As always, if you find a good deal on it, sure. But compare the price to what else you can get. A newer card, even a lower-tier one, at the same price and performance would be an easy choice over it IMO for a number of reasons. A faster card at the same price or cheaper would be an even easier choice.
I still have one. It is decent.
I am looking to upgrade soonish, ideally when GPU prices crash to the floor.
It has a good amount of VRAM, but no hardware raytracing or DLSS support.
VRAM only matters if you're doing something that actually requires that much of it, like a dumb amount of mods that blows past 8 GB. I run over 400 mods in Skyrim on a GTX 1070; that should give you an idea. If this is for rendering 3D work projects, you should look toward something with much more VRAM, if that matters a lot.
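If you want to check whether your games or mod setups actually get anywhere near your VRAM limit, here's a minimal sketch that polls usage through nvidia-smi while the game is running (assumes an NVIDIA card with the driver and nvidia-smi installed; the helper name vram_usage_mib is just for illustration):

import subprocess

def vram_usage_mib():
    # Query used/total memory from the driver; with nounits the values are in MiB.
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; take the first GPU.
    first_gpu = out.strip().splitlines()[0]
    used, total = (int(x.strip()) for x in first_gpu.split(","))
    return used, total

if __name__ == "__main__":
    used, total = vram_usage_mib()
    print(f"VRAM in use: {used} MiB of {total} MiB")

If the used number never comes close to the card's total while you play, extra VRAM on a pricier card isn't buying you anything.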
Get an RTX 3060; it has 12 GB of VRAM and nearly the same performance as a 1080 Ti, and it's much more power efficient. It also comes with DLSS, which will blow the 1080 Ti out of the water.
If OP is spending close to the price of a 3060 to get a 1080 Ti, they might as well just get the 3060. There's no point grabbing an old used card when it's also unknown whether it was mined on or is on its last legs, since the card released 6 years ago.
The 3060 (12 GB) vs. the 3060 Ti (8 GB) is a good example.
Well, 4K is pretty demanding, and that hasn't changed recently. Expecting midrange cards that are only as fast as high-end cards from a few years ago, which people didn't think were really fast enough for 4K on titles from the same period, to run 4K on newer titles might be a little optimistic/naive.
We're getting closer to 4K being viable on midrange hardware, but we're not there yet.