well, in the past:
x50 did not exist
x50 Ti did not exist
x60 was 200 euro (and considered a budget card)
x60 Ti did not exist
x60 Super did not exist
x70 was 350 euro (and considered a mid-range card)
x70 Ti did not exist
x70 Super did not exist
x80 was 500 euro (and considered a mid-range card)
x80 Super did not exist
x80 Ti was 700 euro (and considered high end)
x90 did not exist
x90 Ti was named "Titan" and priced 1100 euro (and considered high end)
granted, there has been a lot of inflation in recent years.. since the last time prices were like this, there has been about 20% inflation..
so then a
4060 should cost 240 euro
4070 should cost 420 euro
4080 should cost 600 euro
4080ti should cost 840 euro
4090ti should cost 1320 euro
which is CONSIDERABLY less than what they cost now.
a 4060 is 310 euro, 30% too much
a 4070 is 600 euro, 42% too much
a 4080 is 1200 euro, 100% too much
a 4080 Ti has not yet been released
a 4090 Ti has not yet been released, but a 4090 is 2150 euro, 63% too much vs the Titan price of old.
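for anyone who wants to check the math, a quick sketch in python; the old tier prices, the ~20% inflation figure and the current street prices are all the numbers from this post, so treat them as estimates, not official MSRPs:
[code]
# old-tier launch prices (euro) and the ~20% inflation figure from the post
old_prices = {"x60": 200, "x70": 350, "x80": 500, "x80ti": 700, "titan": 1100}
inflation = 1.20

# current street prices as quoted in this thread (estimates, not official MSRPs)
current = {"4060": ("x60", 310), "4070": ("x70", 600),
           "4080": ("x80", 1200), "4090": ("titan", 2150)}

for card, (tier, price_now) in current.items():
    expected = old_prices[tier] * inflation      # e.g. 200 * 1.2 = 240
    over = (price_now / expected - 1) * 100      # e.g. 310 / 240 - 1 = ~29%
    print(f"{card}: ~{expected:.0f} euro expected, {price_now} euro actual, ~{over:.0f}% too much")
[/code]
which lands on roughly the same 30% / 42% / 100% / 63% figures as above.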
customers who used to buy a 980 would, at the same price point corrected for inflation, be able to afford a 4070.
that feels like a step down.
but in raw performance that's a solid 2 times as much.
however you ALSO pay through increased power draw, from 165W to 200W,
and at 8 hours a day of use, at 40 cents per kWh, that's another 41 euro a year in extra electricity cost.
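rough check of that ~41 euro figure, using the wattages, hours and kWh price from the post:
[code]
extra_watts = 200 - 165                  # extra board power vs the old card
hours_per_year = 8 * 365                 # 8 hours of use every day
extra_kwh = extra_watts * hours_per_year / 1000
cost = extra_kwh * 0.40                  # at 0.40 euro per kWh
print(f"~{extra_kwh:.0f} kWh extra per year, ~{cost:.0f} euro")   # ~102 kWh, ~41 euro
[/code]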
As long as there is demand, NV, Intel and AMD aren't going to radically reduce prices. Why should they?
a 960 for 200 euro had 45% of the performance of a 970 at 350.
the 970 was the best buy.
a 980 at 500 euro was thus +50% in price but added just 20% in performance.
a 980 Ti at 700 was 140% of a 970.
but today's prices, even with inflation factored in, are outrageous.
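putting those numbers into performance-per-euro terms (the performance figures are this post's own estimates, not benchmarks):
[code]
# price (euro) and relative performance with the 970 as baseline, per the post's estimates
cards = {"960": (200, 0.45), "970": (350, 1.00), "980": (500, 1.20), "980 Ti": (700, 1.40)}
base_price, base_perf = cards["970"]
for name, (price, perf) in cards.items():
    value = (perf / base_perf) / (price / base_price)   # relative performance per relative euro
    print(f"{name}: {value:.2f}x the 970's performance per euro")
[/code]
on those estimates the 960 and 980 Ti land around 0.79x and 0.70x of the 970's value, which is why the 970 reads as the best buy.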
40xx won't work on Windows 7 at all, by the way.
cyrix and 3dfx kept intel/amd/nvidia prices low at the time
I made my first gaming PC with a 650 Ti, because I had a friend who knew nothing about PCs help me pick parts.
Next came the 980, within like a few months the 10 series came out.
Next came the 2080ti and within a few months the 30 series came out.
I ♥♥♥♥♥♥♥ suck when it comes to doing anything with a PC.
Usually the x50 was the entry level/budget gaming tier, the x60 was mid-range (typically around half the flagship) and the best value, and the x80 was high end. The x70 has traditionally been the price gap filler between the high end and mid range and people couldn't agree if it was upper mid-range or low high end (which is funny that now it represents less than what the x60 used to), but either way it was sort of niche because it lost on value to the x60 and lost on performance to the x80.
The GTX 960 was just that mediocre compared to the GTX 970 (much like RTX 4060 to RTX 4070), and the GTX 970 was the cheapest the x70 tier has ever been (and had that awkward 3.5 GB situation going on), so I wouldn't use it as a representation of how things have traditionally been.
in fact the 4xx, 5xx, 6xx and 7xx series had the exact same pricing scheme.
the 3xx and 2xx were not that far off either..
for a solid 15-20 years
an x60 was 200 euro
an x70 was 350
an x80 was 500
an x80 Ti was 700
and no, x50 cards around me were considered "office pc" cards. might just as well have no gpu at all. an x50 is not a gaming pc.
the x70 in a 2400-ish euro pc was always seen as midrange, or mainstream.
an x60 in a 1200 euro pc was seen as extreme low end.. below that it was not gaming.
and believe me, I have been to plenty a LAN party in those early days.. and people loved to do nothing more than show their specs.
that the 960 performed badly is true..
but the 970 was priced exactly like the 770 and 670 before it.
260/360/460/560/660/760/960 was low end
x70 was lower mid end
x80 was higher mid end
x80ti was high end
titan was high end
nobody ever saw an x70 as higher mid end.. what you describe is how a non-Ti x80 was always seen.
I was just saying the 900 series was an anomaly in the regard of the x70 being seen as the more mainstream tier (similar to the current RTX 40 series generation). In every other generation, the x60 was the popular mid-range tier. It was never low end until recently.
Take a look at the Pascal generation. The GTX 1060 was half the specifications, and a bit more than half the performance, of the GTX 1080. It was half of the later GTX 1080 Ti. All while costing less than half of the original GTX 1080. Something that represents "just above half the flagship performance at less than half the cost" is not low end unless you have a very extreme enthusiast skew. Something above the midway point is not low end. That's literally mid-range by every definition of the word.
The x50 tier exists, and no those weren't ever "office cards" that were "no better than onboard". What? The x30 tier and below was what most considered "basic display adapter" territory. And the x50 was usually twice(ish) the performance of what the x30 was. And the x30 itself usually demolished onboard of its day. The x50 wasn't super performant compared to the actual high end, but it typically did games just fine with some concessions. The x60 usually did high in modern games of its time. The tiers above that were just more frame rates, more longevity, and some extra settings that usually didn't add much.
The x60 is only somewhat low end now as it represents what the x50 used to. It definitely wasn't always that way.
Edit: And saying there didn't used to be an x60 Ti is wrong too. When the moniker was first brought back (after last being used on the GeForce 4 Ti series), it was with the GTX 560 Ti, or in other words on an x60 tier card. It was literally on this tier before anywhere else.
Performance isn't the problem. It's lack of newer features like ray-tracing and mesh shaders.
now i think starfield is affecting my lappy's 1050 as it's starting to get artifacts in games
[quote]Performance isn't the problem. It's lack of newer features like ray-tracing and mesh shaders.[/quote]
in the original days, late 90s, early 00s, say 1995-2005,
if you wanted to play a game at release:
if you had a budget 1200 euro pc, it would only be able to boot a new release if the hardware was 6 or fewer months from its release date.
so like if a gpu was released in march, any game released in october of that same year would just not boot at all.
and any game that was released within 5 years of that gpu's release date.. would run at extremely low settings at best.
when the x60 much later came around, it fitted in this price point.
as the rest of such a pc would also age beyond running your games, you had to fully toss such a pc every 3 years and replace it.
-------
a 2400 euro pc is lower midrange.
a 3600 euro pc is higher midrange.
you followed the rule:
1/4th for peripherals
1/4th for gpu
1/4th for mobo+cpu+ram
1/4th for rest
a gpu in this range would generally be able to run games that released before its release date at medium quality. hence midrange gaming.
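applied to the 2400 euro example, that quarter-split rule of thumb comes out like this (just a sketch of the split, not an actual parts list):
[code]
budget = 2400
for part in ("peripherals", "gpu", "mobo+cpu+ram", "rest"):
    print(f"{part}: {budget * 0.25:.0f} euro")   # 600 euro per quarter, so ~600 euro for the gpu
[/code]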
-
titles that came out after its release date would still boot at low settings, but after about 2 years the generational gap was such that even it would physically no longer boot newer titles.
this is why you then did an upgrade; your cpu, unlike in the budget system, was still good, but you generally did have to double ram and hd space..
but you usually tossed your gpu and bought a new one.. to enjoy another 2 years before you had to trash the entire system as it would not run newer titles at all.
this is where the idea came from: you must buy a new gpu every second gpu gen.
so if you have a 980, you skip the 1080 and buy a 2080 as it releases, then skip the 3080 and buy a 4080 as it releases.
the xx70 and xx80, meanwhile, fitted in this price range.
-------------
high end was never hard defined, but generally started around 4800 euro for a system and could go up to 4 times that.
high end would not skip generations; they would upgrade more often, replacing parts all the time, not full systems..
unlike midrange, they often sold their old stuff secondhand, as nobody would pay a cent for a 2 year old gpu but a 1 year old one still had some value.
this meant that high end players could run every game at release on medium settings.
--this price range later translated to the x80 Ti and Titan cards
=======
high and ultra settings NOBODY could run at launch.
usually a game had to be more than 5 years older than the hardware before hardware existed that could run it on high settings, even if you had unlimited pockets.
for your more common midrange gamer who only spends 100 euro a month on hardware.. it usually took more like 10 years..
========
the fact that even 20 year old potatoes can run today's new releases at all I see as evidence of the decline in quality (a lot of it is to blame on the error of making games multiplatform.. pcs are, hardware-wise, always 10-15 years ahead of consoles.. and so were the games.. so games would be ported to console 15 or 20 years AFTER their pc release).
by releasing games for console and pc at the same time you had to make them crappier.. slowest ship in the fleet effect.
**
I can state most definitely that a 750 and all the x50s before it were seen as office cards.
like if you had to do some photoshop or other light graphical programs that needed a dedicated gpu, but only a minimal one, that's what the x50 is for.
the x60 was always the lowest card considered "for gaming"
and yeah, I know the Ti from cards like the GeForce 2 Ti500, which I also owned.
and in the 5xx series they did indeed abuse it.
-we always saw it as wrong, basically a way to dump surplus nobody is buying or that's defective.. a sign of weakness.
in good times the only Ti card you should see is the 80.
when we have production trouble.. when they cannot meet demand.. that's when they toss out all those in-between Ti and Super cards.
-
be it that a chip process is so ineffective they pile up a lot of below-spec chips.. (has happened when they tried new chip designs but every wafer only had like 3 good chips)
and of course recently with the crypto mania.
though it is true ATI and Nvidia have always been wasteful in how many models they release in a series.
I found Voodoo's release plan of only 2 cards in a series, like a Voodoo4 4000 and a Voodoo4 4500, much cleaner.
in those series the weakest card of a new series was also always more powerful than the strongest in the old series, making it even cleaner.