Why? It would be FANTASTIC...
Let's face it... VERY few people own a 4090 right now..
What's its market share?
Most will likely still own a 2xxx or 3xxx series card or even older.. and of those that do own a 4xxx series card, most likely own a 4070 or lower...
And we don't NEED a large jump in performance... given the aforementioned lowering of prices..
Most people would get a much better product for the same money... WITH a big boost in performance..
High-end users.. still get to enjoy the fact that "they have bragging rights of owning the best there is"
without needing to spend quite as much.. which is great, since bragging rights are all they want..
The scheme as laid out... would, in both energy bills AND pricing, put 4K a lot closer to a lot of users..
than the current trend of insane power draw and insane prices...
An xx70 was historically 350 euro.. a Titan or xx90 was 1100 euro.. and used 250W..
Today's prices.. and power draw are just out of control.
And that serves no one..
High-end gaming meant 700 euro for an xx80 Ti and 1100 euro for a Titan/xx90,
an xx60 at 200 euro was considered low-end gaming,
and NOBODY would consider gaming on an xx50; those were considered "office cards".
This is the present, not the past.
Go look up how much a 5nm TSMC wafer costs, compared to 28nm.
I was gaming back in the 80s.. you are just spoiled.
I also play mostly strategy and occasionally some RPGs.. it all looks fine to me..
I do care about 4K, mostly because it gives me more of the map in one field of view..
which is a strategic advantage.. but only if I keep high enough fps to micro well..
And many games just lack proper optimization.. GPUs should not be made to compensate for lazy game developers not coding well..
Have a nice day.
GPU prices were like that for over 15 years... and the prices I listed.. I already corrected for ACTUAL inflation.
TDP never went up... performance at that TDP is what should go up.
So yanking power draw up to insane levels with the 3xxx and 4xxx series is just very bad GPU design.
And even before those price spikes, if you skipped a generation..
the 80 Ti and the 70 from two generations apart would perform equally, but that newer 70 would use more like 120W, i.e. about half, with that 70 always costing around 350 euro...
Alternatively, the difference between two x80 Ti cards two generations apart was double the performance, while that newer 80 Ti also cost around 700 euro.. and used the same 250W.
THAT'S progress.
What we have now is not progress.
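To put rough numbers on that cadence (a sketch only; performance is in arbitrary units, and the watt and euro figures are the round numbers from this post):

# The two historical upgrade paths described above, in arbitrary perf units.
cards = {
    "gen N   80 Ti": dict(perf=100, watts=250, price=700),
    "gen N+2 70":    dict(perf=100, watts=120, price=350),  # same perf, ~half the power and price
    "gen N+2 80 Ti": dict(perf=200, watts=250, price=700),  # double perf, same power and price
}
for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['watts']:.2f} perf/W, {c['perf'] / c['price']:.3f} perf/euro")

Either path roughly doubles performance per euro and per watt over two generations, which is the baseline being compared against here.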
Traditionally 700 euro would get you the second-best card and 1100 euro the best.. now a 4070S is already 700..
while the 4060, the tier that used to be the low-end card... is now priced as a mid-range card..
And low end? You are forced to get a 3050 to get anywhere near prices in that range..
If you factor in the cost of electricity (50 cents per kWh).. and the increased power draw, then at the same budget.. people's performance has actually dropped..
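A back-of-the-envelope sketch of that total-cost point, using the 50 cents/kWh rate and the traditional 250W vs current 450W draws from this thread; the 4 hours of load per day over 4 years and the 1800 euro current flagship price are my own assumptions:

# Rough total cost of ownership: purchase price plus electricity over the card's life.
HOURS_UNDER_LOAD = 4 * 365 * 4   # assumed 4 h/day of gaming load for 4 years
EURO_PER_KWH = 0.50              # electricity rate quoted above

def total_cost(price_eur, draw_watts):
    energy_kwh = draw_watts / 1000 * HOURS_UNDER_LOAD
    return price_eur + energy_kwh * EURO_PER_KWH

print(total_cost(1100, 250))  # traditional flagship: ~1830 euro all-in
print(total_cost(1800, 450))  # assumed current flagship: ~3114 euro all-in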
Yes, it's a more expensive production process.. but we are years further along.. production processes have gotten more efficient and thus cheaper.. the fact is the cost to make the best chips has not risen considerably.. as stated, even a 4090 costs no more than 450 euro to make as a complete card, chip included.
What has changed is: worse design resulting in higher power draw...
Or, put differently.. they have not invested enough in R&D and have hidden the marginal performance gains by yanking power draw up to insane levels.. a properly designed chip would not need that kind of draw for the same or even better performance..
Meanwhile their profit percentage has gone insane...
They could do just fine with R&D budgets, performance gains, TDPs and product prices similar to what they had for over 15 years, up to 2016...
They made a healthy profit margin of 30% back then, and if they returned to it they still would..
They just would not make the 70% to 85% profit margins they do now... they are RAKING it in.. while not delivering the goods.
Nope.. every top GPU up to the 2080 Ti.. going back to those made 20 years earlier, used 250W..
Every Titan.. every x90, x85 or x80 Ti, whichever was the best card of its series at the time..
Only with the 3000 series.. all of a sudden the Titan started to demand 450W.. where before they always used 250W..
250W is PLENTY for the best card in a lineup.. it always was..
Using 450W is basically saying: "we have not put enough into R&D to create a new chip that can deliver the normal 40% performance increase while keeping power draw the same... let's just pump insane amounts of power through it so that on paper it at least shows a bit of a performance gain, even though it is actually nearly the same chip."
They managed for many decades to double performance every 2 generations.. while always keeping the best card at that 250W...
It's the 3000 series that broke that tradition..
Likewise, the best card always cost about 1100 euro.. the second-best 700.. that tradition was broken with the 2000 series...
And not because they had to.. they just put less into R&D and yanked up their profit margins.
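For what it's worth, those two claims are consistent with each other: doubling every two generations is the same thing as the "normal 40%" per-generation step quoted above:

# 2x every two generations implies a per-generation factor of sqrt(2) ~= 1.41,
# i.e. roughly the "normal" 40% generational gain.
per_gen = 2 ** 0.5
print(f"{(per_gen - 1) * 100:.0f}% per generation")  # -> 41%
print(f"{per_gen ** 2:.2f}x over two generations")   # -> 2.00x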
My 1080 Ti used 310W.
Did you have an overclocked model?
My 2080 Ti uses 250W.. the pump to cool it.. that's another story... but that single 2080 Ti uses 250W.. so did the 980 Ti.. and the Titan Black.. and the 580.. and every best-or-second-best card I had long before it..
You want to argue that TDP ≠ actual draw...
I grant you that, as draw depends on load..
but higher TDP = more draw..
and usually max draw = TDP.
You want to argue that some brands sell factory-overclocked variants..
I grant you that.. but again, neither the reference model nor the chip itself does that.
Fact is, today's chips have RI-DI-CU-LOUS power draw,
and today's chips have RI-DI-CU-LOUS prices.. even if you correct the traditional prices for inflation.
Both show in Nvidia's profit margin and net profit going through the roof..
The prices are NOT this high because the cards cost that much more to make,
and the chips are clearly not as well designed as they could be.. basically we are seeing what Intel used to do.. overly expensive chips.. and very little spent on R&D.. due to the lack of competition..
You saw how much they held back.. when that competition arrived, suddenly the product leaped light-years ahead.. and prices plummeted... showing how much they had been overcharging and how far short of their capability they had been selling..
Of course, a good company does not HAVE to do this.. even with what is essentially a monopoly.. Nvidia could think: I don't NEED maximum profit.. I just want to give gamers a nice experience.. so I'll take a nice 30% profit margin.. that's enough.. and keep doing good R&D and charging reasonable prices.. so that my gamers, who have been my customers for decades, are served well..