Again, all thoughts from someone who doesn't know enough to have a formal opinion.
My 7800 XT (presuming it works out in the end; it's on its way back from RMA now) is pretty high-end compared to what I typically go for, even if it only delivers around half the performance of nVidia's best, so I would definitely be able to sit with it for a while, I think (I mean, my GTX 1060 lasted me around seven years, and while it's showing serious age now, I can't complain). Especially because most games don't need anywhere near the level of hardware most enthusiasts have. Not playing at the highest resolutions and not demanding the highest frame rates works in my favor too.
They were talking about this as well. Intel is suspected to potentially make such a move first, and then AMD would need to decide if they want to respond by doing the same or not. I'm not an expert but found this part of the discussion fascinating. I like such theory crafting.
that the xx70 was the most bang for your buck
the xx60 usually cost 60% of what an xx70 cost, but performed only 40% as well
the xx80 usually cost 150% of what an xx70 cost, but performed only 120%
the xx80 Ti usually cost 200% of what an xx70 cost, but performed only 140%
the Titan cards (what the xx90 actually are) cost 300% of what an xx70 cost, but performed only 150%
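Those claimed ratios can be sanity-checked in a few lines. This is just a sketch built on the post's recalled figures; the cost and performance percentages are the poster's claims, not benchmark data:

```python
# Claimed historical cost and performance per tier, relative to the xx70 (= 1.0).
# These numbers come from the post above, not from measurements.
tiers = {
    "xx60":    {"cost": 0.60, "perf": 0.40},
    "xx70":    {"cost": 1.00, "perf": 1.00},
    "xx80":    {"cost": 1.50, "perf": 1.20},
    "xx80 Ti": {"cost": 2.00, "perf": 1.40},
    "Titan":   {"cost": 3.00, "perf": 1.50},
}

# Performance per unit of cost; higher means better value.
for name, t in tiers.items():
    print(f"{name}: {t['perf'] / t['cost']:.2f} perf per cost unit")
```

Under these numbers the xx70 lands on top at 1.00, with the xx80 second at 0.80, which is the "most bang for your buck" claim restated.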
if we look from the 9xx series until the 40xx series, we see that the xx90 and xx80 Ti have increased in price WAY more, relatively, than the xx70
but prices have massively spiked across the board
a 4060 now costs around 300 euro; that's a 50% increase vs the 200 euro an xx60 used to cost
its price point is now 50% of a 4070
this means it costs roughly 17% less than it historically should
a 4070 now costs around 600 euro; that's a roughly 70% increase vs the 350 euro an xx70 used to cost
a 4080 now costs around 1200 euro; that's a 140% increase vs the 500 euro an xx80 used to cost
its price point is now 200% of a 4070
this means it costs proportionally 33% more than it historically should
a 4090 now costs around 2100 euro; that's a 100% increase vs the 1050 euro a Titan used to cost.
its price point is now 350% of a 4070
this means it costs proportionally 17% more than it historically should
so averaging those four figures, we have a price increase of around 90% for GPUs in just 5 years.. while before that, prices had stayed stable for nearly 15 years..
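A quick check of those increase figures, using only the prices quoted in the post (the simple average of the four comes out around 90%):

```python
# (historical price, current price) in euros, as quoted in the post above.
prices = {
    "4060": (200, 300),
    "4070": (350, 600),
    "4080": (500, 1200),
    "4090": (1050, 2100),  # compared against the old Titan price
}

increases = {}
for card, (old, new) in prices.items():
    increases[card] = (new - old) / old * 100
    print(f"{card}: +{increases[card]:.0f}%")

average = sum(increases.values()) / len(increases)
print(f"simple average: +{average:.0f}%")
```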
but at the same time, the xx60 has been made a bit more affordable, while the xx80 and xx90 have been made less attractive..
we all know the hated crypto miners are to blame for the first one (drop the prices of GPUs back too much, or push performance up too much without a price increase, and crypto craze 3 will start.. it would easily take 15 years to triple GPU production capacity to meet that new demand.. so until then, all GPU producers can do is keep price per performance the same, meaning each generation needs to get proportionally more expensive.. to keep those cards from being too attractive to miners).
the second one, however, is likely due to the death of SLI.
-> in the days of SLI, a person could always pick 2 of the lesser cards instead of 1 more expensive one...
it did not scale perfectly (2 cards performed equal to about 1.6x one card), but prices of the better card still had to stay low enough that just going SLI wasn't the cheaper option.
if the 5xxx series gives the kind of performance boost that's rumoured... it will likely mean either a massive price increase or another crypto craze, meaning no GPUs for us gamers at all.
neither would be good.
This is wrong on pretty much everything. I'm willing to grant that maybe pricing differed that wildly wherever you're from (though I'm questioning even this...), but the performance comparisons are still way off base regardless.
In what generation did an x70 add 100%+ performance to an x60? In what generation did an x80 add 100% performance to an x70?
Using Pascal and US pricing as an example here...
The GTX 1060 had over half (and really it's nearly two thirds at ~63%) of the performance of the GTX 1080 and cost a bit less than half ($250 to $600). The GTX 1060 had half of the performance of even the later GTX 1080 Ti which was an additional $100 more on top of the non-Ti pricing (so double the performance for well more than double the cost).
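As a rough illustration of that value argument, here are the paragraph's numbers turned into performance per dollar. The prices are the US MSRPs cited above; the GTX 1080 Ti's relative-performance figure is inferred from the claim that the GTX 1060 had half its performance, so treat it as an approximation:

```python
# US launch prices and relative performance (GTX 1080 = 100), per the post above.
cards = {
    "GTX 1060":    {"price": 250, "perf": 63},
    "GTX 1080":    {"price": 600, "perf": 100},
    "GTX 1080 Ti": {"price": 700, "perf": 126},  # inferred: ~2x the GTX 1060
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 100:.1f} perf points per $100")
```

On these figures the GTX 1060 comes out around 25 points per $100 against roughly 17-18 for the x80-class cards, which is the opposite of a 100% tier gap.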
Even if pricing in your region differed, you didn't get different versions of these chips that had larger performance gaps unless we lived in different realities or something.
Sources...
TechPowerUp[www.techpowerup.com] (use the "relative performance" to compare it to the GTX 1060, GTX 1070, and GTX 1080 Ti).
Tom's Hardware[www.tomshardware.com]
Even if you don't like those sources, you have to find numerous better ones that suggest those two sources are just sooo far off the mark and that the real gap between tiers was over twice what they are shown here. I'm doubting you'll be able to find this, for obvious reasons.
Where are you seeing 100%+ performance gaps between x60s and x70s, and then 100%+ performance gaps from x70s to x80s!? You're lucky if that's the total difference from x60 to x80, Ti or not! Not in Pascal. Not in Turing. Not in Ada. Maxwell was an outlier where the x70 was at its cheapest and the x60 was relatively mediocre (and even then the GTX 970 had its issues, like the whole VRAM situation), but Maxwell was an outlier, not the "typical", in your words, trend. Go back further to generations before Maxwell, like Kepler, Fermi, or even further back, and the gaps are similar to the ones I'm describing.
People will do anything including straight up lie about factual performance to claim this current generation isn't an outlier and it's wild...
the "normal pricing" started around the 300 or 400 series.. and ended with the 900 series... (already with the 10 series, prices were a bit increased)
and in EVERY ONE of these series..
a Titan was 1050 euro
an xx80 Ti was 700 euro
an xx80 was 500 euro
an xx70 was 350 euro
an xx60 was 200 euro
there was of course a little deviation.. like in one gen the 80 Ti was more like 680, in another more like 720..
so add +/- 10% deviation per gen on these prices
but they were pretty much the same for a good decade.
and I do think it's normal that an xx60, as a budget card, performs less than half as well as an xx70, as a midrange card, for more than half the price.
(and in fact in this 4xxx gen it is an outlier:
the 4060 only costs 50% of a 4070, not 60%,
and it performs at 50% (at 1440p) to 60% (at 1080p) of a 4070, which is higher than the 40% it "should".
this gen's 4060 is an outlier in that it is cheaper and performs better than it should.)
Even if you add a 10% deviation to your pricing, your earlier gaps make no sense. I wasn't nitpicking about 10% deviations here. I was asking why you were claiming there was a doubling of performance and price where there was typically half of that at best. That's not nitpicking; that's a "what planet were you on?" situation.
You're quoting 100% gaps in pricing and performance going from the x60 to x70, and again from x70 to base x80?
The only tiers where there was traditionally a roughly doubling of performance (plus or minus) was between the x50 to x60. That's it. To get another doubling of performance, you would have had to compare the x60 to x80 (and often times the x80 Ti), not the x70.
The x70 was usually a worse value than the x60, and offered less performance (and sometimes also less value) than the x80. So it usually made sense to just skip it and go for the x80 if you wanted more than the x70, but yes, sometimes budgets force us to accept the lower value. And that's fine; the x70 serves its role, but I'm not sure why you're claiming that role was better value than the x60. That was almost never true.
See Ampere for reference. The RTX 3060 Ti and RTX 3080 were good values. The RTX 3070 and RTX 3070 Ti were not (and they had 8 GB VRAM too, which was seen as too little for the times and for the level of performance they offer).
Same with Pascal. The GTX 1060 offered 75% of the performance of the GTX 1070, but cost 65% of the price.
Maxwell is the only likely exception that comes to mind, but both the x60 and x70 were compromised here (again, since we're discussing past generations so the current one being another exception is out of the picture here).
I interestingly have the opposite opinion, and I used to buy precisely the x60 products and I used to like them. But the more I look at modern x60 products, the more they appear to be less than they used to be.
They used to be good, but the RTX 4060 in particular is somewhat mediocre. It's cheaper specifically because it has to be; because it has a low generational performance uplift, and it loses VRAM and bus width. It's too compromised. It's a glorified x50 product.
I know the RTX 4070 isn't a sufficient answer for everyone, since many people have budgets below this point, but it really feels like the lowest you can "blanket recommend" among nVidia's current offerings. It fills the spot the x60 used to. You used to be able to blanket recommend any x60 product and never feel wrong. Now the x60 feels bad to recommend. It's an "only if you can't save for an RTX 4070" solution, and that is by design. nVidia designed this lineup to push mainstream users up to the x70, and enthusiasts up to the x90. It's an upsell generation because they couldn't offer the traditional uplifts without cannibalizing the previous generation, which they had too much of an oversupply of. Maybe the RTX 50 series will be better (and I hope it is!), but this generation is certainly an outlier compared to past norms.
The RTX 4060, RTX 4070 Ti, and RTX 4080 are all relatively mediocre values to me. With the RTX 4090's pricing going up, the RTX 4080 starts to look a bit better, but I'm hoping the SUPER refreshes fix both the RTX 4070 Ti's and RTX 4080's bad offerings. Unfortunately, the x60 tier will not get a SUPER refresh (yet?), so the RTX 4060 will remain somewhat mediocre (at least for now).
as a general rule, one replaces one's GPU once every 2 generations, staying in the same quality bracket.
so: 680 Ti, then skip the 700 series, buy a 980 Ti, skip the 1xxx series, buy a 2080 Ti, skip the 3xxx series, buy a 4080 Ti
or: 570, skip the 6xx series, buy a 770, skip the 9xx series, buy a 1070, skip the 2xxx series, buy a 3070, skip the 4xxx series
and the low-end card is the xx60.
so each generation, 50% of all gamers should be looking to replace their card.
and nobody should still be gaming on a 2xxx series or older.
(except those on a 2080 Ti, as the 4080 Ti has not come out yet)
nor should anybody be using an xx50 or xx50 Ti.
we should only see the 3060 and up and the 4060 and up.
but yeah, the price spikes have made it expensive to stay in your bracket.
replacing a 700 euro 980 Ti with a 1600 euro 2080 Ti did hurt.
and stepping down to a mere 2080 and still paying 1200?? never! I so hope that when the 4080 Ti releases, it's priced better..
and yeah, I liked it much better when the xx80 Ti cost 600-700 euro.. not more than a Titan used to cost.
You should replace something only when you want more performance.
And you should replace it with the best thing within your budget.
It doesn't need to be harder than that.
That's nonsense.
People should use things as long as they prove sufficient, and there's no need to buy more than they need, nor sooner than they need to. Using an x50 is fine, and sticking with a card more than every other generation is fine. Why waste something if it's still proving useful?
Conversely, this goes the other way too. Maybe you have something only one generation old and already want more. Are those people wrong, too? No, they're not.
Get what you want, when you want it, and only then. Again, it's not harder than that.
I thought the same. I'd kept my GTX 680 since 2013 with no issues, until this year when I got Battlefield 2042 and security updates expired. Nothing wrong with that card, not even for my PC with the games I have.
I kept my GTX 1060 until this year. I'm actually using it right now as I wait on the other to come back from RMA.
these rules are what define a gamer; gamers have done this since the 80s.
sure, you have the "filthy casuals", but those aren't gamers ....;)
1 you decide whether gaming is your main hobby. if not:
don't invest in a PC; get a console.
2 you look at what your monthly budget allows.
(45, 75 or 100+ euro per month)
3 you start setting that amount aside each month. consider it a fixed expense, like rent. do not touch what you set aside for the PC for anything else.
4 after you've reached your starting amount, you buy your first system.. from then on you follow your update schedule loyally, and keep setting aside the same monthly amount.
you are allowed to delay a replacement or upgrade by 3 months if the launch of a new series is near (that's only logical), but not 6..
that way you keep costs within your budget and keep gaming nicely.
better to be a permanent low-end gamer with frequent replacements than to buy 1 mid-range PC and not replace it for a decade.
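The budgeting steps above reduce to simple arithmetic. A minimal sketch, where the 600 euro target card is a hypothetical example and the monthly tiers are the ones suggested in step 2:

```python
import math

def months_to_save(target_eur: float, monthly_eur: float) -> int:
    """Whole months needed to set aside target_eur at monthly_eur per month."""
    return math.ceil(target_eur / monthly_eur)

# e.g. saving up for a hypothetical 600 euro midrange card:
for monthly in (45, 75, 100):
    print(f"{monthly} euro/month -> {months_to_save(600, monthly)} months")
```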