So ultimately it depends on you.
I've been seeing you rag on the RX 580 for this reason multiple times now, and I think it's about time to put a stop to this nonsense once and for all.
At 185 watts the RX 580 isn't the most efficient card, but you're looking at running about 113 watts anyway[www.amd.com] if you got, say, an RX 6500 XT. The RX 580 isn't quite as good as an RX 6500 XT, but the difference is less than 10%[technical.city], so we can consider them comparable.
Let's say electricity costs you 15 cents per kilowatt-hour[www.eia.gov]. Then, if we assume both cards run at full load for 8 hours a day, the difference in running cost rounds up to $32 per year[www.calculator.net].
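As a sanity check, that math fits in a few lines. The wattages are AMD's specified TBP figures, and the $0.15/kWh rate and 8-hour day are the assumptions stated above:

```python
def yearly_cost(watts, hours_per_day=8.0, rate_per_kwh=0.15):
    """Electricity cost per year at a constant load, in dollars."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# AMD's specified TBP: 185 W for the RX 580, 113 W for the RX 6500 XT.
difference = yearly_cost(185) - yearly_cost(113)
print(round(difference))  # -> 32 (dollars per year)
```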
Right now the going rate for an 8 GB RX 580[www.amazon.com] on Amazon is $90. An 8 GB RX 6500 XT is more like $180[www.amazon.com].
If we assume you run the card for 3 straight years before upgrading, then the extra electrical cost of the RX 580 would run you $92, so you'd be spending $182 total. Under these conditions, you would've been better off with an RX 6500 XT, but only by $2 and about a 7% difference in performance.
Moreover, that's not a particularly realistic scenario. The odds are you're not on your computer 8 hours a day, because that'd be every spare moment you have on a 9-to-5 workday, and even if you are, much of that time is going to be closer to idle, which is more like 10 watts than 185. The RX 6500 XT couldn't possibly hold a constant 72 watt lead at all times.
Also, if you're short on cash in the here and now, buying the cheaper card is basically like taking out a payday advance: you get an entry-level card today and pay the difference back through your power bill over the course of years.
Realistically, even though electricity is going to eat into some indeterminate amount of the savings, you're still going to save a considerable chunk of money from the difference in capital cost before both cards start to show their age and it's time to upgrade again.
The RX 580 is still a perfectly reasonable choice unless you're some pro gamer whose job it is to play video games 8 hours a day in the first place, and who then plays 8 hours more in his or her spare time. But if that's the sort of person you are, then you shouldn't even be looking at cheap entry-level cards in the first place!
An RX 580 isn't going to increase your electric bill by $50 a month. Why even worry about it?
We were judging based on AMD's specified TBP for both cards before. If we're going to do this based on actual power consumption, then we need actual tested power consumption for both cards before reaching a conclusion. Otherwise it's not a reasonable basis of comparison, because the RX 6500 XT could be drawing more than the company claims too. Also, if we're going to draw general conclusions rather than compare very specific board partner models, we'd preferably base them on testing of multiple models, to get an average and eliminate outliers. Even between units of the same model, manufacturing tolerances can make one card better or worse than another.
But look, even if I concede the point with no adjustment to the expected power consumption of the RX 6500 XT, 198 watts is only 13 watts more than 185, so the power consumption differential only grows from 72 watts to 85 watts and the yearly power cost goes up from $32 to $37 (rounding down). The difference is basically $5 per year, which adds up to $15 over the course of 3 years. That means instead of $182 it's $197 in the worst-case scenario, and again, I see no circumstance in which you're realizing a full 72 or 85 watt power differential for 8 full hours per day.
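Plugging the conceded 198 watt figure into the same back-of-the-envelope formula (still assuming the $0.15/kWh rate and 8-hour day from before):

```python
def yearly_cost(watts, hours_per_day=8.0, rate_per_kwh=0.15):
    """Electricity cost per year at a constant load, in dollars."""
    return watts / 1000 * hours_per_day * 365 * rate_per_kwh

# 198 W measured for the RX 580 vs AMD's 113 W TBP for the RX 6500 XT,
# an 85 W differential instead of 72 W.
diff_per_year = yearly_cost(198) - yearly_cost(113)
print(int(diff_per_year))  # -> 37, up from 32 in the TBP-based estimate
```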
We're not coming to a wildly different overall conclusion here.
Also, hey, if we're going to be super picky about the specifics, let's not forget the extra money you save should be gaining interest in the bank before you spend it on the electrical bills. >_>
Well, first: if an RX 6500 XT is a better-performing card, even if only slightly, then one could argue that if it's so much more energy efficient that the total cost, purchase price plus electricity over the operational lifespan, comes out the same for both cards, you would have been better off just spending the extra upfront cost of an RX 6500 XT in the first place for the extra performance and features it provides.
Second, if you're running the card into the dirt over the course of a decade instead of upgrading within a 3 year cycle, then trying to save a quick buck now could hypothetically have you pissing money down the toilet in the long run.
Plus there's the environment to consider, maybe.
We really can't say without system specs. Preferably specs that include your CPU model, your PSU model, and your RAM quantity/speed. We'd also need to know your region to get an idea of fair market pricing, unless you're okay with a recommendation based on U.S. pricing (which I believe is the most statistically likely assumption on an English-speaking forum).
However, the RX 580 is fine for starters. It meets the minimum specs of basically every new game on the market, and it offers stronger performance than any card you can buy for $140 or less, except for its big brother, the RX 590. This includes some newer and more expensive models like the Arc A380[technical.city] and the GTX 1650[technical.city].
You have to jump up to at least $180 and start looking at cards like the Arc A580[technical.city] or the RTX 3050[technical.city] to get significantly more performance, something like 40-80% more. Way better than an RX 6500 XT, admittedly (the RX 6500 XT was just the first low-cost model that popped into my head for comparison).
80% extra performance for double the price isn't price-to-performance parity though (it falls short by 20%), and the RTX 3050 is a 130 watt card anyway, so my above calculations apply. The RX 580 actually consumes 194 watts under load, so it may cost you even more to run at full load. You can verify these numbers in Tom's Hardware's review of the RX 580[www.tomshardware.com]. Granted, with 80% extra performance it'd take a heavier workload to push the faster card to full load, but I'm not sure how to account for that.
Regardless, if you already have an RX 580, then I wouldn't begin to look at getting a new card at least until Intel Battlemage is released later this year and shakes up the market. You're fine, at least in the short run. The only games you won't be able to get running are the ones that require ray tracing.
Yeah, you're right. If we cared that much about nature and the environment we would've chosen a different hobby, like hiking or bicycling. Instead we're playing games while Hexus oozes about in our devices and sings Toxic Love. Wanna go choke on some smog? It's like smoking without the cigarettes! :-P