I lol'd hard at this guy; last time I checked I paid $215 USD for my 3570K.
Love how people make stuff up
Though it kinda feels like Nvidia's Titan: it's way too expensive for most people, and the price/performance doesn't scale, but it's arguably the best GPU out there. If you have the money you'll get it; if you don't, no chance in...
That being said, I would like to see benchmarks on AMD's new CPU and compare it to Intel's i7s. I also hope that AMD will be able to use what they have learned from this to make a more affordable 5 GHz CPU that will give Intel a run for their money by this time next year.
Check it out vs an overclocked 4.4 GHz i7-3960X Extreme in the gaming benchmarks:
http://www.kitguru.net/components/cpu/zardon/amd-fx9590-5ghz-review-w-gigabyte-990fxa-ud5/18/
I assume you're taking electronics classes at this point. Keep in mind that the power dissipated in any circuit depends on both the current and the voltage; you can't have power dissipation without both. I'm glad we agree on that, and the higher the core voltage you apply to the CPU, the higher the associated power dissipation is going to be. That's where this whole conversation started: these newly marketed CPUs run at a 1.65V core voltage.
http://software.intel.com/en-us/blogs/2009/08/25/why-p-scales-as-cv2f-is-so-obvious-pt-2-2
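The scaling described in that link (dynamic power roughly following P = C·V²·f) can be sketched numerically. A minimal sketch; the capacitance and frequency values below are arbitrary placeholders, and 1.30 V is just an assumed "typical" core voltage for comparison against the 1.65 V figure mentioned above:

```python
def dynamic_power(c, v, f):
    """Approximate dynamic power dissipation: P = C * V^2 * f."""
    return c * v**2 * f

C = 1.0    # effective switched capacitance (arbitrary units, cancels in the ratio)
f = 5.0e9  # 5 GHz clock (same for both cases, also cancels in the ratio)

p_stock = dynamic_power(C, 1.30, f)  # an assumed more typical core voltage
p_high  = dynamic_power(C, 1.65, f)  # the 1.65 V core voltage discussed above

ratio = p_high / p_stock
print(f"1.65 V dissipates about {ratio:.2f}x the power of 1.30 V at the same clock")
# (1.65 / 1.30)^2 ~= 1.61, so roughly 60% more dynamic power from voltage alone
```

Since C and f are held constant, only the V² term matters here; in practice a higher voltage is usually paired with a higher clock, which multiplies the increase further.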
These new AMD CPUs are just an attempt by AMD to provide something comparable to Intel's Extreme series of CPUs.
http://www.cpubenchmark.net/cpu_lookup.php?cpu=AMD+FX-8350+Eight-Core
The Intel chips just slightly above and below it are more expensive.
Primitive technologies are always less expensive to produce and can therefore be sold for less. Besides, pretty much everything above the 8350 on that list is a Xeon or a socket 2011 chip. The Xeons can't be compared to an 8350, as they are for heavy computing, not general usage and gaming, and pretty much any socket 2011 chip will obliterate anything from the AMD camp. That's not even taking into account the Ivy Bridge-E series, which should be 30% faster than the current E CPUs and be around the same price (after launch inflation subsides). Your claim of "five times more expensive" only holds true for the Xeon chips, which in terms of sheer computing power cannot be matched by anything AMD can produce, and that includes the 9590 with its silly clock speed.
Pay more = Get more
That's the way the world works, and that's the way it will always work. Just look at the top of the list for proof.
There comes a point where you get much less bang for your buck, and no matter how much money you throw at it, you will only get marginally more. Since I am on a budget, the 8350 seems like a great bang-for-buck ratio, and it doesn't run at that silly 220 W TDP.
Also, from what I can gather from the chart, the 8350 is way better than the 3570K for the same price.
http://www.techspot.com/review/670-metro-last-light-performance/page6.html - no it's not.
These debates have been held countless times on these forums. Yes, AMD's CPUs do number crunching better than Intel's, and that's about it. And in the other case, when Rove suggests AMD cards just because they have higher GFLOPS: that only means those cards can do number crunching better than Nvidia's, and when you look at real-world benchmarks the situation is different. I'm not saying AMD cards are bad; they would perform far better than Nvidia's if AMD released a proper driver for once.
http://www.techspot.com/review/642-crysis-3-performance/page6.html - in this case, the 8350 at stock clocks is closer to the 3470, and when benchmarking CPUs in games, every 1-2 fps counts, because it's your GPU that does most of the job. Just ask the Planetside 2 community how their CPUs perform. (Even a 9800 GT can run that game on ultra, but you need a good CPU for larger battles.)
http://cpuboss.com/cpus/Intel-Core-i5-3570K-vs-AMD-FX-8350 - another comparison
Overall, that site has some really weird results; some CPUs score absurdly high while others score really low. I remember seeing the Q6600 score something like 7k on that site, which is crazy. I hope they fixed that, just so they don't look so silly anymore.
To cut it short, real-world benchmarks are all that matter; you can stick your number crunching where the sun doesn't shine if that CPU is going to bottleneck new graphics cards or simply suck in games.
As for all these 3rd-party reviewers: while it is illegal to advertise your own product as something it's not, it is not really illegal if someone else does it for you, as long as you are not paying them to. Who knows the difference, as long as you don't get caught paying them off?
Many of these 3rd-party reviewers spit out numbers using the latest and potentially most buggy games. Mostly I have stopped paying attention and stopped believing anything some journalist who-knows-where in the world says in their review. The numbers don't add up at all.
I get 60+ FPS in all my games unless the software is glitchy.
The PS3 is only 153 GFLOPS and spits out playable framerates. The Xbox 360 has less GFLOPS and is still playable.
Yet some stupid or corrupt reviewers claim that 1500+ GFLOP GPUs with decent CPUs aren't even getting playable framerates in Crysis 3 and other newer console titles, while 4000+ GFLOP parts like the HD 7970 GE or GTX Titan are "struggling" to get some absurdly low FPS under 60 that is "barely" playable.
If the games were actually programmed that poorly, and the "optimization difference" between console and PC code is really that bad, the games should never have been released; they should have spent that time rewriting the PC compilers instead.
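For scale, the throughput gap implied by the GFLOPS figures quoted above works out like this (raw FLOPS ratios only; this deliberately ignores drivers, API overhead, resolution, and quality settings, which is the whole point of contention):

```python
# Raw-throughput comparison using the GFLOPS figures quoted in the post above.
ps3_gflops = 153        # PS3 figure cited above
mid_gpu_gflops = 1500   # a "1500+ GFLOP" desktop GPU
hd7970ge_gflops = 4000  # HD 7970 GHz Edition class, "4000+ GFLOP"

# Ratios of raw compute throughput relative to the PS3.
print(f"Mid-range GPU vs PS3: {mid_gpu_gflops / ps3_gflops:.1f}x raw throughput")
print(f"HD 7970 GE vs PS3:    {hd7970ge_gflops / ps3_gflops:.1f}x raw throughput")
```

So the desktop parts in question have roughly 10x to 26x the PS3's raw compute, which is the gap the poster finds impossible to square with the reported framerates.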
It's all a bunch of lies I say!
Use your own eyes, and brain.
Intel for offices, again and again.
http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/2
http://www.anandtech.com/show/6201/amd-details-its-3rd-gen-steamroller-architecture
I don't think trusted review sites try to skew the numbers.