Best bet: stick with an Nvidia GTX 970 or GTX 980 for now, stay at 1080p or 1440p resolutions, and simply max those out. Then wait for at least another 2-3 years.
Nvidia Pascal is delayed but should arrive sometime in 2016 or later.
The GTX Titan X is a mere stepping stone compared to it. It will be 10 times more powerful! Seriously next-gen stuff. With NVLink and PCI-e 4.0 on the motherboard, it can transfer 80GB/s of bandwidth... fast enough to load an entire Blu-ray disc in the time it takes to insert it into the drive. Completely overkill for today's 1080p or 1440p monitors; it would more likely be used to run UltraHD 4K and higher monitors off a single card.
Note: To clarify "10x more powerful" - that's in terms of CUDA compute performance. Definitely impressive, but don't expect the first Pascal card to actually deliver 10 times the gaming performance of the Titan X.
It does allow for crazy stuff, though - up to 8-way SLI (with minimal performance loss per card) through its NVLink tech, the ability to utilise up to 32GB of RAM, etc. Better, more realistic graphics.
Nvidia's next-next GPU chip, Volta, is pencilled in for 2018. Reports that it will be able to "power a small moon" are said to be "slightly overstating things". Still, they really have removed a lot of bottlenecks on the power still to come.
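As a quick sanity check on that Blu-ray claim, here's a back-of-envelope calculation. The 50GB figure for a dual-layer Blu-ray and the sustained 80GB/s rate are assumptions for illustration, not spec-sheet numbers:

```python
# Back-of-envelope check: how long would a full Blu-ray take at NVLink speeds?
# Assumptions: dual-layer Blu-ray = 50 GB, sustained link rate = 80 GB/s.
nvlink_bandwidth_gb_s = 80   # quoted NVLink bandwidth
bluray_capacity_gb = 50      # assumed dual-layer disc capacity

transfer_time_s = bluray_capacity_gb / nvlink_bandwidth_gb_s
print(f"Time to move a full Blu-ray at 80 GB/s: {transfer_time_s:.3f} s")
```

So a bit over half a second, which is indeed faster than the disc tray closes.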
https://www.youtube.com/watch?v=Tb7ZYSTYHbw
Anyway all this is certainly exciting news. It looks like the next wave of GPUs are going to be monsters!
Thanks for the info.
Note: A single Titan X (OC) is very close in performance to two GTX 980s in SLI. It even beats them sometimes!
https://www.youtube.com/watch?v=CpmuXd52Ce0
Titan X in SLI anyone?
Yeah, it's a serious beast compared to what's currently available. As for the price tag though... you have to consider the cost-to-performance ratio.
It costs a whopping $1838.85 NZD or $1999 NZD (12GB version) here for one! That's approx $1500 USD.
Whereas a single overclocked GTX 980 costs $890 NZD or less, so you could get two or three of them in SLI. That's approx $670 USD x 2 for SLI = $1340 USD.
Maybe the pricing here is just crazy at the moment and it's better in other countries? I'm pretty sure we have some serious markup on them. Of course a single card can beat SLI depending on the game (better cooling / quieter / lower power usage / etc), but it's still overpriced in my eyes.
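For what it's worth, the NZD-to-USD numbers above line up at roughly a 0.75 exchange rate. A quick sketch of the comparison (the rate is an assumption that fits the quoted figures, not an official one):

```python
# Rough cost comparison from the NZD prices quoted above.
titan_x_nzd = 1999       # 12GB Titan X, local NZ price
gtx_980_nzd = 890        # single overclocked GTX 980
nzd_to_usd = 0.75        # assumed exchange rate at the time

titan_x_usd = titan_x_nzd * nzd_to_usd
sli_980_usd = 2 * gtx_980_nzd * nzd_to_usd
print(f"Titan X: ~${titan_x_usd:.0f} USD")
print(f"2x GTX 980 SLI: ~${sli_980_usd:.0f} USD")
```

Two 980s in SLI still come out cheaper than a single Titan X at these prices.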
Get the Titan X. I have one now and another on the way. It's a badass card that will last a long time. A single card that can compete with 2x 980s. They are clocking around 1500 with the current voltage locks, but once those are unlocked they could be pushing 1700 on water. Demand is higher than ever and they sell out in minutes. The card will also hold its value like the OG Titan.
Using the "next thing" excuse is completely pointless. The next thing will always be faster and better. If you use that argument, then when do you ever upgrade?
That's why they joke that the "Volta" cards will be able to power a small moon. It's a simulation of the moon and the shading/effects from the craters all over it, etc. Nvidia debunked the moon landing conspiracies with the new GTX 900 series by simulating the Apollo 11 moon landing and all its reflections... This is an example of using CUDA and VXGI. They tend to like the moon simulation for testing purposes.
http://www.nvidia.com/coolstuff/demos#!/apollo-11
ps: Showing once and for all that the moon landing photos weren't faked, but rather merely realistic CGI rendered at Area 51 :)
---
In 2014, Nvidia removed a 12-year-old memory bandwidth bottleneck that had been capping off performance advancement.
Consider that against the future GPU roadmap - it's a huge leap...
Memory Bandwidth:
http://oi58.tinypic.com/n2ga48.jpg
Nvidia GPU Roadmap:
http://oi62.tinypic.com/124e9lw.jpg
Around 2016 - Intel Skylake motherboards with PCI-e 4.0 (Intel and Nvidia worked together on this), DDR4, etc., coupled with Nvidia Pascal and NVLink, etc. Then on top of that, DirectX 12 arrives for even more direct graphics calls. Current PCs will most likely start to look slowly stone-age in comparison.
It's kinda like when motherboards previously moved from DDR2 to DDR3 and PCI-e 3.0 (x16) graphics card slots were first introduced on next-gen motherboards. Things just started to double in performance and slowly became the norm over time, with higher resolutions and better gaming to match. Of course this doesn't happen overnight.
---
"Using the "next thing" excuse is completely pointless. The next thing will always be faster and better. If you use that argument, then when do you ever upgrade?"
No. I work on a tick-tock development cycle (Intel, Nvidia, Microsoft, and most other major companies all follow this) and actually engage with and contact those companies as a developer myself.
For example, I would personally skip the Intel Broadwell (5th-gen) motherboards and wait for Intel Skylake (6th-gen) to be released. Why? It's not because Broadwell is bad... far from it. But it's a mere stepping stone and will be shorter-lived. It's just the tick to the tock.
---
All that said... ask yourself: what monitor do you want to run? 1080p, 1440p or UltraHD 4K? 60Hz or 120/144Hz? A single monitor or Nvidia Surround (3 or more monitors)? Then work out what you need to achieve those results.
I just think it's silly to downplay the TX because of future technology, which we have no idea how it will perform in the real world. For gaming, it is a much better card than the OG Titan was at release.
This is going to be the year of awesome gaming monitors too. 1440p 144Hz, which a single TX is perfect for, will become the standard for gaming in the market... and don't forget 21:9 gaming monitors. There is no way I could make it till the end of 2016 after the XR341CK is released.
You only need it if you're on a high res. The 9xx series doesn't cut it with 4GB VRAM (and the 980 Ti won't either with 6GB). A large portion of newer games are pushing 5-6GB of VRAM at 1440p and up.
All black is sexy though :)
Report back if you overclock.
If the pricing for you there is cheaper, then it might be more worthwhile. Just not for me.
Lords of the Fallen uses about 6.2GB when maxed out at 1440p, and Mordor about the same depending on the area.
Being able to maintain a high FPS at 1440p without the headaches of SLI makes it worth the purchase for me at least, even though I admit it is priced too high.
Looking forward to testing GTA V with the card.
https://www.youtube.com/watch?v=kugBkHhPeNM
50 to 60 frames maxed out @ 1440P.
While I can't test Crysis 1, this is the first card that can run Crysis 3 maxed out (SMAA) at 1440p with no issues, so I'd imagine it will breeze through the original =P