Also, for complex calculations (non-gaming) it can still be useful. 2x 3090 Ti in SLI likely still gives you a sliver more raw power for your rendering tasks than a single 4090... but that will be over with the 5000 series.
Having two graphics cards will, however, remain VERY useful:
*** You can, for example, have one card fully dedicated to game performance while the other one handles recording and streaming... two different tasks, each with its own GPU.
*** You can also run a virtual machine and allocate one GPU to it... while you still have graphical output through the other one.
*** You can even run TWO virtual machines... give each one card, some of the RAM, and some of the CPU cores, and thus have two users share a PC in just one case... they can even LAN-game against each other.
-> And given that CPU performance is rarely the bottleneck, it might be vastly cheaper than two separate computers.
Those are just three situations where running two different graphical tasks at the same time can be beneficial.
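The game-plus-streaming split above can be sketched in code. This is a minimal sketch, assuming CUDA-capable cards and CUDA-aware applications: `CUDA_VISIBLE_DEVICES` is a real environment variable that CUDA applications honor to restrict which GPUs they can see, but the command names below are placeholders, not real programs.

```python
import os
import subprocess

def env_for_gpu(gpu_index: int) -> dict:
    """Build an environment that exposes only one GPU to a CUDA app."""
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)
    return env

def launch_on_gpu(cmd: list, gpu_index: int) -> subprocess.Popen:
    """Start a process that can only see the chosen GPU."""
    return subprocess.Popen(cmd, env=env_for_gpu(gpu_index))

# Hypothetical usage: game on GPU 0, capture/encoder on GPU 1.
# game = launch_on_gpu(["./game"], 0)
# stream = launch_on_gpu(["./encoder"], 1)
```

Each child process then treats "its" card as GPU 0, so neither workload competes for the other's card.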
Yes, that would work. GPU-Z has a drop-down menu at the bottom of its first page for switching between GPUs; then apply a load to the GPU to see if the PCIe mode changes, as it can drop down to x4 when idle.
If you were using a multiple-monitor setup, that might also make sense. My understanding is that each monitor independently uses the resources of the graphics card to which it is directly connected, but there might be some fiddly caveats of which I am unaware.
S.L.I. was dead on arrival, to the best of my understanding. A higher-tier G.P.U. usually performs better than two lower-tier cards in S.L.I., and for less money. The scaling only really makes sense once you run out of G.P.U. tiers to upgrade to, i.e. if you are a member of the aristocracy. Maybe it also works as an upgrade path (buy the best card you can afford now, then another of the same card after it significantly depreciates in value, instead of buying a new card).
Even when it would make sense for the money, it's a bit of a more... inconsistent experience for the same F.P.S. averages. Think of a strongman being strong enough to lift a barbell, only to clumsily trip on his shoelaces and drop it anyway. XP
It never really became a popular system configuration for these reasons, which is likely why Nvidia killed it.
But it was still not cost-effective vs. buying a single better GPU.
Each GPU would draw alternating lines, which were combined when sent to the display.
Nvidia has changed the SLI method twice since then.
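That alternating-line approach is the original "scan-line interleave" scheme. A toy sketch of the combining step, assuming each card's output is just a list of rows (real hardware merges the video signals, but this shows the idea):

```python
def combine_sli(buf_a, buf_b):
    """Merge two half-rendered framebuffers: card A drew the even rows,
    card B drew the odd rows; interleave them into the final frame."""
    assert len(buf_a) == len(buf_b)
    frame = []
    for y in range(len(buf_a) * 2):
        source = buf_a if y % 2 == 0 else buf_b
        frame.append(source[y // 2])
    return frame

# Card A rendered rows 0 and 2; card B rendered rows 1 and 3.
even = ["row0", "row2"]
odd = ["row1", "row3"]
print(combine_sli(even, odd))  # → ['row0', 'row1', 'row2', 'row3']
```

Each card only fills half the scanlines, which is why the workload split is roughly even regardless of what is on screen.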