Maybe the adapter? Or maybe you're connecting dual-link DVI to a single-link DVI port and it just doesn't work? Maybe you have DisplayPort? That would be a better way to connect.
That's all I can think of right now.
I have two adapters (my previous GPU had two DVI ports, so I bought two adapters for my HDMI cables), and switching them doesn't seem to make any difference. I'd be surprised if it did, because one minute before installing the new GPU I was using both of them with the old GPU, and both worked fine.
I'm 100% sure I'm not connecting dual-link DVI to single-link DVI, yes. My monitor has a DisplayPort, but from what I've read online everyone is having issues with DisplayPort on the GTX 970 and GTX 980, so I'm kind of afraid it won't help. I've ordered a DP-to-DP cable from Amazon, and tomorrow I should be able to test it.
It's the first time I've used an NVIDIA GPU (no particular reason; every time I had to buy a new GPU, the best price/performance ratio just happened to be on a Radeon), so I'm a bit in the dark as to where to fiddle with the settings.
Did you find a solution for this problem yet? I am having a similar issue on a GTX 980 card. Whichever monitor is plugged into HDMI works fine, but whatever is plugged into DVI will not work until I physically unplug the cable and plug it back in once the machine is booted. Both monitors worked great with my 780, but as soon as I swapped cards, everything on DVI stopped working.
Ensure your graphics card drivers are up to date, fully installed, and fully working (a quick version check is sketched after this post).
Are you using the DVI-I (analog and digital) or the DVI-D (digital only) port?
Note the difference between them. Ideally you want a digital signal, so either port will do, unless your monitors are running analog only (or trying to use analog rather than digital). I think all GTX 980s have a DVI-I port, so make sure the signal is digital (if the monitor can toggle between the two).
DisplayPort shouldn't have issues; I'm using that. Then again, everything is working fine and 100% on mine (an Asus Strix), so it might depend on your model. Which port you use shouldn't matter at all until you get to NVIDIA Surround (three or more monitors); for that, the website itself suggests which connection layouts are possible (six of them). That is probably what people are complaining about: they're just unaware the center monitor needs to be connected to the first card in an SLI setup, etc. It shouldn't affect you with just two monitors.
Adapters are the main concern; if possible, go direct: HDMI to HDMI, a dual-link DVI cable, or a DisplayPort cable. DisplayPort is meant to be the best. HDMI and DVI compare about the same, except HDMI also carries an audio signal to the monitor, something 99.9% of users won't need anyway. It's best to use a dual-link DVI or DisplayPort cable in most cases, with no adapters/converters (it's only like $15 max per cable anyway).
http://en.wikipedia.org/wiki/Digital_Visual_Interface#DVI_and_HDMI_compatibility
See paragraph after bullet list.
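Not from the thread itself, but if you want to confirm which driver the card is actually running, NVIDIA's nvidia-smi tool (installed with the driver) can report it. A minimal Python wrapper, assuming nvidia-smi is on your PATH:
[code]
import subprocess

# Ask the NVIDIA driver for its version and the GPU name.
# Assumes nvidia-smi is on PATH (on Windows it usually lives in C:\Windows\System32).
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "347.52, GeForce GTX 980"
[/code]
If that errors out or reports a much older version than NVIDIA's site offers, a clean driver reinstall is the obvious first step.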
I was thinking about getting a GTX 970, but I wasn't sure my PSU or my PC from 2010 were up to it (PCIe version? No UEFI), so I just got a GTX 750 Ti for now. It uses half the power of my GTX 550 Ti, so my entire PC draws little more in total than the 145 watts of a 970 alone (an i5 650 @ 3.2 GHz with the GTX 750 Ti draws not much over 150 watts max AC input on a Kill A Watt meter).
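For what it's worth, that power math is easy to sanity-check. A back-of-the-envelope sketch in Python (the PSU rating and the 750 Ti's ~60 W TDP are my assumptions; the other figures are from the post):
[code]
# Rough PSU headroom estimate: swapping cards adds roughly the
# difference between the two cards' rated TDPs to the measured draw.
psu_rating_w = 400       # hypothetical PSU label rating; check your own
whole_pc_draw_w = 150    # measured AC draw with the GTX 750 Ti (Kill A Watt)
gtx970_tdp_w = 145       # NVIDIA's rated TDP for the GTX 970 (from the post)
gtx750ti_tdp_w = 60      # rated TDP for the GTX 750 Ti (my assumption)

estimated_draw_w = whole_pc_draw_w + (gtx970_tdp_w - gtx750ti_tdp_w)
print(f"Estimated peak AC draw with a GTX 970: ~{estimated_draw_w} W "
      f"of a {psu_rating_w} W supply")
[/code]
That comes out around 235 W at the wall, which is why an older PSU is often fine; the bigger worry on a 2010 board is the PCIe power connectors, not the wattage.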
I bought a DP-to-DP cable to connect my second monitor without any adapter, but it didn't fix the problem. The second monitor started working when, after a few days, I had to make a Skype video call and turned on the webcam (which is integrated into my first monitor): when I switched the webcam on, both screens turned black, then flashed, then suddenly both worked at the same time.
I don't know why, and honestly... at that point I didn't care why, I was just glad it was working! Since then (the beginning of December) everything seems to work fine. I was a little hesitant to install the latest NVIDIA driver update, so I selected the "clean install" option to be safe, and things are still working.
Go to the NVIDIA Control Panel and set the main monitor to #1.
Thank you!
YEP! ( :
Nvidia Control Panel > Display > Set up multiple displays
There was a box that said "select the displays you want to use", and somehow the box for one of my monitors was unchecked.
Thanks!
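If you want to double-check the same thing without opening the control panel, Windows' EnumDisplayDevices API lists which displays the OS considers attached and which one is primary. A minimal ctypes sketch (Windows-only; a diagnostic aid, not part of the fix above):
[code]
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

ATTACHED = 0x1  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY = 0x4   # DISPLAY_DEVICE_PRIMARY_DEVICE

dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
# Walk the display outputs Windows knows about and show their state flags.
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    print(f"{dev.DeviceName}: {dev.DeviceString}"
          f" attached={bool(dev.StateFlags & ATTACHED)}"
          f" primary={bool(dev.StateFlags & PRIMARY)}")
    i += 1
[/code]
A monitor that shows up here but with attached=False is exactly the "unchecked box" case from the post above.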
However, if connected via HDMI it works fine.
I have used DVI and HDMI as dual monitors for 10 years now, and with this new card I am stuck with one.
The way it drops off when I hit the login screen tells me it could be a driver issue?
Has anyone actually fixed this?
Check your Windows services (if you're on Windows). Look for services that deal with things like User Manager, the User Profile Service, and so on.
I ran into a similar situation where, immediately after logon, my monitor would disconnect, and I'd have to unplug the second monitor and reboot in order to use my primary one. I would have to format/restore to fix it; it was that serious. I spent some time on it and eventually realized the default Windows services weren't the cause. The reason it kept happening was that I had done massive service tweaking and disabled practically everything I thought I didn't need. I had to just stop touching a couple of services that had to do with multiple users or something like that. One weird thing about my setup is that in the multiple-monitor config the left monitor is always 2 and the right one is always 1. I hate that, but it doesn't really affect me too much.
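A quick way to scan for the user/profile/session services mentioned above is the third-party psutil library (pip install psutil; win_service_iter is Windows-only, and the keyword list is just my guess at what's relevant):
[code]
import psutil  # third-party: pip install psutil

# Keywords aimed at the user-session services discussed above.
KEYWORDS = ("user", "profile", "session")

for svc in psutil.win_service_iter():
    try:
        name, display, status = svc.name(), svc.display_name(), svc.status()
    except psutil.Error:
        continue  # some services can't be queried without admin rights
    if any(k in (name + " " + display).lower() for k in KEYWORDS):
        print(f"{name:30} {status:10} {display}")
[/code]
Anything in that list showing as stopped or disabled after heavy service tweaking is a candidate to set back to its default start type.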
Also, just to be sure: the second monitor IS powered on and set to the input your PC is connected to, right? If the primary display is assigned to a monitor that's powered off, you simply won't see it.