We may need some additional information to diagnose this issue, for example:
What kind of cable are you using to connect the second monitor to the system?
What happens if you use the driver provided by Gigabyte instead of our generic driver?
Have you tested the second display (the one you are having issues with) on its own, without the display that remains active all the time?
Is the latest BIOS installed on the motherboard?
Thanks for your help, Diego!
One monitor is connected via VGA cable, the other with DVI.
I have tried both your driver and the driver provided by Gigabyte. The most up-to-date driver on their website is the same version as the one provided with the motherboard.
The first monitor -- the one the display defaults to, killing the second monitor -- is the VGA. The DVI monitor works perfectly if I unplug the VGA from the mobo and reinstall the driver (I'm using the second monitor right now).
The BIOS is up to date minus one version, but the newest version is beta and there are strong recommendations on the Gigabyte page to *not* update the BIOS unless there is a clear system issue... I'm not sure this warrants that.
Okay, just to be clear: if you stop using the VGA display completely, switch exclusively to the display connected through DVI, and reinstall the video driver, the DVI display works.
Now, could you please confirm that the DVI display will continue working even after restarting the system, waking from sleep, or shutting down the computer (as long as the VGA display remains disconnected)?
EDIT: I thought the DVI monitor was working -- but it turned out I had *no* drivers installed.
If I have the Intel driver installed (the HD 4000 driver from the Intel site, or the driver from Gigabyte) and the VGA monitor unplugged but the DVI monitor plugged in, Windows will still only use the VGA output -- the DVI monitor will be completely dead.
What is the model of the VGA and the DVI displays being used?
What type of cable are you using to connect them to the system? Is there any type of adapter or converter in between?
By any chance, could you try another hard drive and do a fresh operating system installation, using only the DVI display throughout the installation? Install the drivers that come with your Gigabyte motherboard (not the ones on the website, but the ones on the setup DVD/CD that ships with the board), and do not use the VGA display at all. That would isolate the possibility of corruption in your current operating system install, while leaving VGA completely out of the equation.
The VGA is a HannsG HZ201D, running on a VGA cable with a VGA extension cable plugged into it.
The DVI is a Gateway LP2207, running on a straight DVI cable.
I'm not super tempted to try reinstalling Windows after several days of setting it up; my other drives have GPT partition tables so I'd have to reformat them entirely to use them to set Windows up again. I suppose I could try repartitioning my current drive for a fresh install, though. This is the sort of thing that puts me well into the yellow zone of comfort, if not red. Anything else I could try first?
(Edit) Putting it another way, spending $50 on a cheap PCI-e video card might be a more effective use of my time than spending 8+ hours messing around with partitions and Windows.
In that case, using another partition would be a good option if there is no spare drive to test with. The reason I ask is that this is completely different from the black-screen issues that have been reported; those have actually been reported for HDMI, not DVI.
One thing you may test would be to check what you see in the Intel(R) Graphics Media Accelerator control panel when both displays are connected, and then try to enable multi-monitor mode as shown in these articles:
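As a quick cross-check independent of the Intel control panel, you could also ask Windows itself what it has detected. This is only a sketch, assuming Windows 7's built-in tools (DisplaySwitch.exe and the WmiMonitorID WMI class); run it from a PowerShell prompt with both displays connected:

```shell
# Ask Windows to extend the desktop across all attached displays
# (DisplaySwitch.exe ships with Windows 7 and later)
DisplaySwitch.exe /extend

# List the monitors Windows has enumerated, decoding the
# EDID-reported name of each one (zero bytes are padding)
Get-WmiObject WmiMonitorID -Namespace root\wmi | ForEach-Object {
    ($_.UserFriendlyName -ne 0 | ForEach-Object { [char]$_ }) -join ''
}
```

If the DVI monitor does not appear in that list at all, the problem is below the driver level (detection), rather than a multi-monitor setting.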
I've run into the same problem (surprised to have found it so similar):
My previously main monitor is connected to my main graphics card via DVI.
My second monitor is connected to my integrated intel graphics adapter.
I had never used integrated graphics before, but since my graphics card does not support two monitors (DVI and/or VGA), I wanted to use the integrated one.
The only way I got the integrated one to work at all was to choose the corresponding BIOS option to run the integrated card FIRST (the option is called something like IGCP).
The funny thing is, while navigating the BIOS, both monitors were running fine. That is also the only reason I keep trying to make it work... it almost worked! Well... it did work! Just not in Windows. The problem arose when Windows started loading, and it didn't resolve afterwards. Installing drivers for the integrated card didn't help either.
My main monitor is never completely blank, though. It shows a white rectangular icon on the topmost line that looks like a DOS command-line cursor (although its width is about four times its height). The symbol is not blinking. Windows cannot detect the DVI monitor that I have used for years, until I plug the VGA one into the integrated chip!
I also have a Gigabyte GA-Z77X-UD3H, running BIOS F17. I have a Gigabyte graphics card and an i5 processor. The system is 64-bit Windows 7.