I have an Intel DZ68DB motherboard with built-in DVI, HDMI, and DisplayPort outputs. The CPU is an i7-2600K. The OS is Microsoft Windows 7 x64 Ultimate SP1.
The BIOS sets the primary display to DVI and the secondary display to HDMI.
For weeks, these two displays worked fine together.
One day, I powered on the system without turning on the HDMI monitor.
Ever since, I have been unable to make the system detect the presence of the HDMI display.
Yes, I tried Device Manager - disable the display driver - enable the display driver. It did not work.
Yes, I tried Control Panel - Display - Change display settings - Detect. It did not work.
Yes, I tried the Intel Graphics and Media Control Panel - Multiple Displays. It only detects one display.
Your advice is much appreciated.
Yes, I disconnected the DVI display and booted with only the HDMI display connected. Yes, the system boots, but the HDMI display stays black.
Intel HD Graphics 3000, Video BIOS 2111.0
Device Manager - Display Driver - from Intel - 2-14-2012. Driver version: 126.96.36.19953
Intel DZ68DB motherboard BIOS version DBZ6810H.68A.001.2011.0401.1616
I am not sure, but I seem to have the same problem with a Gigabyte motherboard (H61M-USB3-B3, revision 2.0) and an Intel Core i3-2100 processor, which has the HD 2000 graphics controller.
The VGA port works fine, but the HDMI and DVI ports have a bug: they work fine during the initial boot/BIOS screen and while the Windows logo is preparing to start Windows, but as soon as the Intel driver activates, it turns off the HDMI and DVI connectors.
I am using a TV as a monitor in my setup; it has an HDMI port and a VGA port. My workaround is that if I need to watch a Blu-ray on it (which requires HDMI or DVI), I use the VGA connection to disable/enable the HD 2000 adapter. When I do this, the Intel driver then picks up either the HDMI display or the DVI-connected display (I have a DVI-to-HDMI cable). And it keeps working as long as I don't put the machine to sleep or reboot.
But that's more of a hack. Intel engineers have reported that it is a known issue. Some seem to think certain displays do not perform the HDCP handshake (the content-protection negotiation) properly during boot. I never had this problem with my previous setup, which used an ATI Radeon HD 2600 video card.
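For anyone who wants to script that disable/enable cycle instead of clicking through Device Manager each time, here is a rough sketch using Microsoft's devcon.exe utility (shipped with the Windows Driver Kit, not installed by default). The hardware ID patterns below are assumptions for the desktop HD 2000/3000 adapters; run the find command first and substitute whatever ID your system actually reports. Requires an elevated (Administrator) prompt.

```shell
:: Hypothetical sketch -- adjust the hardware ID to match your adapter.
:: List display-class devices to find the Intel adapter's hardware ID:
devcon find =Display

:: Disable, then re-enable, the Intel HD Graphics adapter.
:: "PCI\VEN_8086*" matches Intel PCI devices; narrow it to the ID
:: reported by the find command above before running these.
devcon disable "PCI\VEN_8086&DEV_0112*"
devcon enable  "PCI\VEN_8086&DEV_0112*"
```

On re-enable the driver re-probes its outputs, which is the same effect as the Device Manager disable/enable trick described above. This is a command fragment that needs admin rights and real hardware, so treat it as a starting point rather than a tested script.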
After 38 days, I finally received a little device from China, which I purchased because my new Blu-ray player also wouldn't talk to the TV via HDMI.
The device converts HDMI to a VGA output along with stereo audio, and it supports the HDCP 1.2 protocol.
And what I found is that the HDMI port stays active when booting Windows through this device. The only problem is that the device is limited to input resolutions of 640x480, 1280x720, and 1920x1080. So the BIOS screen shows fine, and if I configure Windows to use 1280x720, it displays via HDMI (though the VGA signal is slightly overscanned, so the taskbar falls below the screen). But the test proves that the HDMI port works fine with an HDCP 1.2-compatible display.
My TV obviously doesn't support the latest HDMI handshake, so that's why the Intel driver is turning it off. The VGA output through the converter is not as good as a straight VGA cable, and definitely not as good as plugging the HDMI cable straight into the TV's HDMI port.
So if Intel can support older HDMI handshakes in their drivers, our HDMI issues should go away.
After some more testing with the HDMI-to-VGA converter on an older VGA LCD screen, I have found that the converter works with more input resolutions than just the three I mentioned.
It works with all input resolutions from 640x480 to 1920x1080. So it's my TV that only accepts the 640x480, 1280x720, and 1920x1080i resolutions from the converter.
The LCD display seems much happier with the output from the converter. Once more I tested both the HDMI and DVI outputs, and both work fine with the Intel driver, thanks to the HDCP 1.2 support of the converter.