I should add that the UEFI BIOS on my motherboard includes a module that graphically shows what's connected to each port on the motherboard, and it DOES detect that there is a cable connected to the other graphics ports. I suspect it must be some sort of compatibility issue between the monitor and either the graphics driver or the graphics hardware, and yet it doesn't even work at the BIOS level, which I thought would default to a simple mode, and the monitor has never had a problem with other systems.
My frantic googling today found a few references to people who thought that Intel's implementation of HDCP in the HD Graphics chipset may be flawed, or at least finicky when interacting with devices having quirky implementations. This makes sense to me, considering that HDCP is the one "feature" in HDMI whose entire purpose is to make the product fail if something goes wrong with it.
I might have to just return this monitor and get a new one... I've hooked this laptop up to a couple of different televisions with no problem. If the issue does turn out to be HDCP, I'm definitely making a donation to Defective By Design after this BS.
The issues that have been reported usually show up once the operating system has booted completely and the driver has been loaded. However, as you mentioned, this happens even at the BIOS level.
Still, the HDMI and DVI outputs may not be functional at the BIOS level, depending on the settings made by the motherboard or computer manufacturer. However, according to the issue descriptions provided by other users, the ports should work once the operating system is running without drivers or in Safe Mode.
In this case, try another video display if possible, and use a high-quality cable with your system.
I've tried with two DVI cables, and an HDMI-to-DVI cable.
I've also tried disabling the graphics driver in Device Manager with no results, other than the VGA display switching to a different resolution. I haven't tried in safe mode yet.
I don't have another computer monitor to try. Before returning the first motherboard for exchange, I did briefly try an LCD TV with an HDMI input. I got no signal there either, although I didn't think to set the output to a compatible resolution beforehand, and I'm not sure whether the TV can adapt to non-video resolutions. I would have thought the TV would've identified its capabilities via the HDMI cable, however.
I'm not sure what to make of the suggestion that the cause may be an HDCP handshake issue. I would hope that HDCP wouldn't come into play unless I were actually attempting to play protected content, which I have never done on this computer, but I don't know whether that's actually the case.
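For what it's worth, the capability exchange you'd expect over the HDMI cable is EDID: the monitor exposes a 128-byte block over the DDC lines, and the source reads it to learn what the sink supports. If a source reads a corrupt or missing EDID, some implementations refuse to drive the link at all, which would match the "no signal even in BIOS" symptom. A minimal sketch of the two sanity checks a source can apply to that block (the sample bytes here are hypothetical, not from any real monitor):

```python
# EDID base-block sanity check: the first 8 bytes are a fixed header,
# and all 128 bytes must sum to 0 mod 256 (byte 127 is a checksum).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    if len(block) != 128:
        return False                      # base EDID is always 128 bytes
    if block[:8] != EDID_HEADER:
        return False                      # fixed magic header missing
    return sum(block) % 256 == 0          # checksum must balance the block

# Build a hypothetical, mostly-zero EDID block and fix up its checksum.
block = bytearray(128)
block[:8] = EDID_HEADER
block[127] = (-sum(block)) % 256

print(edid_is_valid(bytes(block)))  # a well-formed block passes
```

A source that applies checks like these and gets garbage back (bad cable contact on the DDC pins, quirky monitor firmware) can plausibly behave exactly as described: no picture at any stage, while the same monitor works elsewhere.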
I'm sorry for digging up such an old thread but I seem to be experiencing the same problem. I have an Asus motherboard with an HDMI and DisplayPort output and a monitor with DVI and VGA inputs. If I use a known working HDMI to DVI cable to connect the monitor, the monitor never comes out of standby, as though it's not receiving any signal. This is the same during boot up and in Windows. So I can't even get into the BIOS. I have to use a discrete graphics card to be able to see anything. Using that I have ensured that the integrated graphics are enabled.
However if I connect the monitor using an HDMI to VGA adapter I do get a picture. So the integrated graphics are working but for some reason the HDMI output doesn't seem to like my monitor's DVI input. Unfortunately the picture quality using the VGA input is very fuzzy. I know the DVI input on the monitor works fine.
I've tried using the DisplayPort output with an HDMI adapter and that doesn't work either.
So my question is: does anyone know why the integrated graphics aren't working properly with some monitors with DVI inputs?
I'm inclined to believe this could be an HDCP issue as I don't think my monitor is HDCP compliant, but if that's the case I don't understand why HDCP is even being expected all the time. Other graphics cards don't expect this and my monitor works with every other system I've tried.
Asus Maximus VIII Ranger motherboard (Z170)
Intel i7 6700K
Viewsonic VG2039wm monitor
Belkin HDMI to DVI-D cable
I've been doing some more searching on this forum and it seems that lots of people have had problems with Intel integrated graphics not working properly with monitors and there doesn't seem to be a definitive fix.
I've tried disabling and re-enabling the Intel graphics adapter in Device Manager, and upon re-enabling it Windows 10 blue-screens. Nice.
I've tried uninstalling the driver, uninstalling the adapter in device manager and deleting the driver and then re-booting Windows to allow Windows update to install the drivers. That didn't help.
I've also tried using the latest non-beta drivers from the Intel website. The current beta driver doesn't mention anything about including a fix for this issue so I doubt that will help.
Seeing as I can't even get anything to display on the monitor before Windows boots, it seems to me that the Intel integrated graphics adapter is unable to detect my monitor, despite me using equipment and cables that work perfectly with other graphics cards. So my opinion is that Intel's integrated graphics solution doesn't comply with normal HDMI/DVI standards. The HDMI socket on my motherboard should be 100% compatible with any monitor equipped with a DVI-D input up to 1920x1200 using the good quality single-link HDMI to DVI cable I have. It should also work first time, without any messing around!
I would like a response from Intel on this as at the moment I have a £1000 PC that's unusable without the use of an additional graphics card!
Also, I've noticed this product that may help the situation: EDID Emulator Adapter HDMI with Preset EDID | LINDY UK
However I don't see why I should pay £28 for additional equipment to fix this issue. Perhaps Intel would like to send me one? Or maybe an Nvidia graphics card so I can ditch the integrated graphics all together?
Some users have noted that HDMI and DVI outputs do not work, even in BIOS. We recommend using a straight connection from the PC to the monitor to avoid any interference.
At this point you will need to contact your motherboard manufacturer, and tell them what is happening with your system.
I suggest you push your respective OEMs to test each issue and particular hardware combination, as we believe it could be a compatibility issue with certain monitors. Please bear in mind that once the systems are integrated with our graphics controllers, it falls to the OEMs to work on issues like this one.
Also, as some users mentioned, HDCP works by disabling video when compliance fails, and only for protected content. It does not apply at the BIOS level.
Thanks, I'll try getting in touch with Asus. They won't be able to test this properly though, as they would need a monitor the same as mine and they're not sold anymore.
I am using a straight connection already. It's a purpose-made HDMI to DVI cable. I can't use an HDMI to HDMI cable as my monitor doesn't have an HDMI input. I shouldn't need a monitor with an HDMI input anyway, as computer HDMI outputs are supposed to be compatible with monitor DVI inputs. So the fact that this doesn't work proves that the HDMI output from the Intel HD 530 on my motherboard isn't properly compliant. Or at least it works differently to every other chipset I've used, and not in a good way.
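You're right that HDMI sources are specified to drive DVI-D sinks through a passive cable, since both carry single-link TMDS. One thing worth ruling out on any HDMI-to-DVI setup is pixel clock: single-link DVI tops out at a 165 MHz pixel clock, so if the source ever tries a mode above that, the link fails. A rough sanity check using well-known pixel clocks for common 60 Hz modes (figures taken from published VESA/CEA standard timings):

```python
# Single-link DVI/TMDS carries at most a 165 MHz pixel clock.
SINGLE_LINK_LIMIT_MHZ = 165.0

# Pixel clocks (MHz) for common 60 Hz modes, per standard timings:
MODE_CLOCKS_MHZ = {
    "1280x1024@60": 108.0,           # VESA DMT
    "1920x1080@60": 148.5,           # CEA-861
    "1920x1200@60 (CVT-RB)": 154.0,  # reduced blanking
    "2560x1440@60": 241.5,           # beyond single-link DVI
}

for mode, clock in MODE_CLOCKS_MHZ.items():
    verdict = "fits" if clock <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{mode}: {clock} MHz -> {verdict} single-link DVI")
```

So 1920x1200 with reduced blanking does fit within single-link, which supports your point: the cable and monitor ought to be fine, and the failure to even read the display at boot points elsewhere.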
Please try the latest driver for this graphics controller; you can download it here:
Ignore the yellow “A newer version of this software is available. Click here to get the latest version of this software.”
If this does not work, the monitor is most likely incompatible with this system, or the Belkin HDMI to DVI video cable that you're using is not supported.
Hi Ivanu, I've just tried installing the driver you linked to and it hasn't helped.
It did install a link to the Intel HD Graphics Control Panel on my desktop though and when I double click on that I get this error:
This is starting to become quite ridiculous now. Can you please explain to me how a perfectly good monitor and cable that works perfectly well with lots of other systems somehow aren't compatible with the Intel graphics chipset? All normal monitors should be compatible with all normal graphics chipsets. I've used monitors with more desktops, laptops and servers than I can remember over the last 20+ years and this is the first time I've ever had a situation where one doesn't work with the other. A quick search through this forum and elsewhere online shows many people having this same problem with the Intel graphics but not with other manufacturers. My monitor and cable are fine. It's the Intel graphics chipset that isn't compatible. Can Intel please fix this!
Try uninstalling the driver and then reinstalling it; also run Windows Update and make sure you have all the drivers installed from your system manufacturer. If this does not work, then uninstall the driver and install the graphics driver from your system manufacturer instead. If none of this works for your monitor, there is no other possible solution I can give you.