I'm facing an issue with a Dell PC running Vista, using an Intel 965 Express chipset. This PC has a DVI output interface. I connected it to a flat-screen TV that has only a VGA input interface, so between the two there is a DVI-to-VGA cable.
If I run the PC in debug mode, I get a display (no driver loaded).
As soon as Vista loads the drivers, it detects the screen. I don't know what resolution it picks, but I lose the display and the TV shows an error: "Signal out of range".
I've tested the PC on a TFT screen and it works like a charm over a straight DVI-to-DVI link.
My question is: is it possible to disable the automatic detection and force the PC's DVI output to stay at a fixed resolution, 1024*768 for example?
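In case it's useful to anyone answering: I understand one *could* in principle force a mode programmatically through the Win32 `ChangeDisplaySettingsW` API (user32.dll). The sketch below is only an illustration of that idea, not something I've verified on this machine; the `DEVMODEW` layout is truncated to the display-related fields, and `CDS_TEST` is used so the mode is validated rather than actually applied.

```python
import ctypes
import sys

# dmFields flags from wingdi.h saying which DEVMODE members are set.
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
# CDS_TEST asks the driver to validate the mode without switching to it.
CDS_TEST = 0x00000002

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW: only up to the display fields we need.
    # dmSize tells the API how large our copy of the structure is.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

def make_mode(width, height):
    """Build a DEVMODEW requesting only a fixed width/height."""
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
    dm.dmPelsWidth = width
    dm.dmPelsHeight = height
    return dm

if __name__ == "__main__" and sys.platform == "win32":
    dm = make_mode(1024, 768)
    result = ctypes.windll.user32.ChangeDisplaySettingsW(
        ctypes.byref(dm), CDS_TEST
    )
    # DISP_CHANGE_SUCCESSFUL is 0; anything else means the driver
    # rejected the requested mode.
    print("mode accepted" if result == 0 else "mode rejected")
```

To actually switch modes one would pass `0` instead of `CDS_TEST` in the second argument, but whether the Intel driver then keeps the mode across its own display detection is exactly what I'm unsure about, hence the question.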