After connecting my new Samsung TFT monitor to my PC via an HDMI cable, the outer approx. 20 pixels on every edge are cut off, i.e. outside the visible area of the screen.
This is clearly a sign of an overscan issue.
I am using Windows 7 x64 with a Core i5 and its integrated Intel graphics (H67 chipset).
Since the video output comes from the Intel CPU, its graphics driver seems to be the culprit.
How EXACTLY can I fix this overscan?
Is there an option for this in the Windows 7 Device Manager or in the Intel graphics driver?
I have a similar issue with a Toshiba P740 laptop with Intel HD 3000 graphics. When it is connected to an HP 2509m 25" monitor, all edges are off the screen (overscan). When you right-click the desktop and select Graphics Properties / Advanced Mode, the Intel control panel incorrectly reports my monitor as a digital TV. If I deselect "Maintain Display Scaling" and choose "Custom Aspect Ratio", I can resize the borders to make the picture fit. This setting also survives reboots.
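To make the "resize the borders" step concrete, here is a minimal sketch of the arithmetic the driver is effectively doing when you underscan. The 20 px figure comes from the original question; the 1920x1080 resolution is an assumption for illustration, as is the helper name `underscan_scale`.

```python
def underscan_scale(width, height, cut_px):
    """Return the horizontal and vertical scale factors needed so the
    full frame fits when cut_px pixels are lost on each edge."""
    visible_w = width - 2 * cut_px   # pixels actually shown horizontally
    visible_h = height - 2 * cut_px  # pixels actually shown vertically
    return visible_w / width, visible_h / height

# If ~20 px per edge are cut from a 1920x1080 frame, the output must be
# shrunk to roughly these fractions of its original size to fit:
sx, sy = underscan_scale(1920, 1080, 20)
print(f"scale to ~{sx:.1%} x {sy:.1%}")  # roughly 97.9% x 96.3%
```

This is why the custom-aspect-ratio sliders fix the clipping but soften the image: the frame is being rescaled to a non-native size, which is also part of why the result looks blurry.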
Now, the main problem with this is that the graphics then look horrible: fuzzy, blurry, etc. I tried everything as far as drivers go. I finally chatted with an Intel tech via the chat support on this site, and they knew instantly what I was talking about. Basically they told me: yes, we know, we are working on an updated driver, and currently there is NO fix and NO ETA on when one will come, if ever. :(
What I ended up doing is connecting my Toshiba to the monitor with an old-school VGA cable, and it looks perfect: very crisp, at full 1080p resolution. The problem is that I cannot use the upconversion software for DVDs or play BD movies, since that requires a protected HDMI path. Fortunately, I mainly use my BD drive to write BD data discs. But I'm not happy, because I have used this monitor via HDMI with other laptops with ATI and Nvidia graphics without any problem.
Given that this is a known issue affecting many monitors, and that it is a showstopper for protected DVD and BD HD sources, Intel needs to fix this ASAP. This is totally unacceptable.
I found a workaround for this issue. Basically, I took one of those DVI-to-HDMI converters and plugged the cable into the DVI port of my monitor instead of the HDMI port. So I have an HDMI cable coming out of the laptop that is converted to DVI at the monitor. The Intel HD graphics no longer sees the display as a digital TV, and no longer runs at 59 Hz with the overscan issue.
For those of you who have the same problem and an unused DVI port on your monitor, there are several ways to adapt this without spending much money; plenty of cables and converters are available online. I cannot say this will work on every monitor, but it did on mine. I have an HP 2709m, a 27-inch LCD, and I have seen others report the same problem with the HP 2509m, so I would guess it works on that one too. The one catch is that if you only have one DVI port and already need it for another device, you will then have to convert that device's DVI output to HDMI instead; unless it also uses the same Intel HD graphics, it may work fine. Whether that direction would also avoid the overscanning, I don't know.
Anyway, I thought this workaround was worth sharing.
You can fix this in the Intel driver settings: look for the "Quantization" setting. It defaults to "Limited", which clips part of the screen area on some monitors (all of mine included). Set it to either "Default" or "Full" instead, as you prefer.
Sadly, the setting does NOT survive a reboot... which sucks badly. But hey, it's an Intel driver. =(
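For background on what "Limited" vs "Full" quantization actually means: per the standard video-levels convention (BT.601/BT.709), limited range maps black and white to 8-bit values 16 and 235 instead of 0 and 255, because TVs expect limited range while PC monitors expect full range. A mismatch makes the picture look washed out or crushed; it is a separate symptom from overscan, but it is controlled by the same "treat this display as a TV" logic in the driver. A quick sketch of the two mappings:

```python
def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) to limited range (16-235)."""
    return 16 + round(v * 219 / 255)

def limited_to_full(v):
    """Map a limited-range 8-bit value (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))   # prints: 16 235
print(limited_to_full(16), limited_to_full(235))  # prints: 0 255
```

If the PC sends limited range but the monitor interprets it as full range, black becomes level 16 (dark gray) and white level 235 (dull white), which matches the "fuzzy/washed-out" complaints earlier in this thread.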