
I recently bought a new LCD monitor. It has an HDMI input but no DVI, so I've used the included DVI-to-HDMI adapter to connect it to my video card. I've found that the ATI drivers apply an 8% underscan by default to HDMI outputs, which is how the output is detected when the adapter is used. The result is blurry output and an unused border around the screen.

I already know I can change this through Catalyst Control Center (CCC). The problem is that even when a game uses the same resolution and refresh rate, the setting resets, and I have to Alt-Tab back to Windows, open CCC, and set the underscan back to 0%.

Is there a registry setting I can change so that it permanently defaults to 0% underscan whenever the video mode changes?

nedruod
  • possible duplicate of [Change Overscan/Underscan settings without Catalyst Control Center](http://superuser.com/questions/458321/change-overscan-underscan-settings-without-catalyst-control-center) – Doktoro Reichard Jul 25 '14 at 21:55

2 Answers


I had this exact problem, but with overscan instead of underscan. Changing the DVI-to-HDMI adapter fixed it, as the adapter turned out to be the cause. (I bought a Belkin adapter and it works for me now.)


I know this question was posted a long time ago, but maybe someone will still find my answer useful! ;-) I found this solution elsewhere on this forum, and it worked great on my Windows 7 PC with an AMD HD 7750 graphics card:

Go to the following key in the registry:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{####....}\0000

Create a new DWORD value:

"DigitalHDTVDefaultUnderscan" = DWORD 0x00000000


Note: there might be several {####....} subkeys; use the one that contains most of the ATI settings.
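
If you'd rather not create the value by hand in regedit, the same thing can be done from an elevated Command Prompt with the standard reg add command (its /v, /t, /d and /f switches are built into Windows); this is just a sketch, and you'd still need to substitute the actual {####....} GUID subkey from your own machine:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Video\{####....}\0000" /v DigitalHDTVDefaultUnderscan /t REG_DWORD /d 0 /f

I'd expect a reboot (or at least re-applying the display mode) to be needed before the driver picks up the new value.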

I had the problem when connecting my PC to my Panasonic TV, but I think this happens with AMD/ATI graphics cards connected via HDMI whether the display is a TV or an LCD monitor...

Jippi70