I recently bought a new LCD monitor. It has an HDMI input but no DVI, so I've used the included DVI-to-HDMI adapter to connect it to my video card. I've found that the ATI drivers apply an 8% underscan by default to HDMI outputs, which is how the output is recognized when the adapter is used. That results in both a blurry image and an unused black border around the screen.
I already know I can change this through Catalyst Control Center, but the problem is that even when games use the same resolution and refresh rate as the desktop, the setting resets on every mode change, and I have to Alt-Tab back into Windows, open CCC, and set the underscan back to 0%.
Is there a registry setting that would make the driver default to 0% underscan permanently, so it survives video mode changes?
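I'd guess it would live under the display adapter's class key and look something like the fragment below. To be clear, the value name `DigitalHDTVDefaultUnderscan` and the `0000` subkey index are just my guesses based on forum posts; the subkey index in particular varies per machine, so this is only a sketch of what I'm hoping exists:

```
Windows Registry Editor Version 5.00

; Hypothetical: display class GUID key, adapter subkey index varies (0000, 0001, ...)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; Guessed value name -- intent is "default underscan = disabled"
"DigitalHDTVDefaultUnderscan"=dword:00000000
```

If someone knows the actual value name the ATI driver reads, that's what I'm after.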