HDMI is a digital audio/video interface that allows you to connect a computer to an HDTV, monitor, or projector; it is also widely used to connect devices like set-top boxes and game consoles to a television. VGA is an older, analog video interface that allows you to connect a monitor or TV to a computer.
With all the great hardware we have available these days, it seems we should be enjoying great quality viewing no matter what, but what if that is not the case? Today’s SuperUser Q&A post seeks to clear things up for a confused reader.
Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
Photo courtesy of lge (Flickr).
The Question
SuperUser reader alkamid wants to know why there is a noticeable difference in quality between HDMI-DVI and VGA:
Why is there a difference in quality between the two?
The difference is difficult to capture with a camera, but see my attempt below. I tried playing with the brightness, contrast, and sharpness settings, but I cannot get the same image quality. The resolution is 1920×1080 with Ubuntu 14.04 as my operating system.
VGA:
HDMI:
Why is the quality different? Is it intrinsic to these standards? Could I have a faulty VGA cable or mDP-VGA adaptor?
The Answer
SuperUser contributors Mate Juhasz, youngwt, and Jarrod Christman have the answer for us. First up, Mate Juhasz:
Followed by the answer from youngwt:
Some further reading: HDMI vs. DisplayPort vs. DVI vs. VGA
With our final answer from Jarrod Christman:
The first reason has already been stated: VGA is analog, so the signal has to go through an analog-to-digital conversion inside the monitor, which will theoretically degrade image quality.
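To make that point concrete, here is a rough, purely illustrative sketch in Python. The noise and blur figures are made up rather than measurements of any real cable; the idea is simply that a digital link delivers pixel values unchanged, while an analog round trip softens a sharp edge.

```python
# Illustrative sketch (not a model of any real hardware): why a
# digital -> analog -> digital round trip can soften an image,
# while a purely digital link (DVI/HDMI) passes pixels through bit-exact.
import numpy as np

rng = np.random.default_rng(0)

# One row of 8-bit pixels with a sharp black/white edge (e.g. text on a page).
row = np.array([0] * 8 + [255] * 8, dtype=np.uint8)

# Digital link: the pixel values arrive unchanged.
digital_out = row.copy()

# Analog link (VGA): the values become a voltage, pick up some band-limiting
# and noise in the cable, then get re-sampled by the monitor's ADC.
voltage = row.astype(float) / 255.0                        # DAC in the graphics card
voltage = np.convolve(voltage, [0.15, 0.7, 0.15], "same")  # limited cable bandwidth blurs the edge
voltage += rng.normal(0, 0.01, voltage.shape)              # cable/connector noise
analog_out = np.clip(np.round(voltage * 255), 0, 255).astype(np.uint8)  # monitor ADC

print("sent:    ", row)
print("digital: ", digital_out)  # identical to what was sent
print("via VGA: ", analog_out)   # edge is smeared and values jitter
```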
Second, assuming that you are using Windows, there is a technique called ClearType (developed by Microsoft) which improves the appearance of text by manipulating the sub-pixels of an LCD monitor. VGA was developed with CRT monitors in mind, where the notion of a sub-pixel is not the same. Because ClearType requires an LCD screen, and because the VGA standard does not tell the host anything about the display, ClearType would be disabled over a VGA connection.
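For illustration, here is a small conceptual sketch of the sub-pixel idea behind ClearType. It is not Microsoft's actual algorithm, and it ignores the color filtering a real implementation uses to avoid color fringes.

```python
# Conceptual sketch of sub-pixel rendering (the idea behind ClearType),
# not Microsoft's actual algorithm: on an LCD with an RGB stripe, the
# three sub-pixels of one pixel can each carry part of a glyph edge,
# roughly tripling horizontal detail compared with whole-pixel shading.
import numpy as np

# A glyph edge sampled at 3x horizontal resolution (1 = ink, 0 = background).
coverage_3x = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1], dtype=float)

# Whole-pixel anti-aliasing: average each group of three samples into one gray level.
grayscale = coverage_3x.reshape(-1, 3).mean(axis=1)

# Sub-pixel rendering: route each of the three samples to the R, G and B
# sub-pixels of the corresponding LCD pixel.
subpixel_rgb = coverage_3x.reshape(-1, 3)

print("grayscale AA :", grayscale)     # the edge becomes a gray blur "between" pixels
print("sub-pixel RGB:", subpixel_rgb)  # the edge position is preserved within a pixel
```

The trick only pays off if the renderer knows it is driving an LCD with a known sub-pixel layout, which is exactly the information a bare VGA connection does not provide.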
I remember hearing about ClearType from one of its creators on a podcast for This().Developers().Life() IIRC, but this Wikipedia article also supports my theory. Also, HDMI is backward compatible with DVI, and DVI supports Extended Display Identification Data (EDID).
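As a concrete look at EDID in practice, the sketch below reads and partially decodes the EDID block that a connected monitor exposes on Linux (relevant here, since the asker is on Ubuntu). It assumes the usual sysfs path and the standard 128-byte EDID layout; treat the offsets as a sketch rather than a full parser.

```python
# Minimal sketch of reading a display's EDID on Linux. Over a digital
# connection the monitor reports this block to the host; the sysfs path
# and the byte offsets below assume the standard 128-byte EDID layout.
import glob

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        continue  # unconnected output or unreadable EDID

    # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
    raw = (edid[8] << 8) | edid[9]
    vendor = "".join(chr(((raw >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))

    # Native resolution from the first detailed timing descriptor (offset 54).
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)

    print(f"{path}: vendor {vendor}, native mode {h_active}x{v_active}")
```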
Clock and Phase
Adjust the monitor's clock and phase settings to get the best match to the source signal and the sharpest picture. However, since the signal is analog, these adjustments may drift over time, so ideally you should just use a digital signal.
Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.