DVI: An Interface for All Displays
 
A new standard offers the benefits of digital video while maintaining compatibility with analog monitors.
 
By Alfred Poor
February 18, 2000

Analog interfaces have become standard for desktop monitors, but the increasing popularity of LCD displays brings the need for a completely digital interface. Digital-only solutions cause problems as well, both in compatibility between graphics adapters and digital displays, and in their inability to support legacy analog displays. After a few false starts by different groups and consortiums, the Digital Display Working Group (DDWG) has created the Digital Visual Interface (DVI), which has received widespread acceptance. Chances are good that your next display adapter or monitor will support this new standard.

Digital interfaces for displays are not new; the original IBM PC monochrome and color adapters relied on them. The Color Graphics Adapter (CGA) used a parallel interface with an on/off signal for each of the three colors (red, green, and blue) plus a shared intensity signal that selected normal or bright output. Four binary signals can express just 16 colors. If this one-wire-per-bit approach were used for today's 24-bit color displays, you would need a thick cable containing more than 27 wires.
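
To make the arithmetic concrete, here is a rough Python sketch of the RGBI scheme; the channel levels are chosen for illustration and are not the actual CGA palette values.

    # Sketch of the CGA-style RGBI scheme: one on/off line per color plus a
    # shared intensity line, for 2**4 = 16 possible colors. Channel levels
    # below are illustrative, not taken from the CGA hardware.

    def rgbi_to_levels(r, g, b, intensity):
        """Map four digital lines to approximate 8-bit channel levels."""
        high = 255 if intensity else 170   # "bright" vs. "normal" for a lit color
        low = 85 if intensity else 0       # intensity lifts unlit channels slightly
        return tuple(high if on else low for on in (r, g, b))

    # Every combination of the four wires is one of 16 colors.
    palette = {(r, g, b, i): rgbi_to_levels(r, g, b, i)
               for r in (0, 1) for g in (0, 1) for b in (0, 1) for i in (0, 1)}
    print(len(palette))   # 16

    # Sending 24-bit color the same way needs a wire per bit, plus syncs and
    # a ground: 24 + 2 + 1 = 27 conductors at a minimum.
    print(24 + 2 + 1)     # 27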

This limitation moved designers to adopt analog interfaces for displays, starting with VGA. A VGA connection requires just nine wires: signal and ground for each of the three colors, the vertical and horizontal sync signals, and a main ground. Instead of the simple on/off signals of a digital interface, the voltage on each color wire determines that color's brightness level: a higher voltage produces a brighter color.
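
The analog idea reduces to a simple proportion. As a sketch (assuming VGA's nominal 0.7-volt full-scale video level), mapping a digital brightness value to a wire voltage looks like this:

    # Each color wire carries a voltage rather than a bit; brightness is
    # proportional to voltage. VGA video nominally swings from 0 V (black)
    # to 0.7 V (full brightness) on each color line.

    FULL_SCALE_V = 0.7

    def level_to_voltage(level, bits=8):
        """Map a digital brightness level to its analog line voltage."""
        return FULL_SCALE_V * level / (2**bits - 1)

    print(level_to_voltage(0))     # 0.0 V   -> black
    print(level_to_voltage(128))   # ~0.35 V -> middle gray
    print(level_to_voltage(255))   # 0.7 V   -> brightest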

But analog interfaces have their limitations, too. The digital information in the computer must be translated into analog signals before it can be sent to the display. This requires extra circuitry, and some of the information can be degraded in the conversion and transmission process, costing color bits along the way.
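
A toy round trip illustrates the risk; the 4-millivolt noise figure is invented purely for illustration:

    # An 8-bit level becomes a voltage, picks up a little noise on the way,
    # and is re-digitized at the display. With steps only ~2.7 mV apart on a
    # 0.7 V scale, even a few millivolts of noise flips low-order color bits.
    import random

    FULL_SCALE_V = 0.7

    def dac(level):
        return FULL_SCALE_V * level / 255

    def adc(volts):
        return max(0, min(255, round(volts / FULL_SCALE_V * 255)))

    random.seed(1)
    changed = sum(1 for level in range(256)
                  if adc(dac(level) + random.gauss(0, 0.004)) != level)
    print(f"{changed} of 256 levels arrived with a different value")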

Digital monitors, such as desktop LCDs, must convert the incoming analog signal back to digital form. This adds cost to the monitor, and more image data can be lost in the process. It is particularly difficult to map analog data onto specific pixels on the screen; if the pixel timing recovered from the sync signal is not perfect, you can get annoying pixel-jitter defects as interpolation causes the image to shift back and forth.
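
A small sketch shows why an imperfect sync lock reads as jitter; the scanline pattern and phase offsets here are made up for illustration:

    # The monitor must slice the continuous analog scanline back into
    # discrete pixels. If the recovered pixel clock is off in phase, each
    # sample lands between source pixels and neighboring values blend.

    source = [0, 255, 0, 255, 0, 255, 0, 255]   # alternating black/white pixels

    def resample(phase_error):
        """Sample the line with a constant phase offset, in pixel units."""
        out = []
        last = len(source) - 1
        for i in range(len(source)):
            t = i + phase_error
            left = source[min(int(t), last)]
            right = source[min(int(t) + 1, last)]
            frac = t - int(t)
            out.append(round((1 - frac) * left + frac * right))
        return out

    print(resample(0.0))   # perfect lock: the original pattern
    print(resample(0.5))   # half-pixel error: interior pixels smear to gray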

By contrast, a digital interface can take the digital data directly from the computer and display it pixel-for-pixel on a digital display. The result is a rock-steady image, and the screen receives every bit of the original color information.

But the problem remains: How do you get a digital interface without using dozens of wires? The answer is to switch from the parallel design of CGA to a high-speed serial design. The same technology advances that made fast serial connections such as USB and FireWire possible can be applied to a serial display interface.
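
A back-of-the-envelope calculation shows why serial can keep up. DVI's TMDS signaling carries 24-bit pixels over three data pairs plus a clock pair, encoding each 8-bit value as a 10-bit character, with a single link running at up to a 165-MHz pixel clock:

    # Back-of-the-envelope bandwidth for a single DVI link.
    pixel_clock_hz = 165_000_000    # single-link maximum pixel clock
    channels = 3                    # one TMDS pair per color
    bits_per_pixel_clock = 10       # 8 data bits become a 10-bit character

    per_pair = pixel_clock_hz * bits_per_pixel_clock
    total = per_pair * channels
    print(f"{per_pair / 1e9:.2f} Gbps per pair")     # 1.65 Gbps
    print(f"{total / 1e9:.2f} Gbps across the link") # 4.95 Gbps

    # That is enough for 1600x1200 at 60 Hz, which (with blanking overhead)
    # needs a pixel clock of about 162 MHz.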

Laptop owners may not even realize that they're already using digital interfaces. Notebook manufacturers recognized long ago that eliminating the digital-to-analog and analog-to-digital conversion steps meant a reduction in component count, which in turn reduced costs. The problem is that the designs used inside laptops don't easily make the transition to the desktop's external monitor.