During one of my "dog and pony shows" last week, a question floated my way concerning the differences between the multiple DVI connectors out in the world. In answering this question - and cognizant of the fact that my deadline for this article was a mere three days away - I chose to make this month's Tech Tip a discussion of the realities of DVI.
Going Digital
The Digital Visual Interface, or DVI, was designed for the transmission of digital signals between computer graphics cards and display devices, such as LCD flat panels and projectors. The standard interface prior to DVI was VGA (or DB15/15 PIN), which was designed to work with analog displays such as CRT monitors. The DVI interface uses a protocol that addresses each pixel discretely. DVI's pixel-for-pixel transmission of data produces better image quality than the older analog VGA standard. In the older analog formats, an individual pixel could be affected by the information sent to adjacent pixels.
Standard DVI signals are limited to 8 bits per color channel, compared to the 10-bit depth an analog signal can carry. However, DVI typically produces a more visually pleasing image because - apart from its pixel-for-pixel accuracy - it avoids interference from electrical noise (the 60-cycle hum) and other analog distortions, such as ringing.
The DVI format supports both digital and analog signals on the same connector. This allows graphics cards to output either type of signal, depending on the type of monitor or display device detected (or not detected) on the other end of the cable. Typically, if the source does not recognize a digital display device, it will output an analog signal.
Making the Handshake
DVI also supports the exchange of Extended Display Identification Data (EDID) between a computer graphics card and the display device. The EDID information provides a "handshake" between the source card and the display.
Many DVI graphics cards won't output video unless there is a DVI display device attached. For professional systems, most routers, switchers and projectors provide a means to send this EDID information back to the DVI source, and thus give the graphics card the handshake required to start sending digital video through the DVI cable. That EDID information will tell the source device, such as a PC, to output a digital video signal at a specific resolution. That specific resolution may be the native resolution of the display device or a pre-selected resolution from a switcher.
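For the curious, here is a minimal sketch of what that EDID handshake carries. EDID is a 128-byte block with a fixed 8-byte header, and the display's preferred resolution lives in the first detailed timing descriptor starting at byte 54. The Python below is illustrative only - the function name is made up, and how you obtain the raw bytes depends on your platform:

    # Minimal sketch: reading the preferred resolution out of a raw
    # 128-byte EDID block, as reported by the display over the DVI link.
    # How you obtain the bytes is platform-specific (on Linux, for
    # example, raw EDID dumps appear under /sys/class/drm).

    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def preferred_resolution(edid: bytes) -> tuple:
        """Return (width, height) from the first detailed timing descriptor."""
        if len(edid) < 128 or edid[:8] != EDID_HEADER:
            raise ValueError("not a valid EDID block")
        dtd = edid[54:72]  # first 18-byte detailed timing descriptor
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)  # low 8 bits + high 4 bits
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        return (h_active, v_active)

A graphics card that reads, say, (1920, 1080) from this descriptor knows the display's native resolution and can boot up outputting exactly that.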
DVI also supports another handshake with display devices - HDCP, or High-bandwidth Digital Content Protection. As discussed in an earlier Tech Tip, HDCP is the copy protection utilized on Blu-ray, PS3 and many other consumer products that support HD digital video outputs. If the source device is using either a DVI or HDMI connector, and the content you are playing is copy protected (as are movies and many games), your display device will also need to be HDCP compliant. If an HDCP compliant display device is not detected, the source device will not output video from the DVI or HDMI connector.
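The rule the source applies boils down to a simple gate. The sketch below is purely illustrative - these names are hypothetical, and real HDCP involves a cryptographic key exchange between source and sink, not a boolean flag:

    # Illustrative only: the gating rule a DVI/HDMI source applies.
    def source_will_output(content_protected: bool, sink_hdcp_ok: bool) -> bool:
        if content_protected and not sink_hdcp_ok:
            return False  # source blanks its DVI/HDMI output
        return True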
In all cases, when setting up a system, you will need to have the display device and any switchers and routers in line powered on so that the DVI graphics card can make the connection. You may have to set the EDID and, if needed, the HDCP information on the switcher manually. If so, leave the computer off until those settings are made, to ensure that the card boots up with the correct configuration and outputs the correct signal resolution. Refer to the operator's manual for any DVI component should you have questions on system settings.
Get the Connection
There are three basic connector types on a DVI cable: DVI-D, DVI-A and DVI-I. The DVI-D connector is the most prevalent. It supports only digital signals. The easy way to spot this cable is the lack of the four pins that surround the horizontal blade. The DVI-D connector is the digital input plug on most LCD flat panel displays.
The DVI-A connector is rarely used in the presentation A/V market. It supports only the analog outputs from the DVI connector. In essence, it turns the DVI cable into a bigger VGA cable. The DVI-A cable has four pins surrounding the horizontal blade. These pins carry the analog video information (RGB and horizontal sync).
The DVI-I cable combines both digital and analog signals on the same cable. Most graphics cards use this connector. The "I" in DVI-I stands for "integrated." Like the DVI-A, the DVI-I connector also has four pins around the horizontal blade. When a DVI-I equipped card boots up, it will sense whether a DVI device is connected and then output a digital signal. Otherwise, it will only output analog signals - a very common issue on Mac computers.
In addition to the digital, analog and integrated variants of DVI, there are two other possible configurations. Both the DVI-D and DVI-I connectors can support single-link or dual-link signals. The dual link - commonly referred to as DVI-DL - is used for high resolution displays, such as those beyond 1600x1200 or 1920x1080. The second link adds bandwidth to allow for the higher resolution sources. The dual-link cable has all the pins present. The single-link cable will not have the six pins in the center of the connector.
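To put a number on "adds bandwidth": a single DVI link tops out at a 165 MHz pixel clock. The back-of-envelope check below is a sketch only - the 12% blanking overhead is an assumed reduced-blanking figure, and real VESA timings vary - but it shows why those higher resolutions push you onto the second link:

    # Rough check for whether a given mode fits on a single DVI link.
    # Single-link TMDS is capped at a 165 MHz pixel clock; the 1.12
    # blanking factor is an assumption, so treat this as an estimate,
    # not a timing calculation.

    SINGLE_LINK_MAX_MHZ = 165.0

    def needs_dual_link(width: int, height: int, refresh_hz: float,
                        blanking_factor: float = 1.12) -> bool:
        pixel_clock_mhz = width * height * refresh_hz * blanking_factor / 1e6
        return pixel_clock_mhz > SINGLE_LINK_MAX_MHZ

    # 1920x1080 at 60 Hz fits on a single link; 2560x1600 at 60 Hz does not.
    print(needs_dual_link(1920, 1080, 60))   # False
    print(needs_dual_link(2560, 1600, 60))   # True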
We should also mention here the DVI-to-VGA (DB15/15 PIN) adapters. These adapters are either simple plugs or cables, and are designed to work with DVI-I outputs from computer graphics cards. All the adapter does is route the analog video information found on the four pins around the horizontal blade, along with the vertical sync pin, from the DVI connector format to the VGA connector format. If the analog pins are not active on the DVI connector, the adapter plug is worthless. Many DVI devices use the DVI-I dual-link connector because any of the DVI connector options will plug into it. Conversely, the DVI-D and DVI-A type connectors will only accept a like type of connector on the cable - they are not "universal."
Going the Distance
One of the limiting factors on DVI cable use is the distance the signal will travel before it becomes unusable. On a standard copper cable, the distance you can send the signal is limited to about 15 feet. You may be able to send it further if the resolution of the signal is 1280x1024 or lower. As the signal resolution increases - say HD at 1920x1080 - the bandwidth required to run it also increases and the distance you can send the signal decreases.
For longer distances, there are high performance copper cables that will send the signal up to about 50 feet with high resolution sources, as well as fiber optic cables that can run DVI sources up to 300 feet. As with any other technology, there are new DVI cable solutions every month; check with the usual suspects for the latest and greatest in DVI cable runs.
If we work with the concept that we are limited to 15 feet on standard cables, we also need to look at the total signal path. If you are running through a distribution splitter or router, you will want to note whether the DVI signals are "re-clocked." Re-clocking means that the DVI signal comes into the switcher and is sent out as a refreshed signal, with another 15 feet of distance available to reach the display device or the next signal booster in the line. If the signal is not re-clocked, you must keep the entire signal path to a maximum of 15 feet.
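One way to sanity-check a design is to walk the chain hop by hop. The sketch below is a toy model of the rule just described - the 15-foot limit is the standard-cable figure from above, and the hop format is made up for illustration:

    # Toy model of the distance rule above: each re-clocking device
    # resets the 15-foot budget; passive devices (plates, couplers) do not.
    # Each hop is (cable_feet, device_reclocks_signal).

    def path_within_budget(hops, max_run_ft: float = 15.0) -> bool:
        run_ft = 0.0
        for cable_ft, reclocks in hops:
            run_ft += cable_ft
            if run_ft > max_run_ft:
                return False
            if reclocks:
                run_ft = 0.0  # refreshed signal, budget starts over
        return True

    # 10 ft into a re-clocking switcher, then 12 ft to the display: fine.
    print(path_within_budget([(10, True), (12, False)]))   # True
    # 10 ft into a passive splitter, then 12 ft more: 22 ft total, too far.
    print(path_within_budget([(10, False), (12, False)]))  # False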
Change for the Sake of Change
Even as I write these words, there are a number of new connector formats coming to market. For example, we in the Pro A/V market are starting to deal with HDMI. And the next new format after that looks to be DisplayPort, which the graphics card manufacturers seem to favor.