DVI (Digital Visual Interface) is also a way of connecting a monitor to a graphics card. As its name indicates, it's a 100% digital interface, as opposed to good old analog VGA. It's mainly used for connecting flat panel displays but can also be used for traditional CRTs.
Digital you say?
Traditionally, the image exists in digital form inside your graphics card's memory. It then passes through the graphics card's RAMDAC, which converts it into an analogue signal, which travels down the cable to your monitor. So far so good. But if you're lucky enough to have one of those snazzy LCD screens, the signal then goes through an analogue-to-digital converter before being used to drive the pixels on your screen. As you can see, that doesn't make much sense. Hence the idea of a pure digital link from video card to monitor, one that would preserve the picture's integrity.
There are gains for CRTs too (though to a lesser extent). Analogue transmissions are error prone: simplifying grossly, if you send values ranging continuously between 0 and 1 down a wire and you receive 0.9823 you have no way of knowing whether the number sent was actually that or 0.98 or 0.99. On the other hand in a digital transmission there are only 2 clear cut possibilities. It's a perilous journey for an analogue signal travelling through dodgy connectors and cables but life is better for a digital signal.
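As a toy illustration of that point (nothing DVI-specific; the signal levels and noise magnitudes here are made up for the demo), assume logic levels of 0 and 1 and noise smaller than half the gap between them. The analogue value is permanently corrupted, while every digital bit snaps back to exactly what was sent:

```python
import random

def transmit_analog(value, noise):
    """An analogue wire delivers the value plus whatever noise it picked up."""
    return value + noise

def transmit_digital(bit, noise):
    """A digital receiver thresholds the noisy level back to 0 or 1."""
    return 1 if bit + noise > 0.5 else 0

random.seed(0)
original_bits = [random.randint(0, 1) for _ in range(1000)]

# The analogue receiver has no way to tell signal from noise...
noisy_value = transmit_analog(0.98, random.uniform(-0.2, 0.2))

# ...but as long as noise stays below the decision threshold,
# every digital bit is recovered perfectly.
recovered = [transmit_digital(b, random.uniform(-0.2, 0.2)) for b in original_bits]
assert recovered == original_bits
```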
The protocol used to transmit the data down the wires was invented by Silicon Image and is called transition minimised differential signalling (TMDS). The DVI spec says that at the very least there should be one "link" composed of three data channels (one for each colour component) and a clock channel. Each 8-bit colour component is encoded into 10 bits (the remaining 2 bits carry information on how the data is encoded). The interface can run at clock rates of up to 165 MHz, i.e. it can deliver 165 million pixels per second.
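To give a flavour of what "transition minimised" means, here is a rough Python sketch of just the first stage of TMDS encoding, which maps 8 bits to 9 by chaining XOR or XNOR, whichever yields fewer 0-to-1 and 1-to-0 transitions on the wire. The second, DC-balancing stage (which appends the 10th bit and may invert the data bits) is omitted here, so this is an illustration of the idea rather than a complete encoder:

```python
def tmds_stage1(byte):
    """Transition-minimising stage of TMDS: encode 8 bits into 9."""
    d = [(byte >> i) & 1 for i in range(8)]   # d[0] is the LSB, sent first
    ones = sum(d)
    q = [d[0]]
    if ones > 4 or (ones == 4 and d[0] == 0):
        for i in range(1, 8):                 # XNOR chain
            q.append(1 - (q[i - 1] ^ d[i]))
        q.append(0)                           # bit 8 = 0 flags "XNOR was used"
    else:
        for i in range(1, 8):                 # XOR chain
            q.append(q[i - 1] ^ d[i])
        q.append(1)                           # bit 8 = 1 flags "XOR was used"
    return q

def transitions(bits):
    return sum(a != b for a, b in zip(bits, bits[1:]))

# 0xAA (10101010) has 7 transitions raw; its encoded form has far fewer.
raw = [(0xAA >> i) & 1 for i in range(8)]
assert transitions(tmds_stage1(0xAA)) < transitions(raw)
```

Fewer transitions means less electromagnetic interference and easier recovery at the far end of the cable.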
A second link is optional; since it doubles the available bandwidth, it becomes necessary for higher resolutions and refresh rates. The two links share the same clock channel and must run at the same frequency.
The following table summarises what resolutions you'll be able to run with the two configurations:
Display                        Single link                Dual link
60 Hz LCD, 5% blanking rate    Up to 1920 x 1080 (HDTV)   Up to 2048 x 1536 (QXGA)
75 Hz CRT, 15% blanking rate   Up to 1280 x 1024 (SXGA)   Up to QXGA
85 Hz CRT, 15% blanking rate   Up to SXGA                 Up to HDTV
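These entries can be sanity-checked with a back-of-the-envelope pixel clock calculation. This is a simplified model that treats blanking as a flat percentage overhead on top of the active pixels, but it reproduces the table's single-link/dual-link split:

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking):
    """Approximate pixel clock: active pixels per frame, times the
    refresh rate, inflated by the blanking overhead."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330   # two links running at the same 165 MHz clock

# 1920 x 1080 at 60 Hz with 5% blanking fits a single link (~131 MHz)...
assert pixel_clock_mhz(1920, 1080, 60, 0.05) <= SINGLE_LINK_MHZ

# ...while QXGA at 60 Hz (~198 MHz) needs the second link.
assert pixel_clock_mhz(2048, 1536, 60, 0.05) > SINGLE_LINK_MHZ
assert pixel_clock_mhz(2048, 1536, 60, 0.05) <= DUAL_LINK_MHZ
```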
Connectors and Cables
There are 2 kinds of DVI connectors. The first, known as DVI-D, has 24 pins, which are all used to transmit the digital signal. The pins are arranged in a rectangular 8 x 3 grid. The second, known as DVI-I, has an extra 5 pins in a cross-hair arrangement to the right of the 24 pins. These pins carry an analog signal. The hope is that one day DVI will be the only interface needed, for digital and analog monitors. So far the vast majority of graphics cards use the DVI-I connector, as both DVI-D and DVI-I plugs can be connected to them (3 cheers to the Digital Display Working Group for this). In the case of a single link display, only 12 of the 24 pins are used.
DVI fully supports hot plugging of displays. DVI also supports the VESA Display Data Channel (DDC) and the Extended Display Identification Data (EDID) specifications, which basically allow the display to tell the graphics card it's connected to about its features and specifications. Another interesting feature, that is not used by current displays or cards, is the ability for the card to only send new data to the monitor when the individual pixels to be displayed change state.
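The data a display reports over DDC is a 128-byte EDID block with a fixed layout. As a rough illustration, here is a minimal Python sketch that decodes just the fixed header and the manufacturer ID from the start of such a block (the `fake_edid` bytes are a made-up example, not captured from real hardware):

```python
def parse_edid_header(edid):
    """Decode the fixed 8-byte header and the manufacturer ID
    from the start of a 128-byte EDID block."""
    # Every EDID block starts with this magic 8-byte pattern.
    assert edid[0:8] == bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    # Manufacturer ID: three 5-bit letters (A=1) packed into bytes 8-9.
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return ''.join(chr(ord('A') - 1 + l) for l in letters)

# A fabricated block: valid header, manufacturer "DEL", rest zeroed.
fake_edid = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00,
                   0x10, 0xAC]) + bytes(118)
assert parse_edid_header(fake_edid) == 'DEL'
```

Real EDID blocks go on to describe supported timings, physical size, gamma and more, which is how plug-and-play monitor detection works.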
Because DVI is a high quality digital interface, people like the MPAA are worried that it could be used to produce pirated copies of DVDs. All the effort put into copy protection at the level of the DVD drive and player is wasted if people are able to tap into the high quality feed coming over the DVI link. Thus was born High-bandwidth Digital Content Protection (HDCP). Not implemented on current devices, this system would encrypt pixel data and refuse to deliver data to devices designated as compromised. How this would work in practice I'm not sure, but it sure sounds like another case of the movie industry throwing its weight around. Final publication of the specification is scheduled for January 2003.
The DVI specification was designed by the Digital Display Working Group (DDWG). The DDWG is an open industry group led by Intel, Compaq, Fujitsu, Hewlett-Packard, IBM, NEC and Silicon Image.
The initial specification was released in April 1999.