Understanding HDMI Connections in Commercial AV
Having been in the commercial audiovisual industry for nearly 30 years, we have witnessed the birth, and subsequent demise, of numerous trends in technology. But one trend that emerged in the early to middle ’90s is still with us today: the concept of a “commercial”-type of device versus a “consumer”-type of device. This has led to a schism in the product lines of any manufacturer that addresses both the traditional consumer (i.e., home theater) marketplace and that of professional or commercial audiovisual. Even though the displays may seem somewhat similar on the surface, there is still a significant difference between them, which becomes apparent if you investigate beyond the obvious cosmetic disparity into the construction, features, and ultimately the purpose of the device. Nowhere is this more noticeable than on display devices such as flat panels or projectors. And the introduction of the HDMI standard into the consumer AV world, and the migration of that standard into parts of the commercial AV world, is a case in point of this convergence.
In the early days of commercial audiovisual, when the very concept of the market’s existence was new, there was little difference between the products a commercial systems integrator would sell and those a home theater integrator could provide. The reality was that the products themselves were so new, and the technologies so early in the product cycle, that even though televisions were hardly new and audio equipment such as speakers and amplifiers was certainly not new, the uses they were being put to were so novel that the market was relatively tiny. This meant any manufacturer would have been happy to sell each unit, regardless of where it was going. The features were irrelevant!
This changed very rapidly as the market for commercial product matured and the home theater (often called residential, or generally consumer) market grew to the point where product needed to diverge into two disparate paths. Consumer manufacturers focused on inexpensive, high-volume production runs to bring the selling price down and to provide the features needed in the average living room. Commercial applications, on the other hand, had evolved, and manufacturers began to understand the need for a different grade of product, made to withstand the rigors of the growing commercial market and its truly different environment.
In the very early days of the ’90s, when this divergence in product began, the big differentiator was simply whether the display device featured a tuner and was sold as a “television,” or lacked the tuner and was classified as a “monitor.” Otherwise, the products were largely the same. However, the radically different environment, and the demands placed on displays, caused an evolution in features, separating the consumer and commercial product lines by a much larger gap. The first realization was that a display designed for use in the home was intended to be on 4-6 hours per day, with long periods of being off in between. Cooling, cabinet design, and power supplies were built around this concept. However, in most commercial applications this would not be the case. Displays could be on 8, 10, or even 24 hours straight. They could be turned off and on rapidly, with little to no rest time in between periods of use. This brought about a change in cooling design and a redesign of the power supply. This is even more pronounced on modern flat panels, where the entire cabinet is designed for maximum durability and 24/7 use, rather than around aesthetics.
Commercial display on-screen menus were optimized for features not commonly needed in the home, such as display calibration. As serial control protocols evolved, these were typically added to commercial-grade panels and not consumer ones (outside of high-end home theater enthusiast product) to facilitate integration into control systems in conference rooms. Warranties were also adjusted, with far more coverage offered on commercial product due to its more rigorous use (typically 1+ years on site, versus 90 days to 1 year over the counter for consumer). Video inputs were still very frequently the same, with composite video (and then the newer, higher-quality S-Video) available on both types of product: standard RCA connectors appearing on consumer-grade product, and locking, broadcast-type BNC connectors on commercial-grade product. Oftentimes, consumer-grade products would offer additional video inputs just to accommodate more sources (commercial product not needing this, due to the assumption that an outboard switching device would be utilized). Audio was also often a big differentiator, with consumer TVs having onboard speakers and early commercial product lacking them (again, with the assumption this could be added outboard if required). Ultimately, this was where the early days of commercial product appeared to settle… until the computer revolution occurred.
Enter Computer Video
A new quandary was facing the display device manufacturers: How do we address computer video? In the earliest days of home computers, the computer would often have an RF or composite video output to connect to a home television. Early computer monitors were just small, single-color CRTs, so this was a radical improvement for the early adopters who brought home the new computer technology. But as computer technology evolved, and its graphical demands grew, video technology evolved along with it.
This came to a head in 1987, when IBM introduced its PS/2 computer with the new VGA connector for computer video. VGA was an adaptation of component video: the breaking down of the parent video signal into separate channels for red, green, and blue, plus horizontal and vertical sync. This approach had already been introduced in broadcast and commercial video as an improved way of transmitting analog video without compositing the signal, thus improving image quality, and it was done using five separate signal wires terminated in BNC connectors. IBM adapted it for use on computers by incorporating it, along with monitor information channels, into what is now the ubiquitous 15-pin HD D-sub connector we have come to regard as synonymous with analog computer video, or VGA.
This meant that while VGA was electrically compatible with component video, it was distinctly different in resolution and refresh rate, which required display devices designed to handle its signal requirements. And as the commercial use of PCs evolved and multiplied, demand grew for display devices beyond computer monitors, such as large-screen monitors and projectors, that could handle those signals. The earliest way to do so was to use what is known as a “scan converter,” a device that could adapt the computer refresh rate and resolution to that of a standard component video signal, or even downgrade it to run over S-Video or composite video. This was merely a stopgap measure, and manufacturers began to incorporate something that now seems commonplace: a computer video input capable of handling the ever-increasing resolution demands of computer video. When consumer TV was still standard definition, well before high definition was even a concept, computer resolutions were already 640x480 or above.
A single HDMI connector can do the work of any number of legacy connectors of different formats.
This need for computer inputs added another difference between the inputs on a consumer versus a commercial display. Consumer panels normally had no computer video resolution support, whereas commercial panels would have the appropriate input and support for a large range of resolutions. This remained static for many years. In 1999, a new video standard for computers was introduced to take advantage of the digital nature of modern flat panels: DVI, or Digital Visual Interface. It sprang from the understanding that VGA was inherently flawed. The video signal in the computer was produced as a digital signal, converted to analog, and then sent to the display, where it was converted back to digital. This digital-to-analog/analog-to-digital conversion resulted in a lower-quality picture. But the advent of digital-only, fixed-matrix flat panels caused a re-evaluation of how video inside a PC was handled. The DVI standard was introduced by the Digital Display Working Group, based on a technology from Silicon Image, Inc. This completely new connector and signal would be digital only, and would allow for a much higher-quality video signal than VGA. It was swiftly incorporated into early desktop flat panel monitors and projectors, but eventually migrated to all commercial displays, again because of the need for computer video acceptance on large displays.
But this led to a peculiar adoption of the technology by consumer video manufacturers. As computer input requirements raged on the commercial side of the industry, the consumer side evolved at a slightly slower pace. The 5-wire component video of the commercial side was never really adopted by the consumer, but a 3-wire alternative, still in use today, YPbPr component, was adopted to allow for higher resolution, and ultimately HD, video. However, the manufacturers of consumer video sources speculated that an all-digital, high-quality video stream such as DVI could be used by DVD players, set-top boxes, and similar sources as a way to improve their quality as well. This brought about what was to be a very short period when DVI was added to some consumer TVs and DVD players, using the standard DVI connector but carrying the digital version of standard TV video signals. It was not going to last, sadly, as the content producers objected to the consumer having a digital, high-quality, but unsecured stream of video, potentially able to be used for piracy. So the content production industry pushed the concept of copy protection, known as HDCP (High-bandwidth Digital Content Protection), whereby a source must verify and “handshake” with a chip inside the display to ensure that the DVI video is not used for illegal purposes. This led to major problems for early adopters of this type of consumer video, as different versions of the HDCP and DVI standards were available, and not always compatible; frequently, commercial-grade panels would have the DVI ports, but not the HDCP capability needed to work with these sorts of video signals. This led to another evolution of digital, copy-protected consumer video technology: HDMI.
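To make the idea of that handshake concrete, here is a deliberately simplified sketch, in Python, of the pattern HDCP follows: source and sink each hold provisioned key material, both derive a short verification value over a session nonce, and the source only transmits protected content when the values match. The HMAC construction, variable names, and key values below are illustrative stand-ins, not the actual HDCP key schedule or cipher.

```python
# A deliberately simplified toy model of the handshake pattern described
# above. Real HDCP uses device key sets, key selection vectors (KSVs), and
# a defined key schedule; the HMAC below is only a stand-in for that math.
import hashlib
import hmac
import os

# Stand-in for the key material a licensed source and sink both end up
# deriving; in real HDCP each side computes it from its own device keys
# and the other side's KSV.
PROVISIONED_SECRET = b"licensed-device-key-material"

def check_value(secret: bytes, nonce: bytes) -> bytes:
    # Both ends derive a short verification value from the shared secret
    # and a session nonce; matching values prove the sink holds valid keys.
    return hmac.new(secret, nonce, hashlib.sha256).digest()[:8]

def source_authenticates(sink_secret: bytes) -> bool:
    nonce = os.urandom(8)                          # plays the role of HDCP's An
    r0 = check_value(PROVISIONED_SECRET, nonce)    # computed by the source (R0)
    r0_prime = check_value(sink_secret, nonce)     # computed by the sink (R0')
    return hmac.compare_digest(r0, r0_prime)

# The source only sends protected content once the handshake succeeds.
if source_authenticates(PROVISIONED_SECRET):
    print("handshake OK: transmit protected video")
else:
    print("handshake failed: blank or downgrade the output")
```

The real protocol also exchanges key selection vectors and periodically re-verifies the link, which is why version mismatches between early DVI/HDCP devices caused the compatibility headaches described above.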
HDMI Arrives
HDMI was originally developed by Hitachi, Matsushita Electric Industrial (Panasonic), Philips, Silicon Image, Sony, Thomson (RCA), and Toshiba. Digital Content Protection, LLC provides HDCP (which was developed by Intel) for HDMI. HDMI has the support of motion picture producers Fox, Universal, Warner Bros., and Disney, along with system operators DirecTV, EchoStar (Dish Network), and CableLabs. The HDMI founders began development on HDMI 1.0 on April 16, 2002, with the goal of creating a new type of AV connector that was backward-compatible with the DVI being used on televisions at that time. The first draft of HDMI, HDMI 1.0, was designed to improve on DVI by using a smaller connector (similar to USB) and adding support for audio, enhanced support for color, and consumer electronics control functions. HDMI has evolved today into the current 1.4 standard, still using the same connector, with added features, improved resolution, and increased bandwidth.
According to In-Stat (a leading industry research firm), the number of HDMI devices sold was 5 million in 2004, 17.4 million in 2005, 63 million in 2006, and 143 million in 2007. It was in 2007 that HDMI came to be considered the de facto standard for HDTVs; according to In-Stat, around 90 percent of digital televisions in 2007 included HDMI. In-Stat estimated that 229 million HDMI devices were sold in 2008 and 394 million in 2009. On January 7, 2009, HDMI Licensing, LLC announced that HDMI had reached an installed base of over 600 million devices. In-Stat also estimated that by the end of 2010, every digital television would have at least one HDMI input. HDMI has had a meteoric rise to dominance in consumer video.
This doesn’t mean HDMI has no place in the commercial realm. Far from it: HDMI, being the de facto standard for video sources such as Blu-ray players and satellite boxes, must be incorporated into commercial displays just to allow those sources to be used. It has the same benefits there as it does for consumers: easy installation, reduced cable count, and high-quality video. However, there are other reasons it was initially adopted into commercial display devices. Resolution and bandwidth are immediate considerations. HDMI is capable of over 10 gigabits per second, outstripping even the most advanced analog, VGA, or DVI connection by far; it is surpassed only by the uncompressed Serial Digital Interface (SDI) rarely seen outside broadcast-grade equipment. HDMI also offers incredible resolution. The maximum pixel clock rate for HDMI 1.0 was 165 MHz, which was sufficient for supporting 1080p and WUXGA (1920×1200) at 60 Hz. HDMI 1.3 increased that to 340 MHz, which allows for higher resolutions (such as WQXGA, 2560×1600) across a single digital link, as the quick calculation below shows. HDMI also allows for a critical increase in color reproduction and accuracy, up to 48-bit color, again well beyond anything outside the broadcast level. And HDMI offers built-in control channels, which, while useful for consumers, also add key benefits in the areas of videoconferencing and digital signage, providing a way to automatically synchronize (lip sync) video and audio.
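For readers who want to check those bandwidth figures, here is a rough back-of-the-envelope calculation assuming only what the specification defines: three TMDS data channels, each transmitting 10 bits per pixel clock (8 data bits plus TMDS encoding overhead). The script and function name are ours, purely for illustration.

```python
# Rough TMDS bandwidth check for the pixel clocks quoted above.
# HDMI carries video on three TMDS data channels; each channel moves
# 10 bits per pixel clock (8 data bits plus TMDS encoding overhead).

def tmds_bandwidth_gbps(pixel_clock_mhz: float) -> float:
    channels = 3         # TMDS data channels 0, 1, and 2
    bits_per_clock = 10  # 8 bits of data expand to 10 transmitted bits
    return pixel_clock_mhz * 1e6 * channels * bits_per_clock / 1e9

print(tmds_bandwidth_gbps(165))  # HDMI 1.0: ~4.95 Gbps (1080p / WUXGA @ 60 Hz)
print(tmds_bandwidth_gbps(340))  # HDMI 1.3: ~10.2 Gbps (WQXGA and beyond)
```

At the HDMI 1.0 clock of 165 MHz this works out to roughly 4.95 Gbps, and at the HDMI 1.3 clock of 340 MHz to roughly 10.2 Gbps, which is where the “over 10 gigabits per second” figure comes from.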
One last major factor has driven the adoption of HDMI as a standard connector for commercial monitors: once again, the effect of the almighty computer. Computers are now even more ubiquitous in the workplace as collaboration and communication tools, and video card manufacturers have embraced HDMI in a big way, giving display manufacturers, who must support their customers’ computers, little choice but to integrate HDMI.
AMD, Dell, Intel Corporation, Lenovo, Samsung Electronics LCD Business, and LG Display have announced intentions to accelerate adoption of scalable, lower-power digital interfaces such as HDMI into the PC. Intel and AMD expect that analog display outputs such as Video Graphics Array (VGA) will no longer be supported in their product lines by 2015; Intel plans to end support of VGA in 2015 in its PC client processors and chipsets. HDMI has increasingly been included in new PCs for easy connection to consumer electronics devices. HDMI allows for slimmer laptop designs and supports higher resolutions with deeper color than VGA, a technology that is more than 20 years old. Additionally, as laptops get smaller and their embedded flat panel resolutions increase for more immersive experiences, the power advantages, bi-directional communications, and design efficiency benefits of HDMI make it a superior choice.
“We live in a digital-rich world and display technology must keep up with the explosion of digital content,” adds George He, chief technology officer, Lenovo. “By transitioning to digital display technologies like HDMI, customers can not only enjoy a better computing experience, they get more of what’s important to them in a laptop: more mobility, simplified design with fewer connectors, and longer battery life.”
Leading display panel manufacturers such as Samsung Electronics LCD Business and LG Display also are in strong support of this transition. “Samsung Electronics LCD Business is already supporting this transition with embedded HDMI notebooks, which we have been shipping since March of this year,” says Seung-Hwan Moon, vice president of engineering, LCD Business, Samsung Electronics.
“LG Display is fully prepared for this future transition. We already have different sizes of LCD panels with HDMI out in the market to fulfill various needs of customers,” offers Michael Kim, vice president of IT Product Planning Department at LG Display.
The strong value proposition of HDMI as a digital display interface for PC users, coupled with industry innovation around these interfaces, means that HDMI will be adopted as a standard for PCs, as well as video. This means that to adapt to the demands of the business user, who will be bringing HDMI-equipped displays into the office or into the field, the market will continue to push display manufacturers into using HDMI in place of other video connectors.
Alan Brawn (alan@BrawnConsulting.com) and Jonathan Brawn (jonathan@brawnconsulting.com) are principals of Brawn Consulting LLC, an audiovisual and IT consulting, educational development, and market intelligence firm with national exposure to major manufacturers, distributors, and integrators in the industry.
HDMI: Beyond CE
By Steve Venuti, President of HDMI Licensing
As the de facto standard for digital interconnect, the HDMI interface has become so pervasive in CE (consumer electronics) devices that the applications for the technology have grown far beyond the home theater stack. We have seen HDMI in mobile devices, cell phones, and even automobiles. We are also seeing the HDMI interconnect make its way into commercial applications, including the hospitality and medical industries, as well as, more broadly, the digital signage market.
Why is HDMI technology so appealing for commercial applications?
- HDMI connectors are pervasive: The HDMI connector is on so many devices, and so many categories of devices, that standardizing on HDMI technology is the most cost-effective way to use off-the-shelf CE devices in many commercial applications.
- Support for advanced features: The HDMI specification is forward looking and continues to evolve in order to support new industry and application requirements. For example, long before 4K panels were readily available, the HDMI specification announced its support for 4K resolutions in 2009.
- CEC (Consumer Electronics Control): CEC is a set of commands (some mandatory and some optional) that allows system-wide command and control to travel across the HDMI interconnect. For the end user, this means that pressing the play button on a Blu-ray player will automatically send a set of commands to the AVR and HDTV, alerting those devices that they need to turn on and letting them understand exactly what kind of content is being transmitted so that they can auto-configure to the optimal aspect ratio, resolution, sound format, etc. (A minimal sketch of this sequence appears after this list.) CEC has great value in commercial applications as well. The hotel industry uses CEC to ensure that the in-room entertainment system has a common set of commands and UI so that customers can easily understand how to navigate the in-room system. Retailers are using CEC to save millions of dollars by having the system automatically power down when not in use and only power up the devices that are required.
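As a minimal sketch of what happens when that play button is pressed, the Python snippet below builds the two CEC frames of the “One Touch Play” sequence: an Image View On message addressed to the TV, followed by a broadcast Active Source message carrying the sender’s physical address. The logical and physical addresses are example values, and the snippet only constructs the frames; actually transmitting them would require a CEC-capable adapter and driver, which is not shown.

```python
# Illustrative construction of the two CEC frames behind "One Touch Play".
# The logical/physical addresses are example values; this snippet only
# builds the frames and does not transmit them.

TV = 0x0          # logical address of the TV
PLAYBACK_1 = 0x4  # logical address of a Blu-ray player (Playback Device 1)
BROADCAST = 0xF   # broadcast destination

OPCODE_IMAGE_VIEW_ON = 0x04
OPCODE_ACTIVE_SOURCE = 0x82

def cec_frame(initiator: int, destination: int, opcode: int, *operands: int) -> bytes:
    # The CEC header byte packs the initiator in the high nibble and the
    # destination in the low nibble, followed by the opcode and operands.
    header = (initiator << 4) | destination
    return bytes([header, opcode, *operands])

# Example physical address 1.0.0.0: the player sits on the TV's HDMI input 1.
physical_address = (0x10, 0x00)

# Step 1: wake the TV (directly addressed).
image_view_on = cec_frame(PLAYBACK_1, TV, OPCODE_IMAGE_VIEW_ON)

# Step 2: announce ourselves as the active source (broadcast), so the TV and
# any AVR in the chain switch to this input.
active_source = cec_frame(PLAYBACK_1, BROADCAST, OPCODE_ACTIVE_SOURCE, *physical_address)

for name, frame in (("Image View On", image_view_on), ("Active Source", active_source)):
    print(f"{name}: {frame.hex(' ')}")
```

In a hotel or digital signage deployment, the same frame structure carries the Standby command (opcode 0x36) to power devices down, which is how the automated energy savings mentioned above are achieved.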
Are there challenges in using the HDMI interconnect for commercial applications? We hear how difficult it is to run an HDMI signal over long lengths. It is true that the HDMI standard passes uncompressed video signals, which at high resolutions means rapid attenuation; then again, the quality over the HDMI standard is pristine. Nonetheless, many commercial applications need long runs, and passive copper cables hit their limit at around 10 meters. Many licensees of the HDMI specification have solved this issue by manufacturing active cables that can boost the signal to longer lengths, or baluns (also called extenders) that convert the signal to run over CAT5/6, fiber, or other cabling options. Another issue is the lack of a locking connector: we have heard for years how a locking connector is required for most commercial applications. The specification does not address this issue; however, some of our licensees have solved the problem by designing locking connectors that are fully compliant with the specification and backward-compatible with standard HDMI connectors.