Back in 2004, I contributed an essay to Constance Ledoux Book's Digital Television: DTV and the Consumer, a book detailing the dawn of digital television. I wrote about discovering HDTV sets in my local Walmart Supercenter, declaring that HDTV had gone mainstream. Just seven years earlier, in 1997, the Federal Communications Commission (FCC) had released its Fifth Report and Order, which outlined an aggressive strategy for broadcasters to transition to DTV.
While HDTV should have been the new normal, at least in my eyes, sales were still catching up. The Consumer Electronics Association (CEA) estimated that one out of every four televisions sold in 2004 was an HDTV. That percentage sounds low today, but back then it was huge. The $1,600 average price tag for an HDTV set didn't help sales, but the price had dropped from $2,400 in 2003, according to the Leichtman Research Group.
Another stumbling block: There wasn't exactly an immediate windfall of HD programming. For example, NBC Nightly News was the first national evening news to broadcast in HD—in 2007—while The Simpsons didn't air its first HD episode until 2009, its 20th season. The first Super Bowl broadcast in HD was in 2000 (Super Bowl XXXIV), but Olympics fans had to wait until Beijing in 2008 to experience the entire event in HD.
These days, you can still find new Full HD consumer sets, but choices are limited (and the displays are relatively small). Instead, 4K has taken over the consumer marketplace. Without even taking Black Friday pricing into account, you can get a 55-inch LG 4K smart TV for less than $350 at Walmart. Other brands are even less expensive for larger sets. There are premium models for premium prices, too, but in 2024, you don't need to spend a fortune for 4K.
Maybe the speed of technological advancement is making us all lose a little perspective, but none of this is old. In a couple of years, the first talking picture, The Jazz Singer, will celebrate its centennial. Now that's old—we're talking black-and-white film stock, as well as a record on a turntable connected to the projector to keep the picture and sound in sync.
But HDTV? WRAL-TV in Raleigh, NC, delivered the first U.S. public HDTV broadcast in 1996, while Hawaii's KITV was the first U.S. station to begin commercial digital broadcasts in 1998. And it literally took until 2014 for the networks to make the complete transition to HD. Big Brother was the last holdout in prime time, due to its extensive camera (and related infrastructure) requirements.
Of course, by the time prime time was all HD, Netflix had launched streaming content in 4K. A few years later, the FCC approved NextGen TV (also known as ATSC 3.0), which opened the door to over-the-air 4K, interactive features, and much more. However, the transition to NextGen TV is voluntary, at least so far, so not every broadcaster has made the move.
Today, almost anyone with a halfway decent internet connection can stream tons of 4K content at home. Just a decade ago, it was House of Cards and a couple of nature documentaries. And a decade before that, only a handful of HD programs existed, and you needed a very expensive TV to watch them in high definition.
As we all welcome 2025, it's interesting to think about which current Pro AV technologies will look downright antiquated in a decade. Will we laugh at how little AI was being used? Will HD be a distant memory? Heck, will we be in the middle of an 8K transition? Ten years is not that far down the road—but in terms of technology, what our industry relies on this year might just be considered ancient history.