Viewpoint: Making AI Connections

Joseph D. Cornwall

It may seem quaint from our perspective in 2024, when Spotify alone processes more than 1 billion song streams per day, but in 1983 fewer than 1 million compact discs were sold in the United States. That number quintupled in 1984, and by 1985 annual CD sales had doubled again, to more than 10 million. “Digital” was a word on everyone’s lips; it was the future.

In 1986, I was fresh out of the service, a student of electrical engineering, and had landed my first AV job selling televisions and stereo equipment for a regional department store chain. “Digital ready” was a phrase plastered on nearly everything imaginable in those days. There were “digital ready” headphones, speakers, and amplifiers—all of which were completely analog in design. I remember thinking about how technically incorrect, and even misleading, that marketing speak was. It was my introduction to AV buzzwords.


Back then, I was (and remain to this day) an enthusiastic fan of the TV show Connections. Host James Burke, a British science historian, would start out with an interesting fact from long ago, such as how a Roman watermill and the bubonic plague led to waterpower during the Industrial Revolution. He would then draw a line connecting that to the Gutenberg printing press, the Jacquard loom, and ultimately the punch cards used in early computing. To me this was revelatory, illustrating clearly how the progress of civilization is an interconnected web of seemingly unrelated events that nevertheless build upon each other to create amazing new futures.

What does this have to do with AI? I hope you’ll allow me to make the connection (pun intended).

A Brief History

Once upon a time, in the days of black-and-white television, a computer scientist named Arthur Samuel developed a program to play checkers. This led, a few years later, to mathematician John McCarthy holding a workshop at Dartmouth College on “artificial intelligence,” the very first use of the term.

We, the TV generation, have grown up with the concept of AI and smart machines embedded in the entertainment fabric of our culture. The Six Million Dollar Man took us to The Terminator, who in turn took us to Max Headroom.

Dragon Dictate was a computer program that pioneered speech recognition in the ’80s and early ’90s. James Baker, founder of Dragon Systems, approached speech recognition as a pattern recognition problem. The power of AI is built on pattern recognition and probabilistic outcomes.

Not too much later, in the early 2000s, DARPA funded research into intelligent personal assistants. This was long before the words Siri, Alexa, or Hey Google were uttered in kitchens and living rooms across the country. The PDA fused with telephony, invited a camera to the party, and we suddenly needed predictive, shortcut control over powerful pocket-sized supercomputers.


It’s fair to say that AI has been with us for a long time, much longer even than the compact disc. It’s not news that societal interest in all things AI has surged recently. According to Google, Internet searches for the terms “artificial intelligence” and “AI” have increased twentyfold in just the last few years.


We are bombarded, almost daily, by predictions simultaneously proclaiming that the end of the world will be brought about by aberrant, self-aware computer programs and that those same smart machines will somehow save us from all the things from which we need to be saved. AI has become a pervasive meme, a chameleon that can refer to whatever the marketing department determines will best sell. It’s become the “digital ready” of today.

Where does that leave us?

It's incredibly important for AV professionals to become conversant in the language of AI and its implementation across products and services. It's incumbent upon us to help our clients, colleagues, and customers understand what is really meant by a claim of something having a processor “with AI” and how that technology affects the project.

Model Behavior

Reactive AI is far and away the most common implementation you will encounter. Reactive machines make decisions based solely on the current input, without any memory or consideration of past experiences. They are found in basic game-playing algorithms and real-time control systems where immediate response to input is crucial. There are numerous algorithms and approaches within the realm of reactive machines, each tailored to different tasks and contexts.
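To make the idea concrete, here is a minimal sketch: a reactive machine is simply a function of the present input, with no stored state. The AV scenario, function name, and threshold below are illustrative inventions, not drawn from any real product.

```python
def reactive_gate(level_db: float, threshold_db: float = -3.0) -> str:
    """Decide solely from the current reading; past readings play no role.

    A hypothetical audio gate: mute a channel the instant its level
    exceeds the clipping threshold, otherwise pass it through.
    """
    return "mute" if level_db > threshold_db else "pass"

print(reactive_gate(-10.0))  # pass
print(reactive_gate(0.5))    # mute
```

Because the function keeps no history, feeding it the same input always yields the same decision, which is exactly what makes reactive systems fast and predictable.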


Limited memory machines incorporate a small memory component to store past experiences or observations. This allows them to make slightly more complex decisions by considering recent history alongside current input. They are commonly used in applications where a modest level of context awareness is beneficial. The recommended movie list from your Netflix account and the track recommendations from your Pandora subscription are two examples. They use algorithms that primarily focus on recent usage history and user profiles to provide relevant recommendations.


AV professionals should also be aware of AI learning models. Supervised and unsupervised learning models have very different applications and even greater differences in implementation. Supervised models are trained on labeled data; they plot a kind of map between input and output. An unsupervised learning model learns patterns and structures from input data without explicit supervision or labeled output, aiming to discover hidden relationships or groupings within the data itself. They make connections.
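The distinction can be shown in a few lines. Below is a deliberately tiny sketch, with made-up one-dimensional data: the supervised half copies the label of the nearest labeled example (a one-nearest-neighbor classifier), while the unsupervised half discovers two groupings on its own (a simple two-means clustering), with no labels in sight.

```python
# Supervised: labeled examples let us map input to output.
labeled = [(0.1, "quiet"), (0.2, "quiet"), (0.8, "loud"), (0.9, "loud")]

def predict(x: float) -> str:
    """1-nearest-neighbor: return the label of the closest training point."""
    return min(labeled, key=lambda pair: abs(pair[0] - x))[1]

# Unsupervised: no labels; discover groupings from the data alone.
def two_means(points, iters=10):
    """Tiny 1-D k-means with k=2; assumes the points form two groups."""
    a, b = min(points), max(points)  # initial cluster centers
    for _ in range(iters):
        group_a = [p for p in points if abs(p - a) <= abs(p - b)]
        group_b = [p for p in points if abs(p - a) > abs(p - b)]
        a = sum(group_a) / len(group_a)
        b = sum(group_b) / len(group_b)
    return a, b

print(predict(0.15))                     # quiet
print(two_means([0.1, 0.2, 0.8, 0.9]))   # centers near 0.15 and 0.85
```

Notice that the clustering code never sees the words "quiet" or "loud"; it finds the same two groups purely from structure in the data, which is the "making connections" the paragraph above describes.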

At the pinnacle of AI today we find large generative models like OpenAI’s ChatGPT. Other examples you may have heard of include DeepDream, which creates surreal and artistic images from existing ones; StyleGAN, which is used to create avatars and synthetic images for research; and DALL-E, which generates images from textual descriptions. We may soon see breakthroughs in text-to-video and image-to-video synthesis, too.

AI is a tool in a toolbox that dates back to the very beginnings of the Third Industrial Revolution. It’s a powerful resource that’s intimately intertwined with Moore’s law, predates VCRs, and opens avenues for increased efficiency and unique creativity. AI will impact the path of social evolution in much the same way as the electrical grid did 140 years ago.


We can expect to make new scientific and social connections that will usher in workforce transformation and healthcare advancements, as well as address environmental challenges through climate modeling. It will even advance how people communicate, form relationships, and consume media. I believe the real impact of AI will be its ability to help us to make new connections and imagine the future even faster.

AV Technology Evangelist, Legrand

Technology evangelist Joseph D. Cornwall has been part of the AVIXA faculty since 2010, received the 2014 InfoComm Educator of the Year Award, and was named a member of the SCN Hall of Fame in 2024. His current qualifications and certifications include InfoComm CTS, CTS-D and CTS-I, Imaging Science Foundation ISF-C, ETA Fiber Optic Installer FOI, LEED Green Associate, and DSCE certification. He's created dozens of training programs, nearly all of which have been certified by InfoComm, BICSI, NSCA, and AIA for continuing education credits.