
The Importance Of Audio Visual Innovations

The history of the audio and video industry is one of innovation. Centuries of invention and genius have provided modern society with an astounding capacity for communication. And much of today’s A/V technology would probably seem like magic to someone beamed in from the 20th century. The march of innovation is still a driving force behind the A/V industry, and it’s truly incredible how many improvements, large and small, are behind every landmark piece of A/V technology. The history of those improvements and innovations is too extensive to summarize in a single go, but there are several developments that deserve attention.

What are some of the most important audio and visual innovations?

Getting to the current state of A/V technology took hundreds and hundreds of small steps, but there are several pieces of A/V equipment and several A/V solutions that form the bedrock of current communication and collaboration options. For practical purposes, they can be split up into audio and video input and output devices, and current high-level A/V solutions.

  1. Video inputs – Video input devices, largely consisting of cameras and connectors, trace their lineage back to well before the 19th century. For nearly two hundred years, inventors refined the camera, improving photographic mediums in ways that added to their versatility, portability and effectiveness. The first successful photograph is believed to have been created in 1816, using a silver chloride-coated piece of paper as the medium. It wouldn’t be until seven decades later, in 1888, that George Eastman pioneered the use of celluloid film and a camera he named the “Kodak.” From there, it would only be a few decades before movie producers adopted the 35mm standard, and reflex lens technology greatly added to a photographer’s arsenal.

    These innovations were essential to the production of video cameras, which are still utilized extensively by the A/V industry. The first wave of color video cameras arrived during the 1950s, mostly in television studios. At first, these video cameras were extremely bulky and hard to handle, but steady improvements to the camera tubes, and eventually a shift to solid-state engineering, made it possible to ferry the cameras anywhere. This was a critical development for the evolution of modern newscasting.

    Camcorders arrived in the 1970s and sparked a struggle between the Betamax and VHS formats. The battle would last about a decade before VHS took over the market, though its victory would obviously be a short-lived one.

    Digital video cameras appeared in the late 1980s, but a lack of effective video compression kept them impractical until the early 1990s, when Ampex released its DCT digital component video format. DCT could store hours of video data on a single tape, and this sparked a wave of innovation by Sony and others. Sony’s efforts in the industry culminated in the creation of high definition video recording and tapeless digital recording. Now, ultra HD resolutions are achievable in camera form factors that are smaller than ever, and 3D video recording is even possible, though the market has not entirely warmed up to it just yet. In the near future, paradigm-shifting innovations like integration with augmented and virtual reality seem inevitable, and they promise to upend the way people collaborate in digital spaces.

    The history of video connectors isn’t quite as storied, but it also reaches back to the 1950s, with the introduction of composite video. Composite dominated commercial and consumer electronics through the 1990s, but has since been replaced by superior connectors. Even VGA, which was introduced in 1987, is losing its presence in the electronics market. HDMI is the current standard and is capable of carrying ultra-HD signals, but other options already exist, like DisplayPort and MHL.

  2. Video outputs – The march of display technology could justify an entire history of its own, as vision has always been people’s dominant sense. It’s no surprise, then, that so many luminaries and so many engineering firms have contributed to the current state of video display equipment.

    Although mankind has developed numerous options for presenting information and narratives throughout history, the best starting point for modern video display technology is likely the cathode ray tube, or CRT. The CRT was first presented in 1897, and would first be utilized commercially 25 years later. At first, CRT displays were extremely limited and could only display monochromatic images, but full color displays were available starting in the 1950s. CRTs are characterized by their generous bulk and their propensity for phosphor burn-out, which leaves permanent marks on the screen.

    Obvious improvements were there to be made, then, and a collection of inventors and firms jumped at the opportunity, including teams at Westinghouse and Sharp Corporation. Their efforts paved the way for liquid crystal display, or LCD, technology. Although CRT displays held their grip on the television and display market for nearly 50 years, LCD products rendered them obsolete before long. LCDs didn’t have any of the CRT’s burn-in issues and were lighter and more reliable. LCD products are still plentiful and the technology will probably be around for a while yet, but it is losing ground to LED displays.

    LEDs really deserve their own section, as their invention and iteration mimics some of the great inventor races of the 18th and 19th centuries. The LED’s initial creation, though, was an accident. In the early 1960s, engineering firms were racing to produce better and better semiconductors. In 1961, two inventors, Gary Pittman and James Biard, accidentally invented the infrared LED while testing a new semiconductor substrate. Only a year later, Nick Holonyak, at General Electric, produced the first visible LED – the classic red LED. Holonyak recognized the potential of LEDs immediately, famously predicting that they would replace other sources of lighting within 50 years. He may have been off just a bit with that timeframe, but not by much.

    From there, other engineering teams expanded the emission range of LEDs by testing a wider swath of semiconductor substrates. In 1972, George Craford produced a yellow LED. In 1993, Shuji Nakamura created the first high-brightness blue LED. Over the years, refinements in semiconductor engineering meant brighter and brighter LEDs, and more subtle color grading. By the 2000s, the stage was set for LEDs to begin engulfing the market, and they did just that. In 2004, Sony introduced the first TV with LED backlighting, and less than a decade later, in 2012, LG unveiled a marvel of a display, an OLED (organic LED) display that required no backlighting, meaning the display could be produced in wafer-thin form factors. The price was a bit prohibitive at launch, but there is still much room to evolve LED manufacturing processes, which means LED display costs will rapidly decrease while their performance rapidly improves.

    In fact, the state of the A/V industry is such that LED displays are a standard addition to nearly every solution.

  3. Audio inputs – Microphones are among the oldest pieces of A/V technology and one of the most important. The first person to refer to microphones as such was Sir Charles Wheatstone, a prominent scientist and inventor in his own right, who coined the term in 1827. It wasn’t until 1876, though, that the microphone was actually invented, by Emile Berliner. Just two years later, David Edward Hughes created the carbon microphone, and it is this model that made mass use of microphones possible. His version of the technology was perfected over several decades, and carbon microphones can be spotted in many old-time photos featuring early 20th-century celebrities, musicians and newsreaders.

    The first condenser microphone was produced in 1916, and it remains an important fixture in studios and broadcast booths to this day. It was the ribbon microphone, produced in 1923, and the shotgun microphone, made in 1963, that would become the dominant audio input options, though. Both the ribbon and shotgun mics allowed for superior directionality, and because they are engineered with relatively simple methods, they are easy to build upon. Modern ribbon microphones, for example, are built with nanomaterials, offering sound quality far beyond anything that has come before.

  4. Audio outputs – There have been scores of audio output devices over the decades, but the very first one wasn’t truly an output device at all. The phonautograph, which looked something like a cement mixer attached to a spinning drum, would visually record sound much as a modern seismometer records tremors, but couldn’t actually play it back.

    It wasn’t until 1877 that the first recognizable piece of audio output technology, the phonograph, was invented. Its successors would later be known as the gramophone, the record player and the turntable. Magnate and empire builder Thomas Edison is the person credited with the breakthrough, as his phonograph could both record and play back sound. Funny enough, the flat disc record wouldn’t be invented for another decade, by Emile Berliner, and that’s when recorded sound really took off.

    In truth, the history of speakers has been one of evolution, not revolution. The central principle behind loudspeaker technology, the moving-coil driver, was developed in 1924 and is still the most important concept in modern speakers. The aesthetics and formats have changed, but for the most part, audio outputs function much like they always have, only with much more precision and quality. By the 1950s, acoustic technology had progressed to the point that stereo recording was possible, which made it feasible to capture complex sounds to a fine degree.

    Modern audio systems are defined by their controllability, and are managed by a series of amplifiers and audio controllers. This is where most of the gains have been made in recent decades, as audio controllers allow for easy and precise management of several audio inputs and outputs at once.


These four components make up the bulk of every A/V solution, whether a company is looking for improved internal collaboration, better branding, or conferencing. Even the most cutting-edge collaboration solutions, which usually employ interactive displays, broad connectivity and sophisticated software, rely on audio and video equipment to function properly.

Of course, it’s how the equipment is organized and configured that makes the difference, which is where A/V integrators come in. With hundreds of years’ worth of technology in their arsenals, integrators are the gatekeepers to a world of sound and video.