How Do We Make Video Over IP Work With Less Latency?

Content creators and organizations alike are pursuing low latency video streaming and broadcasting in full force, and there are plenty of situations where a low latency stream is desirable, even necessary. Creators on platforms like YouTube and Twitch want their audience to receive the live stream with as little delay as possible, since that delay shapes the back-and-forth between host and audience. The same dynamic is essential for businesses, which may rely on video streaming for training or for high-priority announcements to company staff. Then there’s sports, where live broadcasting of an event must be as efficient as possible, or viewers will read what’s happening in a game long before they see it on screen. That’s a problem.

In some applications, low latency is not just wanted but needed. The military, for example, relies on remote video to make life-and-death decisions, and a delay there may come at a human cost. Precision manufacturing processes, like those found in the automotive industry, pair video with automated systems to maximize output; the lower the video latency, the faster those systems can respond.

Low latency, then, is the goal for a huge variety of clients, but unfortunately, it’s not a simple problem to solve. With so many steps between image capture and display, delays can creep in at any point in the process. There are general measures that will reliably cut out some of the latency, and this is often enough, but it’s usually best to have a video delivery specialist, like an AV integrator, assess and improve a company’s video production and transmission capabilities.

Where does latency come from?

From camera to screen, video passes through several processing stages, and each one adds a bit of latency, though some contribute only negligible amounts of lag. Stages like network processing and, usually, video compression and decompression can be tweaked to an extent, but they are rarely the focus of a low latency effort. As long as a modern compression standard such as H.264 is chosen, compression and decompression won’t usually pose much of a problem. What does pose a problem is how the video data is buffered and segmented. This, fortunately, is something companies have control over: how the network buffers data, and how that data is segmented and signaled, has major implications for how quickly the video stream arrives and in what condition.
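To make that pipeline concrete, here is a minimal back-of-the-envelope sketch in Python that sums per-stage delays from capture to display. The stage names and millisecond figures are illustrative assumptions, not measurements from any particular system.

```python
# Illustrative end-to-end latency budget for a video-over-IP pipeline.
# Every figure below is an assumed placeholder, not a measurement.
stages_ms = {
    "capture":            17,   # roughly one frame at 60 fps
    "encode (H.264)":     30,
    "network transit":    50,
    "segmenting/buffer": 100,   # often the largest, most tunable stage
    "decode":             20,
    "display":            17,
}

for stage, ms in stages_ms.items():
    print(f"{stage:<20} {ms:4d} ms")
print(f"{'glass-to-glass total':<20} {sum(stages_ms.values()):4d} ms")
```

Even with generous assumptions, the buffering and segmenting stage tends to dominate the total, which is why the approaches below concentrate there.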

Reducing lag and drag

As with so many things, optimizing video latency and quality involves tradeoffs. It’s difficult to consistently achieve both extremely high-quality video and extremely low latency. It is certainly possible, as long as a business has plenty of bandwidth to spare and is willing to dedicate that bandwidth to video content delivery, but realistically, most will need to find a balance. This is an important note: achieving low latency video streaming is difficult without expert guidance. In a single video delivery solution, there is a lot to think about. How the network is configured, how much bandwidth is available, which codec is used, which bitrate options are offered, which transmission protocol is selected, and how data segmenting and buffers are configured (and there’s more than one buffer to consider) all affect the final result. It’s good practice to treat these elements in concert, as a single solution optimized toward a single goal. Changing just one part of the process may achieve lower latency, but the fix might not be reliable or may produce unwanted side effects. That said, here are a few standard approaches to reducing latency:

1. Reduce the buffer’s read-ahead time boundary – Buffers serve an essential purpose for a video stream, even though they are a source of dread for anyone bingeing on YouTube or Netflix. A buffer stabilizes the stream by providing temporary storage for any data that is delayed during transmission. The internet is a tangled mass of communication routes, and when video data is sent to a delivery service, each piece of it may be channeled along a different route. This inevitably results in some video data arriving sooner than other pieces, and that variation in arrival times is known as “jitter.” Left uncorrected, jitter causes major drops in quality and stream fluidity. A jitter buffer gives slow data a little extra time to catch up. However, the jitter buffer produces its own latency: if it is allowed to wait too long for late data, it adds noticeable delay in exchange for only marginal improvements in smoothness.

Reducing this time boundary will cut some latency out, but it does increase the risk of quality drops. Again, it’s a tradeoff. If a company can achieve excellent transmission speeds, though, the risk could be very small.
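To illustrate the tradeoff, the toy Python simulation below models a stream whose packets arrive with random network jitter and counts how many miss their play deadline at different read-ahead boundaries. The jitter profile and frame timing are assumed values for demonstration, not a model of any real network.

```python
import random

def simulate(read_ahead_ms: float, n_packets: int = 2000,
             frame_ms: float = 33.3) -> float:
    """Toy model: packet i is due at i*frame_ms; the buffer plays it at
    i*frame_ms + read_ahead_ms. A packet whose random network delay
    exceeds the read-ahead margin arrives too late and is dropped.
    All timing values are assumed for illustration."""
    drops = 0
    for i in range(n_packets):
        network_delay = max(random.gauss(mu=40, sigma=15), 0)  # assumed jitter
        arrival_ms = i * frame_ms + network_delay
        deadline_ms = i * frame_ms + read_ahead_ms
        if arrival_ms > deadline_ms:
            drops += 1
    return drops / n_packets

random.seed(7)  # reproducible demo output
for boundary_ms in (40, 60, 80, 100):
    rate = simulate(boundary_ms)
    print(f"read-ahead {boundary_ms:3d} ms -> {rate:6.1%} of packets too late")
```

Under these assumptions, shrinking the boundary from 100 ms to 40 ms saves 60 ms of latency but sharply increases the share of packets that arrive too late to play, which is exactly the quality risk described above.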

2. Configure HTTP adaptive streaming – Video streaming over HTTP (e.g., HLS or MPEG-DASH) naturally carries more lag than streaming over RTSP or RTMP. However, it is expensive and difficult to scale RTSP or RTMP to reliably accommodate 100 or more viewers. If a company’s streaming needs fall below that threshold, much lower latencies can be achieved with RTSP or RTMP. Realistically, though, HTTP is almost always going to be the recommendation.

Fortunately, a lot of work is being done to make HTTP streaming more efficient. Video content is delivered over HTTP in segments, each containing several seconds of video. Reducing the length of these segments mitigates latency, as smaller pieces of data can be fired off more rapidly. Segments of two to four seconds are usually preferred, as this preserves encoding efficiency while still allowing the stream to respond to fluctuations in bandwidth.
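A quick way to see why segment length matters: players typically buffer a few segments before starting playback, so live delay scales with segment duration. The sketch below works through that arithmetic, assuming a three-segment player buffer and a rough per-segment publishing overhead; both figures are placeholders, and real player defaults vary.

```python
def live_delay_s(segment_s: float, buffered_segments: int = 3,
                 publish_overhead_s: float = 1.0) -> float:
    """Rough lower bound on how far behind live a viewer sits with
    segmented HTTP streaming: the player holds `buffered_segments`
    full segments, plus encode/publish overhead. Assumed model."""
    return segment_s * buffered_segments + publish_overhead_s

for seg in (10, 6, 4, 2):
    print(f"{seg:2d} s segments -> roughly {live_delay_s(seg):4.0f} s behind live")
```

Under this simple model, moving from ten-second segments to two-second segments cuts the live delay from about half a minute to several seconds, which matches the intuition that smaller segments can be fired off, and consumed, faster.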

3. Compromise on lower bitrates, when needed – Corporate video streams are often delivered to a variety of devices, including mobile devices, which often rely on less robust connections to receive the stream. To ensure every viewer has a smooth streaming experience, it’s wise to adopt adaptive bitrate streaming (ABR).

With ABR, the stream is made available at several bitrates, so a viewer can drop to a lower quality image when needed to preserve frame rates. In many cases, the loss in quality is minor compared to the benefit of maintaining high frame rates. To provide ABR, the streamer will need to transmit several quality levels at once to the video distribution service. Alternatively, the distribution service may handle this on its end, as long as the stream provider supplies a single high-quality stream; quality can always be reduced downstream, but it can never be added back. Not every video distribution service offers this, though, so be sure to check.
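As a rough illustration of the client-side decision in ABR, the Python sketch below picks the highest rendition from an assumed bitrate ladder that fits within measured throughput, with a safety margin. Production players (hls.js, dash.js, and the like) use far more sophisticated throughput estimators; the ladder and margin here are hypothetical.

```python
# Assumed bitrate ladder (kbps -> label); real ladders come from the packager.
LADDER = [(800, "480p"), (1800, "720p"), (4500, "1080p"), (8000, "4K")]

def choose_rendition(measured_kbps: float, safety: float = 0.8) -> str:
    """Pick the highest rendition whose bitrate fits within a safety
    margin of the measured throughput; fall back to the lowest rung."""
    budget = measured_kbps * safety
    best = LADDER[0][1]
    for bitrate, label in LADDER:
        if bitrate <= budget:
            best = label
    return best

for throughput in (900, 2500, 6000, 12000):
    print(f"{throughput:5d} kbps measured -> {choose_rendition(throughput)}")
```

The safety margin is the key design choice: selecting a rendition that consumes all measured bandwidth leaves no headroom for throughput dips, which would force rebuffering and defeat the purpose of the scheme.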

These are just the basics, and an AV integrator can go into much greater depth when assessing a client’s video streaming capabilities. The essential point is that low latency is possible, and when approached properly, it can be secured without sacrificing quality.