Smartphones are fueling the demand for ever-expanding connectivity. While this statement is by no means shocking (or shouldn’t be, at least), it creates a problem.
For many applications, the delay (latency) between the broadcaster sending out a video and the subscriber actually viewing that content can make for a negative user experience.
There are a few different reasons why low-latency live video streaming is important. Live events need to be broadcast and seen as soon as possible to keep up with the excitement and avoid spoilers. Low latency is the only way to have truly interactive experiences with natural conversation. Above all of that, low latency is what your users want, and the demand for it will continue to grow.
1) Live Events Should Be Live
To stream a live event it must, quite obviously, be broadcast in real time. Participants need to see and hear what is happening as it happens.
Whether a concert or sports game, that electric feeling of experiencing something happening right now is infectious. When done right, the cameras moving through the event will make your viewers more than passive spectators. It will elevate them to participants, enveloping them in the events unfolding around them. However, everything needs to flow naturally in order to create a fully immersive experience.
Of course, you’d like to share that excitement with your friends, family, or followers. Twitter, texting, and the like are instant. This is a double-edged sword in that it is useful for having a conversation, but could also lead to unintended spoilers and the irritation that comes with them. Given our constant connectivity, we should be able to talk to other fans without latency-induced spoilers.
There is no such thing as “kind of live”; it either is or it’s not.
2) Interactivity Is Key

With communication as fast as it is, latency matters more and more. A second or two may not sound like much, but a lot can happen in that time. With today’s constant demand for communication, interactivity is key. When it comes to capturing live events, every second (and partial second) counts.

Interactivity can be broken down into three categories:
2A - What You Hear
You know that Monday morning, pre-coffee haze? That’s what latency is like. It forces you to delay responding to real-time events because it takes longer to process them. These sluggish reaction times decrease effective communication.
A perfect example of this is a conference call. You sit there waiting to interject without being rude. Meanwhile, every time the current speaker finishes a thought, they get an extra moment to think about what to say next while you are still waiting to hear what they said in the first place.
In another example of disruptive latency, when a news anchor switches over to a field reporter or guest speaker, there can be an awkward pause as the guest waits to receive the signal. Even once the conversation gets going, the anchor has to awkwardly cut in to add anything, making for a choppy, back-and-forth exchange.
Any artificially created delay makes for unnatural conversation and a negative user experience.
2B - What You See
Audio isn’t the only thing that gets delayed. Since most applications send video and audio at the same time, both are prone to latency.
For example, you could be controlling your drone from a smartphone. It stands to reason that when something is in flight, it’s fairly important that you know exactly where it is. Streaming from an attached IP camera is great, but if you are navigating based on the delayed footage, what you see on your mobile device is what the drone saw seconds ago. You could even be participating in a drone race, streaming the footage out to thousands of spectators. It may appear that the drone is in one place when it’s actually someplace else, like crashing into a wall.
Or perhaps a famous celebrity wants to broadcast a message to fans. The more people involved, the more confusing a delay can be as questions or comments submitted through text arrive out of context and won’t be responded to in time.
The same goes for video game streaming: the streamer wants to respond to comments as they come in, not seconds later when they are no longer relevant to what’s happening on the screen.
Delay in what you see and hear affects your reaction time, which brings us to our next point.
2C - How Fast You Respond
It’s not hard to imagine the problems that would arise if you took a second or two to respond to everything you saw or heard. It would certainly make driving… challenging to say the least.
Perhaps you are at a live auction or looking to expand your existing auction house to an online audience. The only way for this to be effective is to ensure that the bidders in the physical auction house won’t have an advantage over online bidders. The straightforward solution to this is to ensure the lowest latency possible.
Furthermore, when it comes to emergencies and public safety, quick response time is very important. Whether using drones to monitor fire suppression or police using body cameras, there is little tolerance for lag and falling out of sync when collaborating on rescue missions or conducting exercises.
For example, take the work that Novetta does for the Department of Defense, Federal Law Enforcement, and the Intelligence community. They provide a command center for intelligence gathering and situational control. Through integrated chat services, event/report sharing, and remote sensor control, hundreds of simultaneous operators can work together to quickly understand what is happening around them in their theatres of operation.
Needless to say, low latency is pretty important.
3) Audiences Want Low-Latency
Live video is urgent and exciting. It reaches out, grabs the viewer’s attention and (with effective interactivity) keeps them on your platform. According to Tubular Insights:
“Viewers spend 8X longer with live video than on-demand: 5.1 minutes for on-demand vs. 42.8 minutes for live video content.”
Take, for example, online games such as the very popular HQ Trivia. By reworking a tried-and-true game-show format and incorporating social elements, it created a product that drew a very large audience. However, it only became popular because it worked well. Those multiplayer interactions could not happen without low latency.
As the market continues to grow and innovate, the technology used to build live-streaming applications must accommodate new uses and ever-rising client or consumer expectations. Real-time latency is the only way to create truly interactive experiences. As competitors inevitably adopt real-time live streaming, it only takes a second to lose customers.
How Can You Get Real-Time Latency?
The simple answer is to use Red5 Pro. If you want to dig deeper and discover how we accomplish low latency at scale, check out our post on five ways to reduce latency in streams.
By integrating WebRTC into a proprietary solution, we have brought our latency down to 500 milliseconds or less. That ensures your broadcasters and subscribers are seeing the same thing at nearly the same moment, allowing for the best experience your users can achieve.
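To see why sub-second latency requires a different approach than traditional segment-based streaming, a rough back-of-the-envelope comparison helps. The sketch below is purely illustrative: the function name and the default encode/network figures are assumptions for the example, not Red5 Pro measurements. Segment-based protocols such as HLS deliver video in multi-second chunks, and players typically buffer several chunks before playback, so latency stacks up quickly; WebRTC delivers individual frames, so its delay is dominated by encoding, the network, and a small jitter buffer.

```python
# Rough glass-to-glass latency estimate for segment-based streaming (e.g. HLS).
# The encode/network defaults are illustrative assumptions; real values vary
# with the encoder, CDN, and player configuration.
def segmented_latency_s(segment_duration_s, buffered_segments,
                        encode_s=1.0, network_s=0.5):
    """Players typically buffer several full segments before starting playback,
    so each buffered segment adds its full duration to the delay."""
    return encode_s + network_s + segment_duration_s * buffered_segments

# Traditional HLS-style setup: 6-second segments, player buffers 3 of them.
hls_latency = segmented_latency_s(6.0, 3)
print(f"Segment-based latency: ~{hls_latency:.1f}s behind live")

# WebRTC has no segment buffering term at all: frames are sent as they are
# encoded, so total delay is roughly encode + network + jitter buffer,
# typically in the hundreds of milliseconds rather than tens of seconds.
```

Shrinking the segments helps (`segmented_latency_s(2.0, 3)` is far lower than the 6-second case), but as long as whole segments are buffered, the latency floor stays in the seconds, which is why frame-level delivery is needed for real-time interactivity.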
Not only is what we’ve created fast, it’s also completely customizable, so it can be integrated into an existing application.