The Balancing Act of Stream Quality
One question we get consistently from developers using Red5 Pro is: what are the best settings for my stream to create the best quality experience?
While we wish we could just have one short answer to this question, it is complex, and ultimately becomes a balancing act. Chris Wendt from Comcast did a great presentation this week at the WebRTCBoston Meetup
where he talked about One-to-Many Live Broadcasts with WebRTC. In the talk he showed a diagram of a triangle that he referred to as the "Triangle of Despair," with its three corners labeled Quality, Latency, and Scale. The idea is that you can only pick one side of the triangle at a time: two of the three corners, never all three.
With Red5 Pro’s cloud-based clustering solution we’ve essentially flattened the triangle, so that you, the developer, only need to balance the remaining two factors: the quality of the video versus the speed at which you can deliver it. Let’s call this flattened triangle The Line of Balance. While we think we’ve done a fantastic job of removing the Scale bottleneck with clustering, the other two corners are absolutely still a balancing act.
So with that, let’s take a look at streaming quality and latency in more detail.
Video streams have two properties that determine the quality of a stream: resolution and bitrate. Resolution doesn’t directly determine bandwidth, but it does affect how the video looks. Resolution is measured in pixels: the number of pixels wide by the number of pixels high. Typical streaming resolutions include 1280 x 720 and 1920 x 1080; the higher the resolution, the better the picture quality, provided the bitrate can support it.
On the flip side, bandwidth correlates directly with the bitrate used. Bitrate is measured in bits per second, typically kbps (kilobits per second). It describes the amount of data sent over time, i.e. the bandwidth being used. The higher the bitrate, the better the picture quality. That said, pairing a low resolution with a high bitrate wastes bandwidth: a small image simply can’t make use of the extra bits.
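To make the bitrate-to-bandwidth relationship concrete, here is a small back-of-the-envelope sketch (plain Python for illustration, not SDK code; the function name is ours) converting a constant bitrate into data transferred over time:

```python
def stream_megabytes(bitrate_kbps: float, seconds: float) -> float:
    """Data transferred by a stream at a constant bitrate.

    Bitrate is in kilobits per second (kbps); divide by 8 to get
    kilobytes, then by 1000 to get megabytes.
    """
    return bitrate_kbps * seconds / 8 / 1000

# A 2500 kbps (720p-class) stream moves 18.75 MB of data per minute:
print(stream_megabytes(2500, 60))  # 18.75
```

This is why bitrate, not resolution, is what eats into a subscriber's connection: double the bitrate and you double the bytes on the wire, regardless of the pixel dimensions.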
One factor that affects latency is the protocol used. As we discussed in a previous post on HLS, HTTP-based protocols introduce a tremendous amount of latency simply because of the way they are designed. Red5 Pro’s iOS and Android SDKs use RTP for transport, which allows for streams with latency low enough for video chat, even at huge scale. So for us, the limiting factors are the quality of the stream and the available bandwidth. Simply put, if you don’t have enough bandwidth to push your stream through, playback will be choppy and the experience will be horrible.
According to this study, the average global 4G LTE download speed is 9.3 Mbps. This gives you plenty of room for a high quality stream to pass through without being impacted. However, not every location in the world supports LTE, and even in areas that do, weak signals can greatly restrict the available bandwidth and thus the speed of delivery to your subscribers.
The best way to deal with low-bandwidth scenarios is to add a buffer. A buffer lets the subscribing client catch up by accumulating some of the stream’s bytes before playing them. In high-bandwidth situations you can let very high quality video flow as fast as possible with a very small buffer, but you can’t always predict that your audience will have that much bandwidth, so it’s important to use as large a buffer as makes sense for your use case. If latency isn’t critical, use a 3-second buffer; if you need video-call-style latency, set your buffer very low (0.2 seconds, for example).
In Red5 Pro’s SDKs you can set the buffer property in the R5Configuration class.
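The trade-off the buffer makes can be illustrated with a toy model. This is our own simplified simulation, not how the SDK internals actually work: a client prebuffers some seconds of media, then consumes one second of media per wall-clock second while new media arrives at whatever rate the network allows.

```python
def count_stall_seconds(arrival_kbps, play_kbps, buffer_seconds):
    """Toy model: seconds of stalled playback given per-second arrival rates.

    arrival_kbps: list of measured download rates, one per second
    play_kbps:    the bitrate the stream was encoded at
    """
    buffered = buffer_seconds  # seconds of media waiting to play
    stalls = 0
    for rate in arrival_kbps:
        buffered += rate / play_kbps  # seconds of media received this tick
        if buffered >= 1.0:
            buffered -= 1.0           # played one second smoothly
        else:
            stalls += 1               # not enough media buffered: stall
    return stalls

# A two-second dip to half bandwidth on a 1000 kbps stream:
dip = [1000, 500, 500, 1000, 1000]
print(count_stall_seconds(dip, 1000, 0.2))  # 1  (0.2 s buffer stalls)
print(count_stall_seconds(dip, 1000, 3.0))  # 0  (3 s buffer absorbs the dip)
```

The bigger buffer rides out the dip at the cost of three seconds of added latency, which is exactly the balance described above.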
The Recommended Settings
Now that we’ve got the details out of the way, let’s get to what everyone wants to know: the settings to create the best possible experience for my app’s users. Below, we’ve broken it down into some High-to-Low settings that can work in many scenarios. This is based on a lot of testing and experience in video streaming over the years.
Adaptive Bitrate Publishing
One of the key things to do with Red5 Pro to get the best performance is to use Adaptive Bitrate Publishing.
This will allow the publishing client to adjust to the available bandwidth and send the highest possible quality. When using Adaptive Bitrate Publishing in Red5 Pro, the R5Configuration.bitrate property becomes the highest bitrate the publisher attempts to use. Don’t forget to set the resolution to an appropriate size for the bitrate setting you’ve chosen. Finally, if your app needs low latency, adjust your subscribers to use a low buffer. The lower the buffer, the more jittery the experience will be in low network situations, but note that our SDKs will re-buffer to keep as close to the buffer time you set as possible. In some extremely low bandwidth settings, our SDK and server will adjust to not send video at all in order to keep up.
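The policy described above can be sketched in a few lines. To be clear, this is an illustrative approximation, not the SDK’s actual algorithm, and the `video_floor_kbps` threshold is an invented number purely for the example:

```python
def adaptive_send_bitrate(measured_kbps, max_bitrate_kbps, video_floor_kbps=200):
    """Toy adaptive-publishing policy (not the SDK's real implementation).

    Send at the measured bandwidth, capped at the configured maximum
    (the role R5Configuration's bitrate plays under adaptive publishing).
    Below a floor, drop video entirely and keep only audio, mirroring
    the extreme low-bandwidth behaviour described above.
    """
    if measured_kbps < video_floor_kbps:
        return 0  # audio-only: stop sending video to keep up
    return min(measured_kbps, max_bitrate_kbps)

print(adaptive_send_bitrate(3000, 2500))  # 2500: capped at the configured max
print(adaptive_send_bitrate(800, 2500))   # 800: tracks the available bandwidth
print(adaptive_send_bitrate(100, 2500))   # 0: too little bandwidth for video
```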
Without further ado, here are our recommendations:
Note: if you have Adaptive Bitrate Publishing enabled, choose the Maximum Bitrate setting.
Resolution & Bitrates
Resolution     Maximum      Recommended   Minimum
1920 x 1080    6000 Kbps    4500 Kbps     3000 Kbps
1280 x 720     4000 Kbps    2500 Kbps     1500 Kbps
854 x 480      2000 Kbps    1000 Kbps     500 Kbps
640 x 360      1000 Kbps    750 Kbps      400 Kbps
426 x 240      700 Kbps     400 Kbps      300 Kbps
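If you want to pick a rung from the table programmatically, a simple approach is to choose the highest resolution whose recommended bitrate fits the measured bandwidth. This is a plain Python sketch for illustration, not part of the SDKs:

```python
# (resolution, recommended kbps) rungs from the table above, highest first
LADDER = [
    ("1920 x 1080", 4500),
    ("1280 x 720", 2500),
    ("854 x 480", 1000),
    ("640 x 360", 750),
    ("426 x 240", 400),
]

def pick_rung(available_kbps):
    """Highest rung whose recommended bitrate fits; else the lowest rung."""
    for resolution, kbps in LADDER:
        if kbps <= available_kbps:
            return resolution, kbps
    return LADDER[-1]

# On the 9.3 Mbps average LTE connection cited earlier, 1080p fits easily:
print(pick_rung(9300))  # ('1920 x 1080', 4500)
print(pick_rung(3000))  # ('1280 x 720', 2500)
```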
Lastly, for more on how to set the resolution and bitrate in your app, please take a look at this blog article.