In our iOS and Android SDKs, the `R5Configuration` object exposes a property called `buffer_time`. It might not do exactly what you would think. First of all, the property behaves differently on a publisher client than it does on a subscriber. So let’s take a deep dive into the `buffer_time` property and gain a better understanding of how to use it to improve your streaming.
When setting up a configuration for your publishing client, the `buffer_time` property refers to the amount of time you allow packets to back up before flushing them. In the best case, you are on a very good network and packets go out as soon as the encoder creates them. Real-world network connections, however, are far from perfect, especially for mobile clients. When the network hits congestion and the queue of packets grows, you can either dump the queued packets or keep them around and let them continue to back up. This is what the `buffer_time` property controls: it sets, in seconds, how long you hold on to packets in the queue. The larger the value, the longer packets are kept around; the smaller the value, the sooner they are dumped.
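The publisher-side behavior can be sketched as a simple queue that drops packets once they have waited longer than the configured window. This is a toy illustration of the flushing logic only; the class and method names are invented for this example and are not the actual SDK internals.

```java
import java.util.ArrayDeque;

// Toy model of a publisher's outbound packet queue.
// Packets older than bufferTime (seconds) are dumped instead of sent.
// Illustrative only -- not the real R5Configuration implementation.
public class PublisherQueue {
    private final double bufferTime;                              // seconds to hold back packets
    private final ArrayDeque<Double> queue = new ArrayDeque<>();  // packet capture timestamps

    public PublisherQueue(double bufferTime) {
        this.bufferTime = bufferTime;
    }

    public void enqueue(double timestamp) {
        queue.add(timestamp);
    }

    // Dump packets that have waited longer than bufferTime; return how many were dropped.
    public int dumpStale(double now) {
        int dumped = 0;
        while (!queue.isEmpty() && now - queue.peek() > bufferTime) {
            queue.poll();
            dumped++;
        }
        return dumped;
    }

    public int size() { return queue.size(); }

    public static void main(String[] args) {
        PublisherQueue q = new PublisherQueue(1.0);  // 1-second window
        q.enqueue(0.0);
        q.enqueue(0.5);
        q.enqueue(1.8);
        int dumped = q.dumpStale(2.0);               // packets from t=0.0 and t=0.5 are stale
        System.out.println(dumped + " dumped, " + q.size() + " still queued"); // 2 dumped, 1 still queued
    }
}
```

With a larger `bufferTime`, fewer packets would be dropped in the same scenario, at the cost of a longer backlog.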
Why You Should Care
If your network can’t keep up, the mobile app is forced to dump packets, which, according to Murphy’s Law, probably contain the best parts of the video, leaving disappointed fans or misinformed employees. The more frequently you dump packets, the less of the video gets across, which can produce a jittery video suffering from artifacts.
Since frequently dumping packets is bad for video quality, why not just set the `buffer_time` to a crazy high number and not worry about it? That way, packets will never get dumped. Well, the problem is latency. If you keep storing packets in the queue and they can’t get out, the backlog keeps growing, which can add seconds (perhaps many) of delay before someone on the other side sees you. You won’t drop packets, but your UX will suffer.
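To make the latency cost concrete, here is a back-of-the-envelope calculation. The bitrates are made up for illustration: if the encoder produces bits faster than the network can send them and nothing is ever dumped, the backlog (and the viewer's delay) grows linearly.

```java
// Rough latency estimate from an ever-growing publisher queue.
// Assumes constant encode and send rates; numbers are illustrative only.
public class BacklogLatency {
    // Added latency (seconds) after `elapsed` seconds of congestion.
    static double addedLatency(double encodeKbps, double sendKbps, double elapsed) {
        double backlogKb = (encodeKbps - sendKbps) * elapsed; // kilobits stuck in the queue
        return backlogKb / sendKbps;                          // time to drain at the send rate
    }

    public static void main(String[] args) {
        // A 1000 kbps encode over a link that can only move 800 kbps:
        // after 10 s the queue holds 2000 kb, i.e. 2.5 s of extra latency.
        System.out.println(addedLatency(1000, 800, 10)); // 2.5
    }
}
```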
As mentioned already, `buffer_time` works differently when you implement it in a subscriber client. In that case, it acts pretty much like what most people think of as video buffering: the amount of time you allow video packets to build up on the server side before they are delivered to the client. When receiving video, the higher the `buffer_time`, the more backlog you allow during network congestion. The lower the number, the less time you allow for that queued video to get through. If the allotted buffer time has passed and packets are still lined up on the server, the client tells the server to dump them and send newer video packets.
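The subscriber-side decision described above can be sketched as a single check: has the oldest undelivered packet waited longer than the allotted buffer time? The class and method names below are invented for illustration; this is not the real Red5 Pro client/server protocol.

```java
// Toy model of the subscriber-side buffering decision.
// If the head of the server's queue has waited longer than bufferTime,
// the client asks the server to dump it and jump to newer media.
// Names are placeholders, not the actual Red5 Pro API.
public class SubscriberBuffer {
    private final double bufferTime; // seconds of server-side backlog tolerated

    public SubscriberBuffer(double bufferTime) {
        this.bufferTime = bufferTime;
    }

    // oldestPacketAge: how long the head of the server queue has been waiting.
    public boolean shouldRequestDump(double oldestPacketAge) {
        return oldestPacketAge > bufferTime;
    }

    public static void main(String[] args) {
        SubscriberBuffer low  = new SubscriberBuffer(0.5); // low latency, choppier under congestion
        SubscriberBuffer high = new SubscriberBuffer(3.0); // smoother, but more delayed
        System.out.println(low.shouldRequestDump(1.2));    // true  -> skip ahead to newer video
        System.out.println(high.shouldRequestDump(1.2));   // false -> keep waiting
    }
}
```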
Why You Should Care
Once again, as with your publisher client, the buffer value you set on the subscriber controls how choppy the video gets when network conditions don’t allow it to stay in real time. Conversely, a high buffer time will likely give you smoother video, but it will add to your latency.
Now, all that said, we’ve done a lot of work behind the scenes to drop packets intelligently and keep the experience looking as smooth as possible. Nonetheless, there may be times when the network gets so bad that it starts to feel like you’re watching frame-by-frame flash cards rather than true video. For that situation, Red5 Pro features built-in logic that allows the server to stop sending video altogether if the client can’t keep up with the set buffer time. This creates an audio-only stream, and you will get an event letting you know this happened.
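In your client, you would listen for that notification and adjust the UI, for example by showing an audio-only indicator. The event and listener names below are placeholders invented for this sketch; check the SDK documentation for the actual event your version emits.

```java
// Hypothetical sketch of reacting to an "audio-only" notification.
// Event names here are invented placeholders, not the real Red5 Pro events.
public class StreamEventHandler {
    enum StreamEvent { VIDEO_ACTIVE, AUDIO_ONLY } // placeholder names

    static String handle(StreamEvent event) {
        switch (event) {
            case AUDIO_ONLY:
                // The server stopped sending video because the client fell
                // behind the configured buffer time; update the UI.
                return "Showing audio-only indicator";
            default:
                return "Rendering video";
        }
    }

    public static void main(String[] args) {
        System.out.println(handle(StreamEvent.AUDIO_ONLY)); // Showing audio-only indicator
    }
}
```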
Summing it All Up
As you can see, the `buffer_time` property can really change the video streaming experience for your users. In the end, it’s a balancing act between latency and quality. The higher you set your video quality, the more bandwidth becomes a controlling factor. In those cases, you might ensure your customers are in good bandwidth conditions before streaming by sending a notification when their bandwidth falls below a certain threshold. An additional option is to set higher buffer times. Of course, the trade-off is that latency can grow increasingly higher if you let it. For some use cases that might not be a concern, but many others may find it problematic.
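The pre-stream bandwidth check suggested above could look something like the following toy gate. The threshold value is made up for illustration, and the class name is invented; how you actually measure bandwidth is up to your app.

```java
// Toy pre-stream bandwidth gate: warn the user when measured bandwidth
// falls below what the chosen video quality needs. Values are illustrative.
public class BandwidthGate {
    static boolean okToStream(double measuredKbps, double requiredKbps) {
        return measuredKbps >= requiredKbps;
    }

    public static void main(String[] args) {
        double required = 1500; // e.g. a 720p target bitrate (illustrative)
        System.out.println(okToStream(2000, required)); // true  -> start streaming
        System.out.println(okToStream(900, required));  // false -> notify the user
    }
}
```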