
Developing a new media platform: delivering video content to users

Hello to all!

With this article we want to open a series of materials about the development of a service that belongs to the class of new media. The service is a large family of applications that includes tools for distributing and playing video content on different platforms, second-screen applications, and many other interactive products designed to extend what online broadcast viewers can do.

The topic is quite broad, so we decided to begin the story of developing a new media service with one of its basic stages: delivering video content to users in live mode. This article describes the overall solution architecture.
Let us note right away that the solution described below (like the story itself) makes no claim to novelty or genius, but the topic is quite relevant and the development is still in progress, so an outside view of the problem would be very useful to us.

Task

Any mass event covered by several cameras is usually served to the viewer as the director's cut: the editing directors splice together a single, take-it-or-leave-it video sequence, which is what the viewer receives.
Our task was to develop a service that lets the viewer choose which camera to watch the event from during the webcast, and that also makes it possible to replay the key moments of the event.

Figure 1. The overall solution architecture

We have the following scheme:
The master server monitors the state of the broadcast sources and distributes streams among the slave servers. The slave servers, in turn, process the data streams and send the result to the Storage specified in the configuration received from the master server.
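The distribution of streams among slave servers can be sketched roughly as follows. This is a minimal illustrative model, not the authors' actual code; the class and function names are our own, and we assume a simple "least-loaded server first" policy:

```python
from dataclasses import dataclass, field

@dataclass
class SlaveServer:
    """A processing node that receives streams from the master (hypothetical model)."""
    name: str
    assigned: list = field(default_factory=list)

def assign_streams(streams, slaves):
    """Assign each live stream to the slave currently holding the fewest streams."""
    for stream in streams:
        target = min(slaves, key=lambda s: len(s.assigned))
        target.assigned.append(stream)
    return {s.name: s.assigned for s in slaves}

slaves = [SlaveServer("slave-1"), SlaveServer("slave-2")]
plan = assign_streams(["cam-1", "cam-2", "cam-3"], slaves)
```

In this sketch, three camera streams end up spread across two slaves; a real master would also react to slaves joining, leaving, or failing.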

Storage caches the data of the current broadcast and also persists the stream to the file system. The master server's configuration makes it possible to run the Storage servers in different modes: with or without data replication, for different data types, and so on.
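A configuration in this spirit might look like the sketch below. The key names and structure are entirely hypothetical (the article does not show the actual format); the point is only that the master's config describes, per Storage node, its replication mode and the kind of data it holds:

```python
# Hypothetical master-server configuration for Storage nodes.
# Key names ("mode", "replicate_to", "data") are illustrative assumptions.
storage_config = {
    "storage-1": {"mode": "replica", "replicate_to": ["storage-2"], "data": "segments"},
    "storage-2": {"mode": "replica", "replicate_to": ["storage-1"], "data": "segments"},
    "storage-3": {"mode": "standalone", "replicate_to": [], "data": "metadata"},
}

def replication_peers(config, node):
    """Return the nodes a given Storage instance replicates its data to."""
    return config[node]["replicate_to"]
```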

Figure 2. How clients receive the broadcast

A load balancer is used to distribute the load and serves as the client's entry point into the system. It provides client access to the servers and also filters out redundant or expired requests. The first client request always goes to the balancer which, depending on its settings, either redirects the client to the authorization server or binds it to a video broadcast server. As the number of users grows, the number of balancers can be increased. For load distribution we use separate instances for downloading historical fragments and fragments of the online broadcast.
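The balancer's decision logic described above can be sketched like this. The field names, the TTL value, and the return labels are our own assumptions for illustration; the real balancer's request format is not shown in the article:

```python
import time

SESSION_TTL = 60  # seconds before a request is considered expired (illustrative)

def route_request(request, now=None):
    """Route one client request: drop expired ones, send unauthenticated
    clients to the authorization server, bind the rest to a broadcast server."""
    now = now if now is not None else time.time()
    if now - request["issued_at"] > SESSION_TTL:
        return "reject"           # expired request is filtered out
    if not request.get("token"):
        return "auth-server"      # first redirect: authorization
    return "broadcast-server"     # bind client to a video broadcast server
```

A production balancer would also pick a concrete broadcast server (e.g. by current load) rather than a single label.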

Figure 3. Caching

During an online broadcast, the server caches stream fragments before storing them in Storage. When clients request them, it distributes the fragments asynchronously.
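A minimal sketch of such a fragment cache, assuming a fixed-size in-memory window of recent fragments that are flushed to Storage as they age out (class and method names are hypothetical):

```python
from collections import OrderedDict

class FragmentCache:
    """Keeps the most recent live fragments in memory so that concurrent
    client requests are served without touching Storage (illustrative sketch)."""

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.fragments = OrderedDict()  # sequence number -> fragment bytes
        self.flushed = []               # sequence numbers persisted to Storage

    def put(self, seq, data):
        """Add a freshly produced fragment; evict and persist the oldest one."""
        self.fragments[seq] = data
        if len(self.fragments) > self.capacity:
            old_seq, _old_data = self.fragments.popitem(last=False)
            self.flushed.append(old_seq)  # stand-in for a write to Storage/FS

    def get(self, seq):
        """Serve a fragment from cache; None means 'go ask Storage'."""
        return self.fragments.get(seq)
```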

Video fragments are buffered on the client using two download queues. The first queue loads fragments of the active stream. Once the number of fragments optimal for smooth playback has been loaded, control passes to the second queue, which starts loading fragments from the other cameras for the same timeline segment. This avoids playback delays when switching cameras: at the moment of switching, the buffer of the selected camera is moved into the main queue.
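The two-queue scheme above can be sketched as follows. This is a hypothetical model of the client buffer, not the actual client code; the buffer-target value and all names are assumptions:

```python
from collections import deque

BUFFER_TARGET = 3  # fragments considered enough for smooth playback (illustrative)

class DoubleBuffer:
    """Client-side buffering: a primary queue for the active camera, plus
    per-camera secondary queues pre-loading the same timeline segment, so
    that switching cameras is instant (hypothetical sketch)."""

    def __init__(self, cameras, active):
        self.active = active
        self.primary = deque()  # fragments of the active stream
        self.secondary = {c: deque() for c in cameras if c != active}

    def on_fragment(self, camera, fragment):
        """Route a downloaded fragment into the right queue."""
        if camera == self.active:
            self.primary.append(fragment)
        else:
            self.secondary[camera].append(fragment)

    def ready_for_preload(self):
        """True once the active stream is buffered enough to start pre-loading
        the other cameras for this timeline segment."""
        return len(self.primary) >= BUFFER_TARGET

    def switch(self, camera):
        """Promote the selected camera's pre-loaded buffer to the main queue."""
        self.secondary[self.active] = self.primary
        self.primary = self.secondary.pop(camera)
        self.active = camera
```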

Instead of a conclusion

In this article we have tried to outline, briefly and schematically, our approach to designing the part of the service that delivers real-time video content to the broadcast viewer. In the following articles we will describe what we ended up with and discuss in more detail how this part interacts with the other parts of our large new media service (in particular, with its client side).

Thanks for your attention :)

Source: https://habr.com/ru/post/242127/
