
Why we did VOD over WebRTC


VOD stands for video on demand, i.e. playing conventional clips, as is done on YouTube or another streaming service. WebRTC is low-latency real-time video. You may ask: how can these two things possibly be related? Read on for the details.

Support and bugfix


It all started, as usual, with a support request. A developer contacted us, apparently from an outsourcing company in India that was working on a mobile telemedicine application. One of the client's requirements was recording a WebRTC video chat from an iOS application, followed by playback in the same iOS application. The recording itself worked, but when played back with standard iOS SDK tools, green artifacts appeared. More precisely, not even artifacts, but quite distinct rectangular green areas covering part of the screen. Obviously, this was no good, and we began to study the problem.


Playback of recorded video from WebRTC via AVPlayerViewController
In the mobile application, we used the standard playback components MPMoviePlayer and AVPlayerViewController, which can play an mp4 movie given its HTTP URL, for example http://host/sample.mp4. These components played ordinary MP4 videos normally, but with broadcasts recorded from the iOS application the green areas did not go away and spoiled everything.

WebRTC dynamically changes the resolution of the stream.


It turned out that the green artifacts in the recording appeared because changing the video resolution on the fly is perfectly normal behavior for WebRTC. As a result, the recorded mp4 file contains frames of different sizes: first a sequence at 640x480, then another at 320x240, and so on. VLC, for example, plays such a file without any artifacts, but the built-in iOS components for HTTP video playback produce green artifacts when the resolution changes within the bitstream.

Let's start a broadcast from the iOS application and make sure this is true. For testing, you can use our mobile application Two Way Streaming, built with the iOS SDK, together with the WCS5-EU demo server.

Here is the application publishing a stream to the server:


And this is how the resolution of the video sent from the mobile application changes over time:


The monitoring graph of the video stream shows that the picture resolution changes dynamically depending on the mobile device's ability to compress the video, network conditions, and so on. WebRTC raises and lowers the resolution in order to keep latency low.

WebRTC VOD as a solution


The way out of this situation would be rescaling with transcoding, i.e. decoding the frames, scaling them to a single resolution such as 640x480, and recording at that resolution. But if you do this for every stream published to the server, CPU resources run out quickly, at around 10-20 video streams. Therefore, we needed a solution that did not involve transcoding.

We reasoned: if WebRTC streams video with such resolution changes, it should also be able to play back video recorded this way. It follows that if we read the mp4 file and feed it over WebRTC to a browser or mobile application, everything should be fine, and the green rectangles on the iOS application's screen should disappear.

It remained to implement reading the recorded mp4 file and forwarding it to the Web Call Server engine for further conversion to WebRTC. The first tests showed good results: the green rectangles disappeared.

As a result, we got VOD with playback not only via WebRTC, but over all supported protocols and technologies: RTMP, RTMFP, RTSP, HTML5 Canvas, Media Source, HLS, and WebRTC.

WebRTC VOD - Live Broadcast


Then the question arose: "What if users want to watch the video as a live stream, all at once and at the same time?"

As a result, we ended up implementing two types of VOD.

The first is personal VOD. For each user who wants to play the video, a separate channel is created, through which the video is played from the very beginning.

The second is live VOD. If one user has started playing the video and a second user connects later, both watch the video as a live broadcast in real time: the server plays the file as exactly one stream, to which both users are connected, so they can simultaneously watch, for example, a football match and comment on the players' actions.
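The difference between the two modes can be sketched in a few lines. This is a hypothetical illustration, not Web Call Server's actual internals: the function `openVodChannel` and the channel objects are assumptions made for this example, but the behavior mirrors the description above — personal VOD gets a fresh channel per viewer, live VOD shares one channel per file.

```javascript
// Hypothetical sketch of handing out VOD channels (not the real WCS internals).
const liveChannels = new Map(); // file -> shared channel, one per file

function openVodChannel(mode, file) {
  if (mode === "vod") {
    // Personal VOD: every viewer gets a fresh channel starting at 0:00.
    return { file, shared: false, startsAtBeginning: true };
  }
  // Live VOD: all viewers attach to one shared channel playing in real time.
  if (!liveChannels.has(file)) {
    liveChannels.set(file, { file, shared: true, viewers: 0 });
  }
  const channel = liveChannels.get(file);
  channel.viewers += 1;
  return channel;
}
```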

In our player and in the API, in order to play a stream, you need to know its name.

For VOD, we introduced the following schemes:

If you want to play the video personally, pass the stream name like this:

vod://sample.mp4 

If you want to make a full online broadcast of the video, the name of the stream will be as follows:

vod-live://sample.mp4 

The sample.mp4 file itself must be located in the WCS_HOME/media folder on the server and be in MP4 format (H.264 + AAC).
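The naming scheme can be illustrated with a small parser. This is only a sketch: the function `parseVodName` and its return shape are made up for this example; only the vod:// and vod-live:// prefixes and the server media folder come from the text above.

```javascript
// Sketch of parsing the two VOD stream-name schemes described above.
// The function and its return shape are hypothetical, not part of the WCS API.
function parseVodName(streamName) {
  const match = /^(vod|vod-live):\/\/(.+)$/.exec(streamName);
  if (!match) return null; // not a VOD name: treat as a regular live stream
  return {
    mode: match[1] === "vod" ? "personal" : "live",
    // On the server side, the file is looked up in the WCS media folder
    file: "/usr/local/FlashphonerWebCallServer/media/" + match[2],
  };
}
```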

"What about the iOS application?" you may ask.

All is well. The iOS application plays a VOD movie on WebRTC without green rectangles.

WebRTC VOD in the Web Player


Now let's see how WebRTC VOD looks on the web. To do this, copy the mp4 file to the /usr/local/FlashphonerWebCallServer/media folder on the server. Let it be the well-known Big Buck Bunny, saved as sample.mp4.

Open the page with the test player, specify the stream name vod://sample.mp4, and click Test Now.


The player starts playing the stream via WebRTC. In chrome://webrtc-internals, you can see the playback graphs:


As a result, the story ended well. We fixed the green-screen bug when playing back an mp4 broadcast recording in an iOS application, and implemented the WebRTC VOD function for web browsers and for mobile devices running iOS and Android.

Links


  1. Test iOS Two Way Streaming application for broadcasts
  2. Web player - an example player for playing VOD - WebRTC streams
  3. Web Call Server 5 distributing mp4 videos via WebRTC
  4. WebRTC SDK for iOS - library for developing streaming applications for iOS

Source: https://habr.com/ru/post/337560/

