A couple of weeks ago the new iPhones and iOS 11 were released, an event that was impossible to miss. Along with it came another event, certainly an important one for developers: long-awaited WebRTC support appeared in the Safari browser.
Imagine for a minute: millions of iPhones and iPads around the world became capable of real-time audio and video in the browser. Full-featured browser video chats, live broadcasts with low (sub-second) real-time latency, calls, conferences, and much more became available to iOS and Mac users. It was a long road to this point, and it has finally happened.
How it was
Earlier, we wrote about a way to play video with minimal latency in iOS Safari, and this method is still relevant for iOS 9 and iOS 10, which have no WebRTC support. We proposed an approach codenamed “WSPlayer”, which delivers a live video stream from the server over the Websocket protocol, decodes the stream in JavaScript, and renders the video into an HTML5 Canvas element using WebGL. Here is what it looked like:
This approach allowed, and still allows, playing a stream on an iOS Safari browser page with a delay of about 3 seconds, but it has the following disadvantages:
1. Performance.
The video stream is decoded in JavaScript. This puts a fairly high load on the mobile device's CPU, makes high resolutions impossible to play, and drains the battery.
2. TCP.
The transport protocol used for video and audio is Websocket / TCP. Because of this, the latency cannot be held to a target and may grow depending on network fluctuations.
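The WSPlayer pipeline described in these points (Websocket delivery, JavaScript decoding, Canvas rendering) can be sketched roughly as follows. The endpoint URL and element id are hypothetical, and decodeToRGBA() is a stand-in for the JavaScript video decoder, which in reality is far more involved:

```javascript
// RGBA frame size helper: one decoded frame is width * height * 4 bytes.
function rgbaFrameBytes(width, height) {
  return width * height * 4;
}

// Browser-only part, guarded so the helper above can be checked anywhere.
if (typeof WebSocket !== 'undefined' && typeof document !== 'undefined') {
  var canvas = document.getElementById('player'); // hypothetical element id
  var ctx = canvas.getContext('2d');

  var ws = new WebSocket('wss://example.com/ws'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';

  ws.onmessage = function (event) {
    // decodeToRGBA() stands in for the JavaScript decoder described above:
    // it would turn one encoded frame into raw RGBA pixels.
    var pixels = decodeToRGBA(new Uint8Array(event.data),
                              rgbaFrameBytes(canvas.width, canvas.height));
    ctx.putImageData(new ImageData(pixels, canvas.width, canvas.height), 0, 0);
  };
}
```

The decoding step is exactly where the CPU cost mentioned above comes from: every frame passes through JavaScript before it reaches the canvas.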
All this time, until iOS 11 came out, WSPlayer could play video with relatively low latency (3 seconds) compared to HLS (20 seconds). Now everything has changed for the better: the JavaScript player is replaced by the native WebRTC technology, which does all the work inside the browser itself, without decoding in JavaScript and without using Canvas.
How it is now
With the advent of WebRTC, the low-latency video playback scheme in iOS Safari 11 has become identical to that of the other browsers that already support WebRTC: Chrome, Firefox, and Edge.
Microphone and camera
Above, we wrote only about playing real-time video. However, a video chat cannot be started without a camera and microphone, and this was a major obstacle and a major headache for developers planning to support iOS Safari in a video chat or other live-video web project. Many man-hours were wasted searching for a solution in iOS Safari 9 and 10, when the solution simply did not exist: Safari could not capture the camera and microphone, and this “it's not a bug, it's a feature” behavior was fixed only recently, in iOS 11.
Launch Safari on iOS 11 and request access to the camera and microphone. It is this simple thing that we have been waiting for, and, as you can see, the wait is over:
The browser asks for access to the camera and microphone and can both send live streams and play audio and video.
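In code, this capture request is a standard getUserMedia call. A minimal sketch, assuming a video element with the hypothetical id "preview" on the page (the constraints helper is illustrative too):

```javascript
// Build media constraints: a pure helper, so the requested shape is explicit.
function buildConstraints(width, height) {
  return {
    audio: true,
    video: { width: { ideal: width }, height: { ideal: height } }
  };
}

// Browser-only: request camera and microphone and preview the result locally.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia(buildConstraints(640, 480))
    .then(function (stream) {
      var video = document.getElementById('preview');
      video.srcObject = stream; // attach the captured stream
      return video.play();      // play() returns a Promise in Safari 11
    })
    .catch(function (err) {
      // Access was denied, or the page is not served over HTTPS.
      console.error('getUserMedia failed:', err.name);
    });
}
```

Note that Safari only exposes getUserMedia on pages served over HTTPS, so the permission dialog will not even appear on a plain HTTP page.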
And you can also look at the Safari browser settings and turn on / off the microphone there:
Camera display and streaming video playback
Not without its “features”, though. The main distinctive feature of playback in a video element is that the user must click (tap) the page before playback can start.
For developers, this is a clear limitation and a brake. If a customer demands “I want the video to start playing automatically after the page loads”, this trick will not work in iOS Safari, and the developer will have to explain that Safari and Apple, with their security policy, are to blame.
For users, this can be a blessing, since sites cannot play a stream without the user's knowledge: by tapping a UI element on the page, the user formally confirms the desire to play the video stream.
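In practice this means playback has to be started from a user-gesture handler. A minimal sketch with hypothetical element ids:

```javascript
// Start playback of a <video> element; Safari rejects the play() promise
// with NotAllowedError when the call is not triggered by a user gesture.
function startPlayback(video) {
  var p = video.play();
  if (p && typeof p.catch === 'function') {
    p.catch(function (err) {
      console.error('Playback blocked:', err.name);
    });
  }
  return p;
}

// Wiring playback to a tap on a button satisfies Safari's autoplay policy.
if (typeof document !== 'undefined') {
  document.getElementById('playButton').addEventListener('click', function () {
    startPlayback(document.getElementById('remoteVideo'));
  });
}
```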
What about Mac OS?
There is good news for owners of MacBooks and Mac OS. After the update, WebRTC also works in Safari 11 for Mac. Before this, Mac Safari used the good old Flash Player, which does the same job WebRTC does: captures and plays audio and video over the network via the RTMP and RTMFP protocols. With the advent of WebRTC, the need to use Flash Player for video chats has disappeared. Therefore, for Safari 11+ we use WebRTC, and for Safari 10 and older we continue to use Flash Player or WebRTC plugins as a fallback mechanism.
Current status
As you can see, Safari 11 added support for WebRTC, while Safari 9 and 10 rely on fallbacks: Flash Player and WebRTC plugins on Mac OS, and WSPlayer on iOS.
| iOS 9, 10, Safari | Mac, Safari 10 | Mac, Safari 11 | iOS 11, Safari |
| --- | --- | --- | --- |
| WSPlayer | Flash Player, WebRTC plugins | WebRTC | WebRTC |
We test broadcasting from browser to browser
Now let's check the main cases in practice, starting with the player. First, install the iOS 11.0.2 update, which brings the new Safari.
So, as a first test, Chrome for Windows will broadcast the video stream to the server, and the viewer on iOS Safari will play the video stream via WebRTC.
Open the Two Way Streaming example in the Chrome browser and send a WebRTC video stream named 1ad5 to the server. Chrome captures video from the camera, encodes it with the H.264 codec in this case, and sends the live stream to the server for further distribution. The broadcast looks like this:
To play, enter the stream name, and the player in iOS Safari starts playing the stream that Chrome previously sent to the server. Playing the stream on an iPhone in the Safari browser looks like this:
The delay is imperceptible (less than a second). The video stream plays smoothly, without a hint of artifacts, and the playback quality is decent, as you can see in the screenshots.
And this is how playback looks in the same Two Way Streaming example, in the Play block. Thus, one stream can be published and another played on the same browser page. If users know each other's stream names, we get a simple video chat.
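Using the Web SDK calls described later in this article, that simple video chat boils down to publishing one stream and playing the other on the same page. A sketch, assuming a session object already created with the SDK, stream names exchanged out of band, and hypothetical element ids:

```javascript
// Publish our own stream and play the peer's stream on one page.
// `session` is a Flashphoner Web SDK session; the display elements are
// passed as parameters so the function itself stays environment-independent.
function startChat(session, myName, peerName, localEl, remoteEl) {
  session.createStream({ name: myName, display: localEl }).publish();
  session.createStream({ name: peerName, display: remoteEl }).play();
}

// Browser usage (hypothetical ids and stream names):
if (typeof document !== 'undefined' && typeof session !== 'undefined') {
  startChat(session, 'alice', 'bob',
            document.getElementById('localVideo'),
            document.getElementById('remoteVideo'));
}
```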
We test webcam and microphone broadcasting from iOS Safari
As we wrote above, the main feature of WebRTC is the ability to capture the camera and microphone in the browser and send the stream over the network with controlled low latency. Let's check how this works in iOS Safari 11.
We open in Safari the same streamer demo example that we opened in Chrome and request access to the camera and microphone. Safari displays a dialog asking to allow or deny use of the camera and microphone.
After access to the camera and microphone is allowed, a red camera icon appears in the upper left corner of the browser. This is how Safari shows that the camera is active and in use. Meanwhile, the video stream is being sent to the server.
We pick this stream up in another browser, for example Chrome. On playback we see the stream from Safari in the notorious vertical orientation, all because the device has not been turned to the horizontal position.
After changing the orientation of the iPhone, the stream playback picture becomes normal:
Capturing and streaming video is always technologically more interesting than playback, because this is where the important RTCP feedback happens, which controls latency and video quality.
At the time of writing, we had not found tools for monitoring WebRTC in iOS Safari similar to Chrome's webrtc-internals. Let's see how the server sees the video stream captured from Safari: we turn on monitoring and check the main graphs describing the traffic coming from Safari.
The first set of graphs shows metrics such as NACK and PLI, which are indicators of UDP packet loss. For a normal network the number of NACKs shown in the graphs is insignificant, around 15, so we consider these values within the normal range.
The FPS of the video stream ranges between 29 and 31 and does not sink to low values (10-15). This means that the iPhone's hardware accelerator has enough performance to encode video with the H.264 codec, and there is enough CPU to stream this video to the network. For this test we used an iPhone 6, 16 GB.
The following graphs show how the video resolution and bitrate change. The video bitrate varies in the range of 1.2-1.6 Mbps, while the resolution remains a constant 640x480. This suggests that there is enough bandwidth and Safari encodes the video at the maximum bitrate. If desired, the bitrate can be clamped within desired limits.
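One common way to clamp the WebRTC video bitrate (an assumption here, not a description of the Flashphoner API) is SDP munging: inserting a b=AS bandwidth line into the video section of the session description before it is applied:

```javascript
// Insert "b=AS:<kbps>" after the c= line of the video m-section,
// replacing any bandwidth cap already present there.
function clampVideoBitrate(sdp, maxKbps) {
  var lines = sdp.split('\r\n');
  var inVideo = false;
  var out = [];
  for (var i = 0; i < lines.length; i++) {
    var line = lines[i];
    if (line.indexOf('m=') === 0) {
      inVideo = line.indexOf('m=video') === 0;
    }
    if (inVideo && line.indexOf('b=AS:') === 0) {
      continue; // drop an existing video bandwidth line
    }
    out.push(line);
    if (inVideo && line.indexOf('c=') === 0) {
      out.push('b=AS:' + maxKbps); // cap video bandwidth in kbps
    }
  }
  return out.join('\r\n');
}
```

The modified SDP would then be passed to setLocalDescription or setRemoteDescription; browsers treat b=AS as a hint, so the actual bitrate may still float below the cap.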
Next, we check the bitrate of the stream's audio component and the audio loss statistics. The graph shows that no audio is lost: the loss counter stays at zero. The audio bitrate is 30-34 kbps. This is the Opus codec, with which Safari encodes the audio stream captured from the microphone.
And the last chart is the timecode. It lets us estimate how synchronously the audio and video arrive. If they are out of sync, the desynchronization becomes noticeable: the voice does not keep up with the lips, or the video runs ahead. In this case, the stream from Safari arrives perfectly synchronized and monotonic, without the slightest deviation.
The presented graphs show a picture typical of WebRTC and behavior very similar to that of the Google Chrome browser: NACK and PLI feedback arrives, FPS varies slightly, the bitrate floats. That is, we got the WebRTC we have been waiting for.
Pay attention to how the height and width change. For example, if you turn the device to the horizontal position, the stream resolution swaps, for example from 640x480 to 480x640, as in the graph below.
The orange line on the graph shows the width, and the blue line the height of the image. At 05:21:17 we turn the streaming iPhone to the horizontal position, and the stream resolution swaps exactly: 480 in width and 640 in height.
We test video playback from an IP camera via WebRTC in iOS Safari
An IP camera is most often a small Linux-based server that provides streams over the RTSP protocol. In this test we take video from an IP camera with H.264 support and play it in the iOS Safari browser via WebRTC. To do this, in the player shown above we enter the camera's RTSP address instead of the stream name.
Playing a stream from an IP camera in Safari via WebRTC looks like this:
The video in this case plays back smoothly, without any picture problems. But much still depends on the stream source, that is, on how the video travels from the camera to the server.
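With the Web SDK, the RTSP address simply replaces the stream name in the same play call used for regular streams. A sketch with a hypothetical camera address and element id, assuming a session object already exists:

```javascript
// The server tells an RTSP source apart from a plain stream name by the
// URL scheme; a tiny helper makes that intent explicit.
function isRtspUrl(name) {
  return /^rtsp:\/\//i.test(name);
}

// Hypothetical camera address; with the Web SDK the RTSP URL goes straight
// into the `name` field of createStream.
var cameraUrl = 'rtsp://192.168.1.100:554/h264';
if (typeof session !== 'undefined' && typeof document !== 'undefined'
    && isRtspUrl(cameraUrl)) {
  session.createStream({
    name: cameraUrl,
    display: document.getElementById('myVideo')
  }).play();
}
```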
As a result, we successfully tested the following 3 cases:
- Broadcast from Chrome browser to Safari
- Capturing a camera and microphone from Safari to Chrome
- Playing video from an IP camera in iOS Safari
A little about the code
For broadcasting video streams we use a universal API (Web SDK), in which publishing looks like this:
session.createStream({name:'stream22',display:document.getElementById('myVideo')}).publish();
Here we set a unique stream name, stream22, and use the div element
<div id='myVideo'></div>
to display the captured camera on the web page.
Playing the same video stream in the browser works like this:
session.createStream({name:'stream22',display:document.getElementById('myVideo')}).play();
That is, we again specify the stream name and the div element in which the video should be placed for playback, and then call the play() method.
iOS Safari is currently the only browser in which you need to click a page element to make the video play.
Therefore, specifically for iOS Safari, we added a small piece of code that “activates” the video element before the stream is played:
if (Flashphoner.getMediaProviders()[0] === "WSPlayer") {
    Flashphoner.playFirstSound();
} else if ((Browser.isSafariWebRTC() && Flashphoner.getMediaProviders()[0] === "WebRTC")
        || Flashphoner.getMediaProviders()[0] === "MSE") {
    Flashphoner.playFirstVideo(remoteVideo);
}
In the standard player this code is invoked by clicking the Play button; this satisfies Apple's requirement and starts playback correctly.
In conclusion
iOS 11 Safari has finally received WebRTC support, and this support is unlikely to go away in future updates. So we can confidently use this capability and build real-time video streaming and calls in this browser. Install further iOS 11.x updates and look forward to new fixes and features. Happy streaming!
Links
WCS - the server we used to test broadcasts in iOS 11 Safari
Two Way Streaming - an example streamer
Source Two Way Streaming - streamer source code
Player - an example player
Source Player - player source code
WSPlayer - playing low-latency streams in iOS 9, 10 Safari