This is the story of a video streaming project.

Interesting customer
I had been sitting in front of the monitor for an hour, maybe two. It all started with a link to someone's Twitter that a colleague had kindly dropped me on Skype. Then I accidentally opened a news site, then Facebook, and in the meantime a couple more news items had appeared... In short, my back was already sore and it was time to stretch. The office was cool and the air conditioner quiet. I didn't feel like going out into the street heat, so, straightening up, I walked over to the nearest vending machine. Somewhere at the reception a bell rang.
A couple of minutes later I saw Olga escorting a gentleman of Asian appearance. He looked about fifty; a gray hat with a short brim sat on his slightly wrinkled head. They were obviously coming to see me. Walking up to the coffee machine, which was already splashing out my cappuccino, the gentleman said in broken Russian:
"Hello, I'm here about a WebRTC project. My name is Sukonako," and he held out his hand. What brought this Japanese man here, I wondered, responding to the handshake, and invited the guest to my office. From there on we switched to English, which was easier for both of us.
Gathering requirements
Me: So, how can I help?
S: We have been working in streaming and Flex since 2000, serving a large number of users. We use Adobe Flash Media Server (FMS), and now we would like to use WebRTC.
Me: Can you elaborate on what you would like to achieve with a WebRTC server?
S: We need a regular media server that can receive video streams from one user and relay them to other users. We want a video chat.
Me: No problem, we can build a solution based on one of the WebRTC servers.
S: Adobe FMS suits us completely. We would like to expand the range of our users with WebRTC without removing the FMS. It works well.
A tablet appeared in Sukonako's hands; he pushed it toward me and pointed at the following diagram:

S: The Flex app is a doctor, and the other Flex apps are patients. The doctor uses a webcam and consults several patients at once. Any patient can request a private consultation, and then the doctor is left alone with that patient. At that point, the rest of the patients cannot communicate with the doctor and do not see him.
Strange, I thought. How can the doctor advise several patients simultaneously? One complains that his ear hurts, another about her tonsils, and then the one with the tonsils presses a button and goes into a private consultation. Still, the principle was clear; time to focus on the technical side of the question.

Me: So in the end, two-way video is established between the doctor and the patient?
S: Not quite. The patient always sees the doctor, but the doctor may not see the patient. By default, the patient's video is disabled.
Me: So the video is one-way, from doctor to patient?
S: In most cases, yes. But sometimes patients want to show their video to the doctor. This happens infrequently.
Me: All clear. So both the doctor and the patient can use a WebRTC browser, such as Firefox or Google Chrome, or IE, which will work through the FMS. Right?
S: Almost. All doctors use our Flex streaming application. Patients should use either our application or WebRTC.
Me: So ideally your application should look like this?
And I sketched a diagram.

S: Yes, that's right. It should work exactly that way: on one side the native Flex application, and on the other a WebRTC browser. We are primarily interested in browsers on Android smartphones and iOS devices. You probably know that Flash is present in one form or another in all desktop browsers: IE, Chrome, Firefox, Safari, but it is absent on Android and iOS. We would like to make our service available in mobile browsers and keep what works well on desktops, i.e. FMS.
Me: WebRTC will work in Android browsers, but on iOS there is a problem: WebRTC does not work in iOS browsers due to platform limitations. That is, we can neither deliver a WebRTC video stream to iOS nor stream video from the webcam of an iOS browser.
S: Wait, I know that Safari does not support WebRTC, but it is supported in Google Chrome.
Me: Yes, but not on iOS. There, Chrome runs into the technical limitations of the platform and does not have the same WebRTC capabilities as on the desktop. So an iOS browser is not suitable in this case. Why not publish your own app in the Apple App Store? Then iOS users would only need to install the application and could use the same pure WebRTC that works in Google Chrome.
S: Unfortunately, we cannot publish the app in the App Store for internal reasons. Besides, we would like to give our users (the patients) the opportunity to work directly from the browser, without installing additional applications on their iPhone or iPad. What options do we have?
At this point, I pondered the "internal reasons" that prevented publishing the application in the App Store. Perhaps the area of medical consultations is regulated by law, and it really is not so easy to roll out this kind of application in the App Store.
In reality, there were not many options, and the best of them was a native application with WebRTC support. iOS Safari, as you know, supports HLS (Apple HTTP Live Streaming), but this option was discarded: communication with the patient assumed real-time, live interaction, for which HLS, with its delay of around 20 seconds, is completely unsuitable.
There remained one last option: websockets. Websockets are now supported in almost all browsers; a websocket is essentially a TCP channel through which video can be delivered with a delay comparable to RTMP, that is, about 3 seconds rather than 20. Delivery was therefore clear. It would remain to play this stream in the HTML5 <video/> element, and everything would be great.
Me: It seems there is only one option: websockets. In this case, patients will not be able to send their video to the server; only one-way delivery from doctor to patient is possible. You could try HLS, but with its delay of more than 20 seconds it probably will not work for you.
S: Good. Did I understand correctly that we will be able to play live streams from FMS directly in the iOS Safari browser? Let it be without WebRTC, but with a small delay comparable to RTMP?
Me: Yes, absolutely right. But we need some time to verify it. Let's agree on, say, Monday, and I'll show you a demo.
S: I would like to see FMS integrated with both WebRTC and Websockets at the same time, to be sure that it will work on iOS and on Android. Can that be done?
Me: Yes, I think it will work out.
S: Thank you for the consultation. On Monday I'll come at 10, if you don't mind, and we'll discuss all the remaining questions with the demo already in hand.
Me: Yes, of course, everything will be ready by then.

Searching for a solution
As you can see from the conversation, the requirements had shifted a bit: now two delivery methods had to be bolted onto Adobe FMS, WebRTC for browsers on Android and Websockets for the Safari browser on iOS. It remained to find the missing element that would tie together all the technologies and protocols involved and make this demo possible.

After seeing my Japanese guest off, I first went to study the Adobe FMS documentation. I found a lot of interesting things in it, but not the words WebRTC or Websocket.
Then, without further hesitation, I typed three useful keywords into Google: rtmp, webrtc and websockets. Google spat out several relevant sites, of which, as it turned out, only two were suitable: the proprietary Flashphoner project and the description of an open-source prototype on the Phoboslab website.

First candidate
I decided to start with Phoboslab, which described the problem of stream playback in iOS Safari in vivid detail and offered a solution built entirely on open source.
The solution is built on ffmpeg, node.js and client-side JavaScript that decodes and plays the video stream. All the components really were open source, and the scheme looked promising.
I spun up a virtual server on DO, compiled ffmpeg and installed node.js. All of this took about two hours.
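For reference, the server part of this scheme is compact. Below is a minimal sketch of the node.js relay, offered under stated assumptions rather than as the exact Phoboslab code: ffmpeg transcodes the source and pushes an MPEG-TS stream to the relay over HTTP (something along the lines of ffmpeg -i <source> -f mpegts -codec:v mpeg1video http://127.0.0.1:8081/), and the relay fans the bytes out to browsers over Websocket using the 'ws' npm package. The port numbers are illustrative.

    var http = require('http');
    var WebSocket = require('ws');

    // Websocket side: browsers connect here to receive the stream
    var wss = new WebSocket.Server({ port: 8082 });

    // HTTP side: ffmpeg pushes the MPEG-TS stream here
    http.createServer(function (req, res) {
        req.on('data', function (chunk) {
            // Fan each incoming chunk out to every connected browser
            wss.clients.forEach(function (client) {
                if (client.readyState === WebSocket.OPEN) {
                    client.send(chunk);
                }
            });
        });
    }).listen(8081);

The point of such a relay is that it does no processing at all: the video is encoded once by ffmpeg, and all the decoding work falls to the browser.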

Video in iOS Safari really did play, and played decently. My iPhone 5 warmed up a little, but the JavaScript steadily pumped the video traffic out of the Websocket and displayed it on the web page.
In fact, the stream was decoded in JavaScript and then rendered into a page element of the iOS Safari browser; a simplified sketch of this client-side scheme follows the list below. The following questions remained open:
- How to pick up the stream from FMS
- How to add sound to the stream
- What to do about WebRTC
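On the client, a simplified form of this scheme looks roughly as follows. This is a sketch, not the project's actual player code: createVideoDecoder is a hypothetical stand-in for its JavaScript MPEG1 decoder, and the URL is illustrative.

    // Receive binary chunks over Websocket and hand them to a JavaScript
    // decoder that paints the decoded frames onto a <canvas> element
    var canvas = document.getElementById('videoCanvas');
    var decoder = createVideoDecoder(canvas); // hypothetical decoder factory

    var ws = new WebSocket('ws://example.com:8082/');
    ws.binaryType = 'arraybuffer';
    ws.onmessage = function (event) {
        // Each message is a piece of the stream; decode it and render
        decoder.decode(new Uint8Array(event.data));
    };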
And here disappointment awaited me. It turned out that the JavaScript player plays (renders) only video. For audio, a separate stream would have to be set up, and the two would then somehow have to be synchronized; this solution did not provide for that. So it was unsuitable for transmitting video from the doctor, due to the lack of sound.
Second candidate
The next test subject was Web Call Server, with its promised support for the RTMP, WebRTC and Websocket protocols. It remained to check whether that support was applicable to my particular case and how it worked.
First of all, I decided to check the conversion of an RTMP video stream into Websocket, by analogy with the previous test. After all, if this could be done, the RTMP stream could then be redirected from the FMS to the Web Call Server, solving one of the tasks.

I armed myself with an iPhone and opened one of the demo pages, where everything could be tried out against a demo server. According to technical support, Web Call Server can be quickly installed on your own Linux system, but that would take some time, whereas the demo made it possible to see right away whether it works or not.
The demo interface was an ordinary Flash application with a plain design and simple functionality, called Flash Streaming.

From this Flash application you can connect to the server via the RTMP protocol and publish a stream from a webcam. To publish means to capture the video stream from the browser's webcam using Flash Player and send the data to the server in real time over RTMP.
Judging by the 'Connected' and 'Publishing' statuses, the connection was successfully established and the stream from the webcam was going to the server. To avoid showing my own face in the stream, I used a virtual camera and an episode from the fifth (?) season of Game of Thrones.
It then remained to see and hear this video on the iPhone in the Safari browser. For this, a separate player called WS Player Minimal was recommended.

The player produced a decent picture and sound, with no distortion or audio/video desync.
It seemed I had made some progress in my research:
- RTMP-to-Websocket delivery of the stream was verified
- The stream carried both sound and video and displayed correctly in the Safari browser
It remained to check that the stream also plays over WebRTC, after which I could move on to integration with Adobe FMS. To test the same stream over WebRTC, I opened the Streamer And Player Minimal demo in the Chrome browser and performed the same simple procedure: entered the stream name and clicked play.
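Out of curiosity, I also looked at what such playback boils down to in code. The sketch below is assembled from the documented examples of the vendor's flashphoner.js Web SDK; the method names, stream name and wss URL here are assumptions worth verifying against the current API docs, not a definitive implementation.

    // Assumes flashphoner.js (the vendor's Web SDK) is loaded on the page
    var SESSION_STATUS = Flashphoner.constants.SESSION_STATUS;
    var STREAM_STATUS = Flashphoner.constants.STREAM_STATUS;

    Flashphoner.init({});

    // Connect to the server and, once the session is established,
    // play the stream that was published earlier over RTMP
    Flashphoner.createSession({ urlServer: 'wss://wcs5-eu.flashphoner.com:8443' })
        .on(SESSION_STATUS.ESTABLISHED, function (session) {
            session.createStream({
                name: 'stream1',                         // name used when publishing
                display: document.getElementById('play') // element to render video into
            }).on(STREAM_STATUS.PLAYING, function (stream) {
                console.log('playing ' + stream.name());
            }).play();
        });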

How great was my joy, Khaleesi!
Now my arsenal included delivery of an RTMP stream via WebRTC to Chrome, and therefore to Android, and via Websocket to iOS Safari. In both cases the picture was quite smooth, came with sound, and was theoretically suitable for deploying a consulting service.
The next question was integration with FMS. The RTMP protocol should, in theory, be the same across implementations, but two things had to be verified: a) whether the FMS can redirect an RTMP stream to Flashphoner, and b) whether Flashphoner will accept that stream the same way it accepted the stream from Flash in the tests above.
Integration with Adobe Media Server
I had to tinker with FMS; its installation and tests took me a couple of hours. The first thing I did was test the FMS with FMLE, to make sure that I had installed and configured it correctly and that RTMP video streams ran through it without obstacles.

The next step was to configure redirection of the RTMP stream to Flashphoner. Here I had to strain my brain a little, arm myself with Adobe's ActionScript documentation and put together the following script, main.asc:
// Address of the Flashphoner (WCS) server that streams are delegated to
var wcsServer = "wcs5-eu.flashphoner.com";
var netConnections = new Object();
var streams = new Object();

// A client connected to FMS: open a matching RTMP connection to WCS
application.onConnect = function (client){
    trace("onConnect "+client.id);
    var nc = new NetConnection();
    var obj = new Object();
    obj.login = "Alice";
    obj.appKey = "flashChatApp";
    nc.connect("rtmp://"+wcsServer+":1935", obj);
    nc.onStatus = function(info){
        trace("onStatus info.code: "+info.code);
        if (info.code == "NetConnection.Connect.Success"){
            trace("connection opened "+wcsServer);
        }
    };
    netConnections[client.id] = nc;
    return true;
};

// The client disconnected: close the paired connection to WCS
application.onDisconnect = function (client){
    trace("onDisconnect "+client.id);
    var nc = netConnections[client.id];
    if (nc){
        nc.close();
        trace("disconnected "+client.id);
    }
};

// A stream was published to FMS: republish it to WCS over the paired connection
application.onPublish = function (client, myStream){
    trace("onPublish "+myStream.name);
    var nc = netConnections[client.id];
    var ns = new NetStream(nc);
    ns.onStatus = function(info){
        if (info.code == "NetStream.Publish.Start"){
            trace("It is now publishing "+myStream.name);
        }
    };
    ns.attach(myStream);
    ns.publish(myStream.name);
    streams[myStream.name] = ns;
    trace("publish stream "+myStream.name+" to: "+wcsServer);
};

// The stream was stopped: unpublish it on WCS and release resources
application.onUnpublish = function (client, myStream){
    trace("onUnpublish "+myStream.name);
    var ns = streams[myStream.name];
    if (ns){
        ns.publish(false);
        trace("unpublished "+ns.name);
    }
};
The script is quite simple: its job is to delegate the connections and video streams arriving at the FMS to the Flashphoner server. For example, when a connection comes in, the onConnect method creates a matching RTMP connection to the Flashphoner server.
When a video stream arrives in onPublish, the same stream is published to Flashphoner.
On disconnect and when streams stop, the corresponding calls are delegated as well, releasing resources. This gave me a bridge over which traffic flows between FMS and Flashphoner for further distribution via WebRTC and Websockets.

To test this arrangement, I took the already familiar Flash Streaming interface. The only difference was that I now had to specify the RTMP address of the FMS server and then rely on the main.asc script to delegate the video stream to Flashphoner. In my case that address was rtmp://my-fms:1935.

During testing I had to fiddle with the server-side main.asc script quite a bit, owing to my ignorance of ActionScript and of server-side programming for FMS, but that is all in the past now, and the current version of the script is available in the listing above. FMS did not let me down and handed the RTMP stream over to its destination, which made it possible to play it successfully in Chrome, and later in Safari.

Installing Web Call Server
As a result, the demo was ready. It remained to install Web Call Server on my own system to avoid failures at the time of the presentation: you never know what might change on the demo server before Monday.
On the developer's site I found installation instructions consisting of five points. I skipped the fifth point, installing SSL certificates, since I did not plan to use WebRTC streaming from the camera and microphone.
- Download Web Call Server 5
- Install using the 'install.sh' script
- Launch using 'service webcallserver start'
- Open the web interface at http://host:9091 and activate your license
The installation went fine. I prudently disabled the firewall on the test server (service iptables stop) to rule out problems with traffic not getting through.
A minute after the server started, I was able to open the web admin interface at http://host:9091, activate the test license and get my own demo server on Ubuntu, similar to the one I had been testing on:

An additional test run confirmed that the scheme works in my environment. With a sense of accomplishment, I wrapped up the work day and set a reminder to re-run the tests on Monday at 9:00 am, before Sukonako's arrival.
How the migration went and how the story ended, I can describe in a second part, if there is interest.
Links to tools used:
- FMS (Flash Media Server), aka AMS (Adobe Media Server) - an RTMP media server.
- DO (Digital Ocean) - virtual server hosting.
- WCS (Flashphoner Web Call Server) - a WebRTC and Websocket media server.
- FMLE (Adobe Flash Media Live Encoder) - a client for testing RTMP connections to a server.
- Phoboslab - an open-source prototype of Websocket streaming to Safari on iOS.