Live streaming stereo video to VR glasses (Oculus Go)
I'll skip the lengthy introduction and get straight to the point.
So, there is a stereo camera that can stream H264 video over various protocols, and there is an Oculus Go headset. How do you watch a live stereo stream from the camera in the VR glasses? Preferably with minimal latency and locally, which rules out YouTube and other RTMP video services.
Looking ahead, here is the result: first, playback of a previously recorded video file from the stereo camera, then playback of the live stream from the StereoPi (MPEG-TS over UDP).
The stereo camera I use is a StereoPi, so the examples below refer to it. Under the hood it is an ordinary Raspberry Pi, just with two cameras, so you can try the described examples on a regular Raspberry Pi if you really want to. You will need to install the StereoPi firmware, though.
My first attempt was to make a regular Android application that plays the stream from the camera full screen, and sideload it onto the Oculus via adb.
After some tinkering with the manifest, the glasses agreed to treat this application as native. It appeared under "Unknown Sources" in the library, launched, and showed everything it was supposed to, but there was a problem: head movements were not tracked at all, the video from the camera was simply pasted across the full screen of the glasses. The stereo effect was there, yes, but it was enough to move my head slightly for my brain to start going haywire, which produced very, very uncomfortable sensations.
In case you want it, here is the application .apk: StereoPi for Oculus Go. The archive also contains adb, so you can try loading it onto the glasses right away. Just run:
adb install StereoPi.apk
After that, go to Library -> Unknown Sources; the application com.virt2real.stereopi should appear there.
Launch it and, if the StereoPi is on the same local network as the glasses, you will immediately see the stereo image from the camera.
But this is still a hack... I would like a proper native Oculus app for watching video, with a fixed screen that doesn't lurch around when you move your head. I'm not ready to learn Unity for the Oculus yet, so I had the idea of trying the video player applications already available in the Oculus store. I usually watch 3D movies in Skybox, so that's what I tried.
Besides the usual playback of media files from internal storage and from network devices, Skybox has an interesting item called "AirScreen". It turns out you can install the Skybox application on a computer running Windows (or macOS), feed video files to it, and then watch those files in the glasses. In other words, the desktop application acts as a video server and the glasses as the client. I could not find the communication protocol documented anywhere, so I had to break out tcpdump.
After some brief digging, it turned out that Skybox uses UDP broadcast messages to find the server on the LAN. The message looks like this:
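If you want to inspect that discovery traffic yourself (besides tcpdump), a few lines of Python are enough. This is a minimal sketch; the port number is a placeholder, not the real Skybox discovery port — find the actual one in your tcpdump capture first:

```python
import socket

def capture_discovery(port, timeout=5.0):
    """Listen for one UDP broadcast datagram on the given port.

    Returns (payload_bytes, sender_address), or None on timeout.
    NOTE: the port is an assumption -- check tcpdump for the real one.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))      # listen on all interfaces
    sock.settimeout(timeout)
    try:
        return sock.recvfrom(4096)
    except socket.timeout:
        return None
    finally:
        sock.close()
```

Answering that datagram from the StereoPi (with whatever reply the Windows server sends, again as seen in tcpdump) is what makes the glasses list it as an AirScreen server.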
And after that, Skybox in the glasses will see our StereoPi. Next comes a series of requests that need answers — the playlist contents, for example.
This is where it gets interesting: in the playlist generated by the Windows application, the cherished abbreviation RTSP turned up. So the server application apparently streams video files over RTSP, which is already suitable for live video — exactly what we need. More precisely, it turned out that "RTSP" appears in the playlist, but the links to the video files themselves are plain HTTP. That is, the server application still serves the files over HTTP, and that does not suit us. At this point I was getting discouraged, but then I thought: why not try putting a link into the playlist in the format VLC usually understands, i.e. rtsp://192.168.1.51:554/h264. And hooray — Skybox began playing the video stream from the RTSP server on the StereoPi. The delay was very large, though, about 20 seconds, so we dig further.

Next we try feeding it an MPEG-TS stream over UDP. Again, VLC usually consumes such a stream via a link like udp://@:3001, so I specified it for Skybox the same way. Then all that remains is to send the MPEG-TS stream to the headset's IP address and the chosen UDP port. GStreamer takes care of that:
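The exact pipeline isn't shown above, so here is a minimal sketch of the sending side on the StereoPi. The headset address 192.168.1.60, the raspivid parameters, and the use of raspivid as the H264 source are all assumptions (adjust to your setup); port 3001 matches the udp://@:3001 link entered in Skybox:

```python
import shlex
import subprocess

HEADSET_IP = "192.168.1.60"   # assumption: the Oculus Go's address on the LAN
UDP_PORT = 3001               # the port from the udp://@:3001 link in Skybox

def build_sender(ip, port):
    """Build the two commands of the pipeline: raspivid captures raw H264,
    gst-launch muxes it into MPEG-TS and sends it over UDP."""
    capture = "raspivid -t 0 -w 1280 -h 720 -fps 30 -b 3000000 -o -"
    stream = (
        "gst-launch-1.0 fdsrc ! h264parse ! mpegtsmux "
        f"! udpsink host={ip} port={port}"
    )
    return shlex.split(capture), shlex.split(stream)

def run_sender(ip=HEADSET_IP, port=UDP_PORT):
    """Launch raspivid and pipe its stdout into the GStreamer pipeline."""
    cap_cmd, gst_cmd = build_sender(ip, port)
    cap = subprocess.Popen(cap_cmd, stdout=subprocess.PIPE)
    return subprocess.Popen(gst_cmd, stdin=cap.stdout)
```

raspivid writes raw H264 to stdout; fdsrc reads it from stdin, h264parse frames the stream, mpegtsmux wraps it into a transport stream, and udpsink ships it to the glasses.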
In Skybox we click the "Live Stream MPEG-TS" playlist item and voilà — the live MPEG-TS stream appears on the big screen in a virtual cinema. The delay is much smaller than with RTSP, about 2-3 seconds, but still far more than in my simple application, which receives a raw H264 stream over UDP (there the delay is typically 100-150 ms at 720p).
Here I hit a dead end; so far I can't reduce the delay any further. Perhaps buffering needs to be disabled in Skybox itself. I'll try writing to the developers — maybe they'll add a "Disable buffering" option :-)
Finally
In general, if for some reason you need to watch a live video stream on an Oculus or other VR glasses (Skybox is available on many platforms), you can try the method described here. I don't know whether it will fly with other stereo cameras, but with the StereoPi it has been tested and works.
Oh yes, I almost forgot. If anyone can help with a native Oculus application (looking roughly the way Skybox does), send me a private message and we'll discuss the details.