
How we integrated INDE into our Android app


Using ready-made libraries frees programmers from the unproductive work of reinventing the wheel, shortens an application's time to market, and has a beneficial effect on its functionality. We can illustrate all of these points with our own example. In this article we describe how we introduced the Intel INDE (Integrated Native Developer Experience) library for working with video on Android into our existing Together Video Camera application, show how the functionality was implemented to fit the specific needs of the app, and compare the complexity of development before and after INDE.


Background

We started making the Together application back in 2012. The project was in a way a corporate hobby, since most of the time we were working on our clients' projects, building various video services and applications. Together was conceived as a simple and convenient video editor with features for organizing a video library (all videos, with easy access). The editing tools had to be as simple as possible, so that users unfamiliar with the concepts of video editors could comfortably create short films in the shortest possible time.

Development before the introduction of INDE

At the time there was almost nothing for working with video on Android, and any moderately intelligent video processing required diving into C code. All of our video processing was built on the open-source ffmpeg. This approach demanded a significant investment of time and developer effort. After launch we learned that our test lab of about a dozen devices was woefully inadequate: users reported hundreds of poorly documented errors from all sorts of platforms. Developing our own client-side video SDK was taking more and more time away from the core product and from improving the user experience. We had to do something.

INDE implementation

The news about the INDE toolkit interested us very much. We heard about it by chance at the Droidcon 2014 conference. The following platform features were the most important to us:

Even these few points covered more than the results of our own development. Since then, our approach to designing the app's functionality has changed radically: now we start not from what we are able to give the user, but from what the user actually wants.
It took us only a few days to move the existing functionality over to the INDE library. For example, to trim a video it is enough to add a segment when configuring the MediaComposer:

private void setTranscodeParameters(MediaComposer composer) throws IOException {
    composer.addSourceFile(inputFile);
    composer.setTargetFile(outputFile);

    configureVideoEncoder(composer, videoWidthOut, videoHeightOut);
    configureAudioEncoder(composer);

    // keep only the [startCutTime, endCutTime] range of the first source file
    MediaFile mediaFile = composer.getSourceFiles().get(0);
    mediaFile.addSegment(new Pair<Long, Long>((long) startCutTime, (long) endCutTime));

    configureRotateEffect(composer);
}

To shrink a video before sending it to the server, it is enough to set the required dimensions in the MediaComposer parameters before starting the transcoding. Moreover, even if the dimensions are not proportional to the original video, it will neatly fit into the specified rectangle:

private void configureVideoEncoder(MediaComposer composer, int outWidth, int outHeight) {
    VideoFormatAndroid videoFormat = new VideoFormatAndroid(videoMimeType, outWidth, outHeight);
    videoFormat.setVideoBitRateInKBytes(videoBitRateInKBytes);
    videoFormat.setVideoFrameRate(videoFrameRate);
    videoFormat.setVideoIFrameInterval(videoIFrameInterval);
    composer.setTargetVideoFormat(videoFormat);
}


Video rotation, which in our view is not a trivial operation because it has to take the source metadata into account, is in INDE terms nothing more than a video effect:

private void configureRotateEffect(MediaComposer composer) {
    if (rotation == 0) return;

    IVideoEffect effect = new RotateEffect(rotation, factory.getEglUtil());
    // an empty (0, 0) segment applies the effect to the whole video
    effect.setSegment(new Pair<Long, Long>(0L, 0L));
    composer.addVideoEffect(effect);
}


That is, if desired, only part of the video can be rotated during transcoding, since an effect is applied over a given time interval. It is hard to imagine where this might be useful, but for some reason the mere possibility makes us happy. A sketch of limiting an effect to a time window is shown below.
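For illustration, here is a minimal sketch (not taken from the app) of restricting an effect to a time window. It reuses the same RotateEffect, setSegment and addVideoEffect calls shown above; we assume the segment boundaries use the same time units as MediaFile.addSegment in our trimming code, and the window values themselves are placeholders.

private void configurePartialRotateEffect(MediaComposer composer, long windowStart, long windowEnd) {
    // hypothetical example: rotate only a window in the middle of the clip
    IVideoEffect effect = new RotateEffect(180, factory.getEglUtil());
    effect.setSegment(new Pair<Long, Long>(windowStart, windowEnd));  // non-empty segment: effect applies only here
    composer.addVideoEffect(effect);
}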

With these few lines of code we moved all of our previous video functionality over to the INDE library. And then we set our sights on something we could not even dream of before: assembling a complete video file of an album right on the user's device.

To explain what is at stake: an album in Together is a collection of the user's videos and photos whose order can easily be adjusted on the editing screen. You can also pick a suitable soundtrack and lay it over the entire video sequence. For especially important moments we kept the option of preserving the original sound of the clip, with the main soundtrack playing quietly in the background. An example can be found here.

The main purpose of albums is to create a video story and quickly share it on social networks. Since the video would have to be uploaded to the network anyway, we decided at an early stage of development to transcode the albums in the cloud. This keeps the transcoding process uniform and saves the resources of the user's device. But then we ran into some problems:

It follows that additionally transcoding the album on the device every time the order of the clips changes would solve all of these problems, and the user would see a version very close to the final film, if not exactly identical. This feature is especially useful when an album is compiled and edited completely offline, when it is simply impossible to go to the site and see the result. Solving a comparable problem with ffmpeg, and above all getting it to work equally well on all devices, would have been extremely laborious. With INDE it took no more than one man-week.

Album Transcoding

Although the MediaComposer initialization process is quite flexible and simple, in our case some preliminary steps were required. In particular, before stitching the album together, all the images have to be converted into video. For this, on the advice of the INDE developers, we bundled a 30-second "empty" video file (80 KB) with the application, overlaid it with JpegSubstitudeEffect and cut it to the required duration. If you need more than a static photo shown for a few seconds, you can inherit from JpegSubstitudeEffect and transform the coordinates of the photo on every frame to get a custom animation. A rough sketch of the conversion step is given below.
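For orientation, here is a rough sketch of this preparation step. It follows the same pattern as setTranscodeParameters above; blankVideoUri stands for the bundled 30-second clip, photoDurationUs is how long the photo should stay on screen, and the JpegSubstitudeEffect constructor arguments are our assumption, not the library's documented signature.

private void setPhotoToVideoParameters(MediaComposer composer, String photoPath,
                                       long photoDurationUs) throws IOException {
    composer.addSourceFile(blankVideoUri);   // the 30-second "empty" clip shipped with the app
    composer.setTargetFile(outputFile);

    configureVideoEncoder(composer, videoWidthOut, videoHeightOut);
    configureAudioEncoder(composer);

    // keep only as much of the blank clip as the photo should be shown
    MediaFile mediaFile = composer.getSourceFiles().get(0);
    mediaFile.addSegment(new Pair<Long, Long>(0L, photoDurationUs));

    // overlay the photo on every frame; the constructor arguments here are assumed
    IVideoEffect photoEffect = new JpegSubstitudeEffect(photoPath, factory.getEglUtil());
    photoEffect.setSegment(new Pair<Long, Long>(0L, 0L));   // (0, 0) = whole clip
    composer.addVideoEffect(photoEffect);
}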

An unpleasant surprise was that it turned out to be impossible to run several photo-to-video conversions in parallel. There is a limit on the number of simultaneous MediaComposer instances, and it differs from device to device; on Samsung devices, for example, we could not launch more than three. Because of this we had to convert the images sequentially, which takes quite a long time; a sketch of the queueing is shown below.
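Since even three parallel instances is too many for some devices, we settled on the safe lower bound of one at a time. A minimal sketch of that queueing, assuming a hypothetical convertPhotoToVideoBlocking helper that builds a MediaComposer as shown above and waits for it to finish:

// only one MediaComposer is alive at any moment: the single limit that is safe on every device
private final ExecutorService conversionQueue = Executors.newSingleThreadExecutor();

private void convertPhotosSequentially(List<String> photoPaths) {
    for (final String photoPath : photoPaths) {
        conversionQueue.submit(new Runnable() {
            @Override
            public void run() {
                // hypothetical helper: runs the MediaComposer and blocks until it completes
                convertPhotoToVideoBlocking(photoPath);
            }
        });
    }
}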

Exactly the same preparatory work has to be done for any video or audio effects, and also if some clip needs to be trimmed before stitching. All of this has to be thought through in advance. Once the preparation is done, the stitching itself is very simple:

@Override
protected void setTranscodeParameters() throws IOException {
    String firstPath = playlist.get(0).localPath;
    Uri firstMediaUri = new Uri(firstPath);

    mediaComposer.addSourceFile(firstMediaUri);
    mediaComposer.setTargetFile(dstMediaPath);

    configureVideoEncoder(mediaComposer, videoWidthOut, videoHeightOut);
    configureAudioEncoder(mediaComposer);

    // add the remaining source files in a cycle
    for (int i = 1; i < playlist.size(); i++) {
        Uri nextUri = new Uri(playlist.get(i).localPath);
        mediaComposer.addSourceFile(nextUri);
    }
}


What could not be achieved

To reproduce on the device the exact result of cloud album transcoding, we would also need to be able to lay a soundtrack over the stitched video. In the current version of the library this is not yet possible because of some problems with SubstituteAudioEffect, which we are actively discussing with the developers. For now the audio track in the application is handled by a separate player that is kept in sync with the video track; a sketch of this workaround is shown below.
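The workaround itself is plain Android: the soundtrack plays in its own MediaPlayer, and we periodically nudge it back to the video's current position. The 100 ms drift threshold below is our own choice, not something prescribed by INDE; this is an illustrative sketch rather than the app's exact code.

// the soundtrack runs in a separate MediaPlayer and is re-synchronized with the
// video whenever the positions drift apart by more than ~100 ms
private MediaPlayer soundtrackPlayer;

private void startSoundtrack(String audioPath) throws IOException {
    soundtrackPlayer = new MediaPlayer();
    soundtrackPlayer.setDataSource(audioPath);
    soundtrackPlayer.prepare();
    soundtrackPlayer.start();
}

private void syncSoundtrackToVideo(int videoPositionMs) {
    int drift = Math.abs(soundtrackPlayer.getCurrentPosition() - videoPositionMs);
    if (drift > 100) {
        soundtrackPlayer.seekTo(videoPositionMs);
    }
}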

As a result, there is one more piece of functionality that we cannot release yet: the original-audio effect for selected videos in the album. It should work like this: several clips in the album play with the soundtrack on top and the clips' own sound muted; then the soundtrack fades down and you can hear, for example, what the people in the video are saying; then the soundtrack comes back at full volume and playback continues. Again, we are waiting for an update so we can play with this functionality.

Performance

Before deciding to transcode video on the device, we faced the main question: how much would it cost users in terms of resources? We ran several tests on the devices available to us, measuring the time each operation took and how much battery it consumed. The results are shown below.
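For reference, here is a sketch of how such a measurement can be taken on Android: wall-clock time from SystemClock and the drop in battery level read from the sticky ACTION_BATTERY_CHANGED broadcast. The harness below is an illustration, not our exact test code.

// measure one transcoding pass: elapsed time plus battery-level drop
// (battery granularity is whole percent, so longer runs give more meaningful numbers)
private int readBatteryPercent(Context context) {
    Intent battery = context.registerReceiver(null,
            new IntentFilter(Intent.ACTION_BATTERY_CHANGED));
    int level = battery.getIntExtra(BatteryManager.EXTRA_LEVEL, -1);
    int scale = battery.getIntExtra(BatteryManager.EXTRA_SCALE, 100);
    return 100 * level / scale;
}

private void measureTranscoding(Context context, Runnable transcodeAndWait) {
    long startMs = SystemClock.elapsedRealtime();
    int batteryBefore = readBatteryPercent(context);

    transcodeAndWait.run();   // blocks until the MediaComposer has finished

    long elapsedMs = SystemClock.elapsedRealtime() - startMs;
    int batteryDrop = batteryBefore - readBatteryPercent(context);
    Log.d("TranscodeStats", "time=" + elapsedMs + " ms, battery drop=" + batteryDrop + "%");
}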



In summary

Video processing on the device before and after the introduction of INDE differs like heaven and earth. Usually, when moving from lower-level tools to higher-level ones, the programmer ends up constrained by whatever the high-level library provides. With INDE this is not the case at all; on the contrary, there are more possibilities, and implementing them takes significantly less time. What remains for the developer is only the most pleasant part: thinking through the architecture, improving the UI and customizing the video and audio effects. No more diving into C code, wrestling with muxers and demuxers, or unexpected results depending on the specific device.

Of course, there are still some things we would like to see improved and fixed, but we must give the Intel developers their due: they are happy to listen to such suggestions and make changes promptly.

Useful sources

The examples in this article are deliberately very short, since complete and well-explained source code is available on the Intel website: software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials

Source: https://habr.com/ru/post/236833/
