
Developing an IR remote control for a camera



After reading the Habré article “Making an IR remote control for a camera”, I wanted to share my experience of developing an IR camera remote as an Android application, from idea to publication.

Idea


After buying a DSLR, you eventually realize you need manual settings, a tripod, and a contactless shutter release, which the manufacturer offers as an extra accessory. But then you remember that your Android phone has an IR transmitter that can fire the shutter remotely, and you head to Google Play. A search for "camera remote" immediately returns plenty of applications, some of them suitable.

You download them, you try them: yes, they work, but the usability and interface leave much to be desired. On top of that, for my camera model none of them offered counting of exposures for night shooting. So I decided to implement my own application, and along the way figure out what Android is and what it is made of.

Testing IR on the phone


Implementing an application that performs a contactless shutter release took no more than an hour. In that time, an Activity with a single button was created and a way was found to send signals to the camera, which looks like this:

 ConsumerIrManager consumerIrManager = (ConsumerIrManager) this.getSystemService(Context.CONSUMER_IR_SERVICE);
 int frequency = 38400;
 int[] pattern = new int[]{ 1, 105, 5, 1, 75, 1095, 20, 60, 20, 140, 15, 2500, 80, 1 };
 consumerIrManager.transmit(frequency, pattern);

What just happened? A “program” was created that does the job of a device Nikon wanted to sell me for $30.

The spec


I thought, “What could be simpler than implementing a single button?”, but it turned out otherwise.

Analyzing the competitors produced a long list of their features and shortcomings. I will list only the main ones.

Features:


Disadvantages:


Application Requirements:


Some of these tasks go beyond the scope of a remote control, but I wanted to try out as many Android features as possible.

Implementation, or “walking through a minefield”


Work with graphics


When developing, I follow the philosophy: “more vector, less raster graphics”. As a result, the whole project needed only two rasterized images: the application icon and the notification icon (not counting the images for Google Play). Everything else is free stock fonts and icon fonts. Thanks to this, the interface looks equally good on most devices, regardless of screen resolution.

To achieve maximum application performance, I usually avoid animation. But its absence made the application's behavior unobvious to the end user, so to achieve a “wow effect”, animation is present. Since interaction with the application revolves around a single button, that button was styled as much as possible, and the shooting progress is displayed around it.

After working with WPF and Silverlight, Android seemed to me to offer a very poor set of graphic primitives for building self-contained vector objects (objects that know how to redraw themselves, without having to be drawn manually on a canvas). It therefore took several iterations to implement the ring-shaped ProgressBar.



An oddity: on some devices, if you call setRotation() on a ClipDrawable that contains other graphic objects with an angle other than 0, 90, 180, ..., those objects become fully visible regardless of ClipDrawable.setLevel().

Working with notifications


I wanted to create a notification once and update the shooting progress inside it. Instead, due to Android limitations, the notification has to be constantly re-created. The countdown to the next shutter release changes quickly enough that this happens often, which made it impossible to use a higher-resolution icon.

Working with hardware


Developing this kind of application in the standard emulator is difficult and time-consuming; in the faster alternatives it is merely difficult. Having a Samsung S4 phablet at hand made working with the device hardware easier. I never managed to get IR support in the emulators, and the microphone lagged badly, after which the sound level behaved no less unpredictably.

After the release, I wanted to work out other, more mysterious ways of “pressing the button” for ordinary mortals. The idea was to release the camera shutter without touching the phone, ideally with the display off.



Demonstration scenario: fix the camera at a distance and place the “switched-off” phone in front of you, after which the camera fires in response to a “magic” gesture. For greater effect, you can turn on the flash. Tested: people watch with surprise and exclaim “Wow!” (the latter depends on how emotional the particular person is).

The first thing that came to mind was to use Air Gestures, first demonstrated on the Samsung S4. Searching the SDK led to disappointment: devices with Android 4.3 Jelly Bean (API level 18) or higher support Gesture, except for the Galaxy S4, due to a hardware issue. So I remembered the accelerometer and, later, the light sensor.

Initially, I implemented triggering when the device moves from a vertical to a horizontal position. Everything worked fine until the phone was disconnected from the computer. When the screen turns off, the accelerometer “falls asleep”: events about the device's change of position in space stop being generated. I could not eliminate this effect either by working in a background service or through the power-saving settings.
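The detection logic itself, separate from the Android SensorEventListener plumbing, can be sketched as a small pure-Java classifier. The class name, thresholds, and axis convention below are my own illustration, not the application's actual code:

```java
/** Classifies a gravity vector (m/s^2) as VERTICAL, HORIZONTAL, or in between. */
public class OrientationDetector {
    public enum Orientation { VERTICAL, HORIZONTAL, UNKNOWN }

    private static final double G = 9.81;        // standard gravity
    private static final double THRESHOLD = 0.8; // fraction of 1g an axis must carry

    private Orientation last = Orientation.UNKNOWN;

    /** Feed one accelerometer sample; returns true on a vertical -> horizontal transition. */
    public boolean onSample(double x, double y, double z) {
        Orientation current;
        if (Math.abs(y) > G * THRESHOLD) {
            current = Orientation.VERTICAL;   // gravity along Y: phone held upright
        } else if (Math.abs(z) > G * THRESHOLD) {
            current = Orientation.HORIZONTAL; // gravity along Z: phone lying flat
        } else {
            current = Orientation.UNKNOWN;    // tilted in between: keep previous state
        }
        boolean triggered = (last == Orientation.VERTICAL && current == Orientation.HORIZONTAL);
        if (current != Orientation.UNKNOWN) {
            last = current;
        }
        return triggered;
    }
}
```

Tracking the previous state gives a one-shot trigger: laying the phone down fires once, and keeping it flat does not re-fire.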

Then I implemented triggering on a sharp change in illumination. This sensor does not “fall asleep” when the display is off, although the number of generated events drops noticeably. It is thanks to this sensor that the scenario described above works.
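The “sharp change in light” idea boils down to comparing consecutive light-sensor readings. A minimal pure-Java sketch of that logic (the class name and the 50% drop threshold are my own assumptions):

```java
/** Fires when illuminance drops sharply, e.g. a hand waved over the light sensor. */
public class LightTrigger {
    private final float dropRatio; // minimum relative drop, e.g. 0.5f = 50%
    private float lastLux = -1;    // -1 means "no reading yet"

    public LightTrigger(float dropRatio) {
        this.dropRatio = dropRatio;
    }

    /** Feed one light-sensor reading (lux); returns true on a sharp drop. */
    public boolean onLuxReading(float lux) {
        boolean triggered = lastLux > 0 && lux < lastLux * (1 - dropRatio);
        lastLux = lux;
        return triggered;
    }
}
```

In the real application this would be fed from a SensorEventListener registered for the light sensor; with the display off, readings simply arrive less often.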

There was also an idea to use the camera to recognize, for example, winks. But I could not find a sane free library for that.

The process in general


In most cases, the Android platform behaves as expected. There were no problems with determining the location, receiving data from a weather-forecast server, implementing the application life cycle, using a background service, or working with the database.

IDE


For my day job, I had to switch from Visual Studio to Eclipse, which infuriates me periodically to this day. But for this application I installed Android Studio, hoping that the “Studio” in the name meant something. From the first methods and variables I wrote, I realized this was what I had been waiting for: spell checking in code, smart autocompletion, a huge number of hints, and much more. Thanks to the developers for a quality product!

Publication


I created an account, paid the fee, read the recommendations, set up a release build, created a signed application, prepared a description in several languages (EN, RU, UA, NO), took screenshots of the application, and published it as a beta.

A surprise was that the 1024x500 image for Google Play turned out to be mandatory, even though articles had described it as something that merely “wouldn't hurt”.

When building the release version, ProGuard was used, and it broke the part of the code that relied on class names, which ProGuard had successfully obfuscated. The problem was not detected immediately, since a release build runs with the “debuggable” parameter set to “false”. The errors eventually surfaced, were handled, and everything worked again.

Since the application targets a micro-market, I had not planned to write an article on Habr, were it not for the problems with IR. At first I posted on photography forums, but all that earned me was a ban. YouTube helped the most. Then the first real users appeared, and with them the problems.

Details of the ordeal of working with the infrared port


Returning to the code to send pulses:

 ConsumerIrManager consumerIrManager = (ConsumerIrManager) this.getSystemService(Context.CONSUMER_IR_SERVICE);
 int frequency = 38400;
 int[] pattern = new int[]{ 1, 105, 5, 1, 75, 1095, 20, 60, 20, 140, 15, 2500, 80, 1 };
 consumerIrManager.transmit(frequency, pattern);

Everything was fine until it turned out that IR support appeared in Android only in API level 19. What happened before that? Quite a few devices with IR had been released, and to control them, manufacturers either created their own libraries (the HTC OpenSense IR API and the LG QRemote IR SDK) or published sample code for working with their IR (Samsung). What happened after? New versions were released, and the signal format standardized in API 19 changed.

I managed to get the application to work with the IR ports on Samsungs running Android KitKat and Lollipop. After many attempts at remote testing with an HTC One M8, I realized that implementing support for other devices on my own would be very difficult. So I created a project on GitHub that anyone can download: AndroidInfraRed.

A bit of boring theory


For remote control of a device over IR you need a transmitter and the sequence of signals to send. In my case, the device is a Nikon D7100 and the transmitter is a Samsung S4.

The sequence is defined by two parameters: the frequency (in hertz) and the pattern, which is a timed sequence of the presence and absence of a carrier signal at that frequency.

Example (for Samsung devices with Android version> = 4.4.3)

Example (for Samsung devices with Android version <= 4.4.2)




Details about the project


The AndroidInfraRed project created in Android Studio includes:


The library consists of:


How it works


Description of lines of code from MainActivity.java

1. Choose one of the three logging methods


 // Log messages print to EditText
 EditText console = (EditText) this.findViewById(R.id.console);
 log = new LogToEditText(console, TAG);
 log.log("Hello, world!");

 // Log messages print with Log.d(), Log.w(), Log.e()
 // LogToConsole log = new LogToConsole(TAG);

 // Turn off log
 // LogToAir log = new LogToAir(TAG);

By default, output to the text view (EditText) is selected.

2. Create an InfraRed object


 infraRed = new InfraRed(this, log); 

To do this, you need to pass in a Context and a Logger.

3. Initialize the transmitter


 TransmitterType transmitterType = infraRed.detect();
 infraRed.createTransmitter(transmitterType);

The detect() function returns one of the values of the TransmitterType enumeration: Actual, HTC, LG, or Undefined. For Actual, the ConsumerIrManager class provided by Android is used to transmit signals; for HTC or LG, the corresponding SDK is used; Undefined gets an “empty” implementation.
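The idea behind such a selection can be illustrated with a small pure-Java dispatcher. This is only an illustration of the decision logic; the real library detects support at runtime, and the class and method names here are my own:

```java
/** Illustration only: picks a transmitter strategy from the API level and manufacturer. */
public class TransmitterChooser {
    public enum TransmitterType { ACTUAL, HTC, LG, UNDEFINED }

    public static TransmitterType choose(int sdkInt, String manufacturer) {
        if (sdkInt >= 19) {
            return TransmitterType.ACTUAL;    // ConsumerIrManager exists since API 19
        }
        String m = manufacturer.toLowerCase();
        if (m.contains("htc")) {
            return TransmitterType.HTC;       // fall back to the HTC OpenSense IR API
        }
        if (m.contains("lg")) {
            return TransmitterType.LG;        // fall back to the LG QRemote IR SDK
        }
        return TransmitterType.UNDEFINED;     // no known way to drive the IR port
    }
}
```

An Undefined result maps to a no-op transmitter, so calling code never has to null-check.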

4. Initialize the signal sequences


 PatternConverter patternConverter = new PatternConverter(log);
 List<TransmitInfo> patterns = new ArrayList<>();
 // Nikon D7100 v.1
 patterns.add(patternConverter.createTransmitInfo(38400, 1, 105, 5, 1, 75, 1095, 20, 60, 20, 140, 15, 2500, 80, 1));
 // Nikon D7100 v.2
 patterns.add(patternConverter.createTransmitInfo(38400, 77, 1069, 16, 61, 16, 137, 16, 2427, 77, 1069, 16, 61, 16, 137, 16));

Using the PatternConverter class, we initialize the signals for the devices we want to control.

Important! The createTransmitInfo(int frequency, int... pulseCountPattern) function takes the signal as a sequence of cycle counts (as in Android <= 4.4.2), not microseconds (as in Android >= 4.4.3). To go from microseconds to cycles, multiply each element of the pattern by the frequency and divide by one million (the number of microseconds in a second). This is the inverse of the conversion applied to the given sequences on Android >= 4.4.3.
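The conversion described above can be sketched as a pair of pure-Java helpers (the class and method names are mine, not the library's; note that integer division truncates, so a round trip may lose a cycle):

```java
/** Converts IR patterns between microseconds and carrier-cycle counts. */
public class IrPatternUnits {

    /** microseconds -> cycle counts: cycles = us * frequency / 1_000_000 */
    public static int[] microsecondsToCycles(int frequencyHz, int[] microseconds) {
        int[] cycles = new int[microseconds.length];
        for (int i = 0; i < microseconds.length; i++) {
            cycles[i] = (int) ((long) microseconds[i] * frequencyHz / 1_000_000L);
        }
        return cycles;
    }

    /** cycle counts -> microseconds: us = cycles * 1_000_000 / frequency */
    public static int[] cyclesToMicroseconds(int frequencyHz, int[] cycles) {
        int[] microseconds = new int[cycles.length];
        for (int i = 0; i < cycles.length; i++) {
            microseconds[i] = (int) ((long) cycles[i] * 1_000_000L / frequencyHz);
        }
        return microseconds;
    }
}
```

For example, at 38400 Hz one carrier cycle lasts about 26 µs, so the 77-cycle element of the Nikon pattern corresponds to roughly 2005 µs.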

5. Transmit a signal sequence to the device


 TransmitInfo transmitInfo = patterns.get(random.nextInt(patterns.size()));
 infraRed.transmit(transmitInfo);

In this particular example, we send the defined signals in random order.

Comments


The AndroidInfraRed project is an almost complete solution for controlling devices over IR. The only thing missing is testing and tuning on HTC, LG, Sony... I really hope there are people on Habré with IR-equipped smartphones who will help this project, and people who will later want to add IR support to their own applications.

Result




The result is an application that covers all my needs for remote control of the camera and of the photographing process in general. It is a pity that only Samsung owners can appreciate its benefits.

Source: https://habr.com/ru/post/257947/

