
Android screen stabilization


Have you ever tried to read a book or an article like this one on a bus or while walking down the street? I bet you have! And you have probably noticed that reading text this way is not the best idea because of the constant shaking. Screen shake seems to be quite a serious problem, and eliminating it could noticeably improve UX. My idea is to use the acceleration sensors to compensate for shaking, much the same way SLR cameras stabilize their sensors or lenses. Technically this is possible, so why not try it myself!

Existing solutions


First, let's look at existing solutions. There are several interesting articles on the web on the same subject.

  1. NoShake: Content Stabilization for Shaking Screens of Mobile Devices by Lin Zhong, Ahmad Rahmati, and Clayton Shepard, on iPhone screen stabilization, published in 2009. The article concludes that screen stabilization works and gives noticeable results, but the algorithm consumes "an average of 30% power on a 620 MHz ARM processor". This makes the implementation impractical for real use. And although modern iPhones could easily cope with such a task, the authors provided neither the source code nor a built application, so there is no way to try it in action.
  2. Walking with your Smartphone: Stabilizing Screen Content by Kevin Jeisy. This article was published in 2014 and has a good mathematical foundation. It concludes that "using the hidden Markov model we got a good stabilization in theory". Unfortunately, neither the source code nor a built application is provided, so there is nothing to try out.

  3. Shake-Free Screen. The same question is investigated, but there are no ready-made results to try.

These articles give a good introduction to the topic, but unfortunately none of them provides source code or a compiled application to see the result live. Let's try to reinvent the wheel and implement screen stabilization in our own way.

Theory


The acceleration sensor can be used to determine the movement of the device. But judging by the name, this sensor is designed to measure acceleration. To answer the question of how to determine movement from acceleration, let's look at the device and its sensor axes:
[Image: the three sensor axes (X, Y, Z) of the device]
As you can see, there are three axes, so the sensor outputs three values. Technically, it consists of three sensors located along the different axes, but let's treat it as a single whole.

The three output values indicate the acceleration along the corresponding axes:

[Image: sensor output at rest, showing acceleration along the Y axis]

Acceleration is measured in m/s². As you can see, there is some acceleration along the Y axis. This is actually the acceleration due to gravity, and any rotation of the device changes all three values:

[Image: sensor output while the device is rotated]

You can picture it as a ball tied to the device with a rope. This is a pretty good analogy, because if you replace the ball with an arrow, you get the acceleration vector.

OK, but how do we determine displacement?


I cannot show a static picture that illustrates this, but if you move the device a little, the vector will change: it will now consist of two components: 1) the gravity vector, as before; 2) the acceleration vector caused by the device moving along the respective axes. What interests us most is the "pure" displacement vector. It is easy enough to obtain by subtracting the gravity vector from the resulting vector, but how do we determine the true gravity vector? This task can be solved in different ways, but fortunately Android has a special linear acceleration sensor that does exactly what we need. Under normal conditions its output values are 0, and only by moving the device do you get non-zero values. Here is its source code if you are interested. We are one step closer to determining the movement of the device. Let's start programming.
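If you are curious how such a sensor can be emulated, here is a minimal sketch of the usual approach: estimate gravity with a low-pass filter over the raw accelerometer values and subtract it. ALPHA here is a hypothetical smoothing factor, not a value taken from the Android sources:

    // Sketch: isolate gravity with a low-pass filter, then subtract it
    // from the raw values to get the "pure" motion vector.
    private static final float ALPHA = 0.8f;
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    private void removeGravity(float[] raw) {
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * raw[i];
            linear[i] = raw[i] - gravity[i];
        }
    }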

Implementation


To figure out how to calculate the movement of the device, let's develop a simple application with a single activity. The application will monitor changes in acceleration and move a special view element accordingly. It will also show the raw acceleration values on a graph:

[Image: screenshot of the demo application]

I will show only the key pieces of code; the complete code is in the GIT repository. The key things are:

1. A special element that we will move. It is a blue block with text, inside a container:


<FrameLayout
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_above="@id/graph1"
    android:background="@drawable/dots_repeat_bg"
    android:clipChildren="false">

    <LinearLayout
        android:id="@+id/layout_sensor"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_margin="20dp"
        android:orientation="vertical"
        android:background="#5050FF">

        <ImageView
            android:id="@+id/img_test"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:src="@mipmap/ic_launcher"/>

        <TextView
            android:id="@+id/txt_test"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_below="@id/img_test"
            android:textSize="15sp"
            android:text="@string/test"/>

    </LinearLayout>

</FrameLayout>

To move layout_sensor, we will use the View.setTranslationX and View.setTranslationY methods.
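Mapping the computed position onto the view might then look like this (a sketch; the sign convention is my assumption: the content shifts opposite to the device movement so that it visually stays in place):

    // position[] is filled in by the sensor listener shown below.
    layoutSensor.setTranslationX(-position[0]);
    // Screen Y grows downward, while the sensor Y axis grows upward.
    layoutSensor.setTranslationY(position[1]);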

We will also subscribe to click events on any element to reset the internal values to 0, because at first they can misbehave quite badly:

private void reset() {
    position[0] = position[1] = position[2] = 0;
    velocity[0] = velocity[1] = velocity[2] = 0;
    timestamp = 0;
    layoutSensor.setTranslationX(0);
    layoutSensor.setTranslationY(0);
}

2. Subscribe to acceleration sensor events:


sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
sensorManager.registerListener(sensorEventListener, accelerometer, SensorManager.SENSOR_DELAY_FASTEST);

3. And most importantly, the change listener. Its basic implementation is:


private final float[] velocity = new float[3];
private final float[] position = new float[3];
private long timestamp = 0;

private final SensorEventListener sensorEventListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (timestamp != 0) {
            float dt = (event.timestamp - timestamp) * Constants.NS2S;
            for (int index = 0; index < 3; ++index) {
                velocity[index] += event.values[index] * dt;
                position[index] += velocity[index] * dt * 10000;
            }
        } else {
            velocity[0] = velocity[1] = velocity[2] = 0f;
            position[0] = position[1] = position[2] = 0f;
        }
        // Remember the time of this event for the next dt calculation.
        timestamp = event.timestamp;
    }
};

Let's see what's going on here. The onSensorChanged method is called each time the acceleration value changes (translator's note: actually, it is called on a timer, regardless of whether the acceleration values change). First, we check whether the timestamp variable is initialized; if not, we simply initialize the main variables. If the method is called again, we perform calculations using the following formula:

deltaT = time() - lastTime;
velocity += acceleration * deltaT;
position += velocity * deltaT;
lastTime = time();

You may have noticed the curious constant 10000. Think of it as a magic number: the integrated position comes out in meters, so it has to be scaled up to something resembling screen pixels (with this factor, a 1 mm shift becomes a 10-pixel offset).

And the result:



As you can see, the current implementation has two problems:

  1. The element never returns exactly to its initial position: it slowly drifts away over time.
  2. Integration errors and sensor noise accumulate, so the element keeps moving even when the device is held still.

In fact, the solution to both problems is the same: we need to introduce friction into the formula. The modified formula looks like this:

deltaT = time() - lastTime;
velocity += acceleration * deltaT - VEL_FRICTION * velocity;
position += velocity * deltaT - POS_FRICTION * position;
lastTime = time();
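Applied to the Java listener from above, the loop body might look like this. VEL_FRICTION and POS_FRICTION are small hand-tuned constants; the values below are placeholders, not the ones from the repository:

    // Hypothetical friction coefficients; tune until the drift disappears.
    private static final float VEL_FRICTION = 0.2f;
    private static final float POS_FRICTION = 0.1f;

    // Inside onSensorChanged(), for each axis:
    for (int index = 0; index < 3; ++index) {
        velocity[index] += event.values[index] * dt - VEL_FRICTION * velocity[index];
        position[index] += velocity[index] * dt * 10000 - POS_FRICTION * position[index];
    }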



Good, the current implementation now behaves well. I would also add some cosmetic enhancements, such as a low-pass filter for smoothing, cutting off invalid values, and program settings.
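For instance, cutting off invalid values could be as simple as clamping the computed offset (a sketch; MAX_OFFSET is a hypothetical limit in pixels):

    // Prevent a sharp jolt from pushing the content off-screen.
    private static final float MAX_OFFSET = 100f;

    private static float clamp(float value) {
        return Math.max(-MAX_OFFSET, Math.min(MAX_OFFSET, value));
    }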

The finished application is in the "standalone_app" branch of the repository.

AOSP


We have developed a basic stabilization algorithm and made a demo application showing that screen stabilization is possible. Now we can apply this work to the device as a whole. This is not an easy task, but that makes it all the more interesting.

This task requires some experience in building AOSP. Google provides all the necessary documentation. In short, you need to download the Android source code for the selected Nexus device, build the firmware, and flash it. Do not forget to include all the necessary drivers before building.

Once you manage to build the stock firmware, you can start developing and integrating the screen stabilization.

The implementation plan is as follows:


  1. Find a way to shift the screen image on the device.
  2. Develop an API inside AOSP that gives a regular Android application the ability to set this offset.
  3. Develop a service in the demo application that processes the data from the acceleration sensor and sets the offset using the API above. The service starts automatically when the device boots, so stabilization works immediately after switching the device on.

Now let me just tell you how I solved these problems.


1. The first file to study is DisplayDevice.cpp, which controls the screen parameters. The method to look at is void DisplayDevice::setProjection(int orientation, const Rect& newViewport, const Rect& newFrame). The most interesting part is at line 483:

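In the AOSP sources of that generation it looks roughly like this (a sketch quoted from memory, not an exact copy of the patch):

    // Compose rotation (R), physical translation (TP), scale (S)
    // and logical translation (TL) into the final display transform.
    mGlobalTransform = R * TP * S * TL;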

where the final transformation matrix is formed from the other components. All of these variables are instances of the Transform class. This class is designed to handle transformations and has several overloaded operators (for example, *). To add a shift, we add a new element to the chain:

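Something along these lines, where mTranslateX and mTranslateY are my placeholder names for the stored shift (the exact patch is in the repository):

    // Extra translation that shifts the whole screen by the
    // stabilization offset.
    Transform TT;
    TT.set(mTranslateX, mTranslateY);
    mGlobalTransform = TT * R * TP * S * TL;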

If you build and flash the device now, the screen will be shifted by translateX pixels horizontally and translateY pixels vertically. Finally, we add a new method, void setTranslate(int x, int y);, which will be responsible for the shift matrix.
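A sketch of what that method might look like, assuming the shift is stored in the hypothetical members used above and the projection is rebuilt from the cached display parameters:

    void DisplayDevice::setTranslate(int x, int y) {
        mTranslateX = x;
        mTranslateY = y;
        // Recompute mGlobalTransform with the new shift applied.
        setProjection(mOrientation, mViewport, mFrame);
    }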

2. The second interesting file is SurfaceFlinger.cpp. This file is key to creating an API for accessing the screen parameters. We just add a new method:

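A sketch of it (mDisplays and repaintEverything() exist in the SurfaceFlinger of that era; treat the exact body as illustrative):

    void SurfaceFlinger::setTranslate(int x, int y) {
        // Propagate the offset to every connected display.
        for (size_t dpy = 0; dpy < mDisplays.size(); dpy++) {
            const sp<DisplayDevice>& hw(mDisplays[dpy]);
            hw->setTranslate(x, y);
        }
        // Force the screen to be redrawn with the new offset.
        repaintEverything();
    }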

which calls the setTranslate method for all displays. The other part looks a bit strange, but I'll explain it later. We also need to modify the method status_t SurfaceFlinger::onTransact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags) by adding a new section to its switch statement:

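The new section might look like this; 2020 is the transaction code our service uses below (a sketch):

    case 2020: {
        CHECK_INTERFACE(ISurfaceComposer, data, reply);
        // Read the offset sent by the client and apply it.
        int x = data.readInt32();
        int y = data.readInt32();
        setTranslate(x, y);
        return NO_ERROR;
    }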

This code is the entry point to our improvement.

3. The data processing service is quite simple: it uses the algorithm developed earlier to obtain the offset values. These values are then transmitted via IPC to SurfaceFlinger:

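A sketch of that call; the interface token matches the ISurfaceComposer descriptor of that era, but treat the details as illustrative:

    private void setTranslate(int x, int y) {
        try {
            // ServiceManager is a hidden API, available to system apps only.
            IBinder flinger = ServiceManager.getService("SurfaceFlinger");
            if (flinger == null) return;
            Parcel data = Parcel.obtain();
            Parcel reply = Parcel.obtain();
            data.writeInterfaceToken("android.ui.ISurfaceComposer");
            data.writeInt(x);
            data.writeInt(y);
            // 2020 is our custom transaction code added to onTransact().
            flinger.transact(2020, data, reply, 0);
            reply.recycle();
            data.recycle();
        } catch (RemoteException e) {
            e.printStackTrace();
        }
    }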

ServiceManager is not recognized by Android Studio because it is not available to non-system applications. System applications must be built together with AOSP using the makefile build system; this gives our application access to the hidden Android API. To access the SurfaceFlinger service, the application must hold the android.permission.ACCESS_SURFACE_FLINGER permission, which only system applications can have (see below). And to be allowed to call our API with code 2020, the application must hold the android.permission.HARDWARE_TEST permission, which again only system applications can have. So, to finally make our application a system one, modify its manifest as follows:

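The essential part is the shared system user id plus the two permissions; the package name below is a placeholder:

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.example.screenstabilization"
        android:sharedUserId="android.uid.system">

        <uses-permission android:name="android.permission.ACCESS_SURFACE_FLINGER"/>
        <uses-permission android:name="android.permission.HARDWARE_TEST"/>
        ...
    </manifest>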

Also create the corresponding makefile:

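A typical Android.mk for such a system application might look like this (a sketch; LOCAL_CERTIFICATE := platform is what lets the app share the system user id):

    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)

    LOCAL_MODULE_TAGS := optional
    LOCAL_SRC_FILES := $(call all-java-files-under, src)
    LOCAL_PACKAGE_NAME := ScreenStabilization
    LOCAL_CERTIFICATE := platform

    include $(BUILD_PACKAGE)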

The rest of the application (the boot broadcast receiver, settings, and so on) is fairly standard, and I will not cover it here. It remains to show how to make the application preinstalled (i.e., embedded in the firmware). Simply place the source code in the {aosp}/packages/apps directory and modify the core.mk file so that it includes our application:

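The change boils down to one extra entry in the product's package list (a sketch):

    PRODUCT_PACKAGES += \
        ScreenStabilization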

Final Demonstration:



You can find detailed information and the source code on GitHub.



There is a ScreenStabilization application that should be placed in the {aosp}/packages/apps directory; the AOSP patch files: 0001-ScreenStabilization-application-added.patch should be applied to the {aosp}/build directory, and 0001-Translate-methods-added.patch should be applied to the {aosp}/frameworks/native directory.

The firmware for the Nexus 7 2013 Mobile is built in the "userdebug" configuration, which makes it more suitable for testing. To flash the firmware, boot into bootloader mode by holding the "volume down" button and pressing the "power" button at the same time. Then enter:

 fastboot -w update aosp_deb_screen_stabilization.zip 

This procedure will erase all existing data on your device. Keep in mind that to flash any custom firmware, you must first unlock the bootloader with the command:

 fastboot oem unlock 

Conclusion


This article shows how to implement a simple screen stabilization algorithm and apply it to an entire device by modifying the Android source code and building custom firmware. The algorithm is not perfect, but it is sufficient for demonstration purposes. We created modified firmware for the Nexus 7 2013 Mobile, but the source code can be applied to any Nexus device, and even to any AOSP-based system such as CyanogenMod, which makes it possible to integrate screen stabilization into new devices.

P.S. I am also the author of the original English version of this article, published on blog.lemberg.co.uk, so I can answer technical questions.

Source: https://habr.com/ru/post/317462/

