
Finger tracking with Microsoft Kinect

I recently got back to work after my summer adventures, and the first thing my boss threw at me was the link this post is about.

Since one of my current projects involves recognizing gestures with a Kinect camera, analyzing them, and having a robot act on them, the link, as you can guess, is on a related topic.

So, meet Kinect 3D Hand Tracking or, in plain words, tracking the 3D position of a hand with the help of a Kinect.


What came before

Until now I had personally seen something like this only here, and for me the demo at that link did not work all that stably.
Moreover, if you moved more than two meters away from the sensor, it stopped seeing the fingers at all (in fact my Kinect was already losing them at about a meter and a half, but maybe my fingers are just wrong). To be fair, everything ran quite fast, even on my laptop, without using the GPU. And the code is open, since it is a demo project that is now included in what I will not hesitate to call the largest open-source robotics project, ROS.
But that is not what we will talk about here. Imagine instead that we want to track the movement of the fingers on a hand very precisely.
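Just to have something concrete to poke at, here is a minimal Python sketch (not the ROS demo above) that grabs depth frames from a Kinect through OpenCV's OpenNI backend; it is one convenient starting point for experimenting with hand and finger tracking, and it assumes OpenCV was built with OpenNI support and that the sensor's OpenNI drivers are installed.

```python
# Minimal sketch: read Kinect depth frames via OpenCV's OpenNI backend.
# Assumes OpenCV was compiled with OpenNI support and the sensor's OpenNI
# drivers are installed; this is NOT the ROS fingertip-detection demo.
import cv2

cap = cv2.VideoCapture(cv2.CAP_OPENNI)           # open the first OpenNI-compatible sensor
if not cap.isOpened():
    raise RuntimeError("Kinect/OpenNI device not found")

while True:
    if not cap.grab():                           # grab a synchronized frame set
        break
    ok, depth = cap.retrieve(flag=cv2.CAP_OPENNI_DEPTH_MAP)   # 16-bit depth, millimetres
    if not ok:
        continue
    # Quick visual check of the working range: Kinect depth gets unreliable
    # well before 4 m, and finger-scale detail fades around 1.5-2 m.
    vis = cv2.convertScaleAbs(depth, alpha=255.0 / 4000.0)    # map 0-4 m to 0-255
    cv2.imshow("depth", vis)
    if cv2.waitKey(30) == 27:                    # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```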

Who these people are and what they wrote

Three researchers, Iason Oikonomidis, Nikolaos Kyriazis, and Antonis Argyros from the computer science faculty of the University of Crete, wrote a demo that you can try with your own hand: it tracks the hand, including all five fingers on it.
What exactly did this trio write?
(What follows is translated directly from the project page.)
Their software tracks the 3D position, orientation, and full articulation of a human hand from visual data, without using any markers; the project page then lists the properties of the method they developed.
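To make the idea of markerless, model-based tracking a bit more concrete, here is a toy Python sketch. It is emphatically not the authors' code: the 26-parameter pose vector, the stand-in renderer, and the simple local random search are illustrative assumptions only; a real system would rasterize an articulated 3D hand model and use a far smarter optimizer.

```python
# Toy illustration of model-based hand tracking (NOT the authors' algorithm):
# represent the hand pose as a parameter vector (here, a hypothetical 26 DoF:
# global position/orientation plus finger joint angles), render a synthetic
# depth image for a candidate pose, and search for the pose whose rendering
# best matches the observed depth.
import numpy as np

POSE_DIM = 26  # hypothetical parameterization: 6 global DoF + 20 joint angles

def render_depth(pose: np.ndarray) -> np.ndarray:
    """Stand-in renderer: maps a pose vector to a small fake 'depth image'."""
    rng = np.random.default_rng(0)               # fixed projection, for repeatability
    proj = rng.standard_normal((64, POSE_DIM))
    return (proj @ pose).reshape(8, 8)

def discrepancy(pose: np.ndarray, observed: np.ndarray) -> float:
    """Pixel-wise mismatch between the rendered and the observed depth."""
    return float(np.mean((render_depth(pose) - observed) ** 2))

def track_frame(observed: np.ndarray, prev_pose: np.ndarray,
                iters: int = 300, step: float = 0.1) -> np.ndarray:
    """Refine the previous frame's pose estimate by simple local random search."""
    best = prev_pose.copy()
    best_err = discrepancy(best, observed)
    for _ in range(iters):
        candidate = best + step * np.random.standard_normal(POSE_DIM)
        err = discrepancy(candidate, observed)
        if err < best_err:                       # keep the candidate if it fits better
            best, best_err = candidate, err
    return best

# Usage: pretend one frame was produced by some unknown "true" pose.
true_pose = np.random.standard_normal(POSE_DIM)
observed = render_depth(true_pose)
estimate = track_frame(observed, prev_pose=np.zeros(POSE_DIM))
print("residual error:", discrepancy(estimate, observed))
```

The expensive part is evaluating many candidate renderings per frame, which is also why a tracker like this tends to want a serious GPU (more on the hardware below).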


Then, after a note that it is better to wear sleeves so the program has an easier time keeping track of your hand, they go straight to the demonstration videos, which I include below for those too lazy to follow any of the links above.
Example 1
Example 2
Example 3
Example 4
Example 5

How fast it all works

And a little about the hardware this whole thing is supposed to run on.

Summary

The system turned out to be slow, but so far it looks quite promising, although of course it still makes mistakes. For what I personally am working on it is currently useless, since the robot runs on an Atom. So finger-gesture recognition will have to wait, and I will stick to gestures that use the whole hand.
Let's hope the whole thing gets optimized further, and that sooner or later we will be able to enjoy controlling a robot or a computer with, among other things, precise hand gestures.

UPD:
A direct link to download the demo itself: a Windows x64 binary.
You can read about what needs to be installed here.

P.S. I would appreciate having any inaccuracies pointed out and am open to suggestions. Thanks.

Source: https://habr.com/ru/post/151258/

