I recently got back to work after my summer adventures, and the first thing my boss threw at me was the link that this post is about.
Since one of my current projects is recognizing gestures with a Kinect camera, analyzing them and having a robot act on them, the link, as you can guess, is on a related topic.
So, meet Kinect 3D Hand Tracking or, in plain words, “tracking the 3D position of a hand using Kinect”.
What was before
Until now I had personally only seen this here, and for me the demo at that link did not work all that stably. Besides, if you move more than 2 meters away from the sensor, it stopped seeing the fingers altogether (in fact, my Kinect was losing them already at about a meter and a half, but maybe my fingers are just wrong). To be fair, it all ran fairly fast, even on my laptop, without using the GPU. And the code is open, since it is a demo project that is now part of what I will not hesitate to call the largest open-source robotics project, ROS. But that is not what we will talk about here. Suppose we are interested in tracking the movement of the fingers on a hand very accurately.
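For context, here is a minimal sketch (not part of that demo) of how the Kinect depth stream typically reaches your code in ROS; the topic name /camera/depth/image_raw is the openni_launch default and is an assumption about your setup:

```python
#!/usr/bin/env python
# Minimal sketch: reading the Kinect depth stream through ROS.
# The topic name assumes the openni_launch defaults; adjust for your setup.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_depth(msg):
    # Convert the ROS image to a NumPy array (16UC1 depth, values in mm).
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
    rospy.loginfo("depth frame %dx%d, centre pixel %s mm",
                  msg.width, msg.height, depth[msg.height // 2, msg.width // 2])

rospy.init_node("depth_listener")
rospy.Subscriber("/camera/depth/image_raw", Image, on_depth, queue_size=1)
rospy.spin()
```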
Who these people are and what they wrote
Three researchers, Iason Oikonomidis, Nikolaos Kyriazis and Antonis Argyros from the Computer Science Department of the University of Crete, wrote a demo you can try with your own hand: it tracks the hand, including all 5 fingers on it. So what exactly did this trio write? (The following list is translated directly from the project page.) Their software tracks the 3D position, orientation and full articulation of a human hand from visual data, without using any markers. The method they developed:
models the full articulation of the hand (26 degrees of freedom) performing any natural motion;
works on data acquired with an easily accessible and widely used RGB-D camera such as the Kinect or Xtion (a small capture sketch follows this list);
does not require markers or special gloves;
runs at about 20 fps, albeit on a GPU;
does not require calibration;
does not rely on any third-party tracking technology such as NiTE, OpenNI or the Kinect SDK.
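None of the code below is theirs; it is just a small sketch of the kind of RGB-D input such a tracker consumes, grabbed through OpenCV's OpenNI backend (it assumes OpenCV was built with OpenNI support and that a Kinect or Xtion is plugged in):

```python
# Sketch only: grabbing synchronized depth + color frames from a Kinect/Xtion
# through OpenCV's OpenNI backend. Not the authors' code; it merely shows the
# kind of RGB-D input the tracker works from.
import cv2

cap = cv2.VideoCapture(cv2.CAP_OPENNI)      # try cv2.CAP_OPENNI2 for newer drivers
if not cap.isOpened():
    raise RuntimeError("No OpenNI-compatible RGB-D sensor found")

while True:
    if not cap.grab():                      # grab one synchronized RGB-D frame
        break
    ok_d, depth = cap.retrieve(None, cv2.CAP_OPENNI_DEPTH_MAP)   # 16-bit, millimetres
    ok_c, bgr = cap.retrieve(None, cv2.CAP_OPENNI_BGR_IMAGE)     # registered color image
    if ok_d and ok_c:
        cv2.imshow("depth", depth / 4500.0)  # scale roughly into [0, 1] for display
        cv2.imshow("color", bgr)
    if cv2.waitKey(1) == 27:                # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```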
After a note that it is better to wear sleeves so that the program has an easier time tracking your hand, they go straight to the demonstration videos, which I include below for those too lazy to follow any of the links above.
Example 1
Example 2
Example 3
Example 4
Example 5
How fast it all works
And a little about the hardware this whole thing should run on.
CPU: Pentium® Dual-Core CPU T4300 @ 2.10GHz with 4096 MBs of RAM, GPU: GeForce GT 240M with 1024 MBs of RAM, Tracking FPS: 1.73792
CPU: Intel® Core (TM) 2 CPU 6600 @ 2.40GHz with 4096 MBs of RAM, GPU: GeForce 9600 GT with 1024 MBs of RAM, Tracking FPS: 2.15686
CPU: Intel® Core (TM) 2 Duo CPU T7500 @ 2.20GHz with 4096 MBs of RAM, GPU: Quadro FX 1600M with 256 MBs of RAM, Tracking FPS: 2.66695
CPU: Intel® Core (TM) i7 CPU 950 @ 3.07GHz with 6144 MBs of RAM, GPU: GeForce GTX 580 with 1536 MBs of RAM, Tracking FPS: 19.9447
Summing up
The system turned out to be slow, but it already looks quite promising, even though, of course, it still makes mistakes. For what I am personally working on it is useless for now, since the robot runs on an Atom. So I will have to hold off on recognizing finger gestures and stick to the ones that use the whole hand. Let's hope the whole thing gets optimized further and sooner or later we will be able to enjoy controlling a robot or a computer, among other things, with precise hand gestures.
UPD: A direct link to download the demo itself: a binary for Windows x64. What needs to be installed is described here.
P.S. I would appreciate having any inaccuracies pointed out and am open to suggestions. Thanks.