
KinectFusion - building a 3D scene in real time



At SIGGRAPH, Microsoft Research demonstrated a very interesting project: KinectFusion. The software reconstructs a 3D scene in real time from Kinect data, and can also segment and track objects.

The technology is impressive. Games that let you bring objects and environments in from the real world now look genuinely feasible. It can also work in the other direction: with the current boom in 3D printing, accessible scanning plus accessible printing would effectively give us electronic transmission of physical objects. That is, of course, only one possible application.
Below is a short breakdown of the demo video:


Building a 3D model (triangle mesh)

The presenters note that the Kinect can be moved through space fairly carelessly; jitter and abrupt motion are not a problem, because in real time the current 3D point cloud is registered against the existing scene, which is then corrected and extended. Only the depth data is needed at this stage. The triangulation looks somewhat blurry because the ray grid is only 640×480, so you have to move closer to capture fine detail. If future versions raise the resolution, the device will become far more capable. Another option is add-on optics for scanning objects in detail; for now the minimum scan distance is fairly large, 1.2 m.
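The published KinectFusion pipeline fuses each depth frame into a voxel grid holding a truncated signed distance function (TSDF) on the GPU. As a rough illustration of the idea only, here is a minimal NumPy sketch: the grid placement, camera intrinsics, and truncation distance below are made-up values, and a real implementation runs this per-voxel update in parallel on the GPU and also tracks the camera pose between frames.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, fx, fy, cx, cy, voxel_size, trunc):
    """Fuse one depth frame (metres, camera at origin looking down +z)
    into a TSDF voxel grid. A simplified single-frame, fixed-pose sketch."""
    nx, ny, nz = tsdf.shape
    # World coordinates of every voxel centre (grid placed in front of camera).
    xs, ys, zs = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    px = (xs - nx / 2) * voxel_size
    py = (ys - ny / 2) * voxel_size
    pz = zs * voxel_size + 0.5          # grid starts 0.5 m from the camera
    # Project each voxel centre into the depth image (pinhole model).
    u = np.round(fx * px / pz + cx).astype(int)
    v = np.round(fy * py / pz + cy).astype(int)
    h, w = depth.shape
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.where(valid, depth[np.clip(v, 0, h - 1), np.clip(u, 0, w - 1)], 0.0)
    sdf = d - pz                        # signed distance to the surface along the ray
    update = valid & (d > 0) & (sdf > -trunc)
    tsdf_new = np.clip(sdf / trunc, -1.0, 1.0)
    # Running weighted average merges this frame with all previous ones,
    # which is what smooths out sensor jitter over time.
    w_new = weights + update
    tsdf[update] = (tsdf[update] * weights[update]
                    + tsdf_new[update]) / w_new[update]
    weights[:] = w_new
```

The zero crossing of the fused TSDF is the reconstructed surface; a mesh can then be extracted from it (e.g. with marching cubes).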





Texturing a Model

The Kinect also carries an ordinary RGB camera; its color stream is used to build the textures. The picture on the right shows a textured model with a light source flying around it.
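Mapping color onto the geometry comes down to projecting each mesh vertex through the color camera's pinhole model and sampling the image. A minimal sketch, assuming the vertices are already expressed in the color camera's frame (a real system also calibrates the depth-to-color extrinsics; the intrinsics here are illustrative):

```python
import numpy as np

def color_vertices(vertices, image, fx, fy, cx, cy):
    """Sample a per-vertex color by projecting each 3-D vertex
    (x, y, z in metres, z forward) into the RGB image."""
    u = np.round(fx * vertices[:, 0] / vertices[:, 2] + cx).astype(int)
    v = np.round(fy * vertices[:, 1] / vertices[:, 2] + cy).astype(int)
    h, w = image.shape[:2]
    # Clamp to the image border; a fuller version would also reject
    # vertices that the color camera cannot see.
    u = np.clip(u, 0, w - 1)
    v = np.clip(v, 0, h - 1)
    return image[v, u]
```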



Augmented Reality - throwing balls into the scene

A handful of balls is thrown into the scene. On the GPU, in real time, they bounce off the triangulated 3D scene, and the balls are then drawn over the regular Kinect video with hidden parts correctly clipped.
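The occlusion test behind that clipping is conceptually simple: a virtual ball is drawn at a pixel only if it is closer to the camera than the real surface the depth map sees there. A sketch of that per-ball test, with hypothetical intrinsics (a real renderer does this per pixel against a depth buffer):

```python
import numpy as np

def visible_balls(balls, depth, fx, fy, cx, cy):
    """Return indices of virtual balls (x, y, z in the camera frame, metres)
    that are in front of the real surface at their projected pixel."""
    balls = np.asarray(balls, dtype=float)
    u = np.round(fx * balls[:, 0] / balls[:, 2] + cx).astype(int)
    v = np.round(fy * balls[:, 1] / balls[:, 2] + cy).astype(int)
    h, w = depth.shape
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    # Depth of the real scene at each ball's pixel; off-screen balls
    # are treated as unoccluded-by-nothing (infinite real depth).
    z_real = np.where(inside,
                      depth[np.clip(v, 0, h - 1), np.clip(u, 0, w - 1)],
                      np.inf)
    return np.nonzero(inside & (balls[:, 2] < z_real))[0]
```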



Augmented Reality - throwing balls into a changing scene

Balls are thrown into the scene again; the person in the frame shakes out a towel, and the balls interact with the changing 3D scene.



Segmentation - a removed object is remembered

A kettle sits on a table; the scene is static. When the kettle is picked up and taken out of the scene, it is recognized as a separate object.
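Detecting that an object left a static scene can be sketched as depth differencing against the stored background: wherever the current depth reading is farther than the remembered surface, something that used to occupy that space is gone. This illustrates the principle only, not the paper's exact method, and the threshold is an illustrative guess:

```python
import numpy as np

def removed_object_mask(background_depth, current_depth, thresh=0.02):
    """Pixels where the current depth is farther than the stored static
    background by more than `thresh` metres: the footprint of an object
    that was removed from the scene."""
    return (current_depth - background_depth) > thresh
```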





Tracking - we track the selected object

The kettle is then put back in its place and the program recognizes that it has returned; when it is moved again, the software tracks its motion, registering it against the stored prototype.
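Registering the moved kettle against its stored prototype is a rigid-alignment problem; ICP-style tracking solves it by repeatedly matching point correspondences and computing the best-fit rotation and translation. The inner step, the Kabsch algorithm, can be sketched as follows (correspondences are assumed already matched here, which real ICP has to establish each iteration):

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst
    (Kabsch algorithm): minimizes sum ||R @ src_i + t - dst_i||^2."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # The sign flip guards against a reflection (det = -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```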





Segmentation and tracking - drawing on objects with fingers

The hand is segmented against the static background scene and tracked, and multi-touch contacts with object surfaces are detected.
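One way to detect such "multi-touch" on real surfaces is again depth differencing against the stored static scene: a fingertip counts as touching when it sits within a thin band just above the remembered surface. The band thresholds below are illustrative guesses, not values from the demo:

```python
import numpy as np

def touch_mask(background_depth, current_depth, near=0.003, far=0.02):
    """Pixels where something hovers just above the stored surface:
    closer to the camera than the background by between `near` and
    `far` metres counts as a touch contact."""
    diff = background_depth - current_depth
    return (diff > near) & (diff < far)
```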




Source: https://habr.com/ru/post/126379/

