
This article describes the technical details and problems that developers may encounter when creating software that uses the Intel RealSense SDK for Windows* together with the Oculus Rift* headset. We begin with an overview of the Oculus Rift Development Kit 2 (DK2), then move on to some of the problems encountered when developing applications that work with several infrared cameras. The article also describes integrating the Intel RealSense SDK and the Oculus Rift in a Unity* 5 project.
System requirements
The information presented in this article concerns the Intel RealSense F200 (front-facing) camera. To perform all the described steps, you will need an Oculus Rift DK2 kit, an Intel RealSense camera (F200), and a development system that meets the following requirements.
- 4th generation Intel Core processor or later (required by the Intel RealSense SDK)
- 150 MB of free hard disk space
- 4 GB of RAM
- One USB 3.0 port for the F200 camera and two USB 2.0 ports for the Oculus Rift DK2
- Windows* 8.1
- A dedicated NVIDIA GTX* 600 Series or AMD Radeon* HD 7000 Series (or higher) graphics card with DVI-D or HDMI output
The following software components are required to support the Oculus Rift* DK2 and the Intel RealSense camera (F200).
- Intel RealSense SDK (version 6.0.21.6598 or later)
- Intel RealSense Depth Camera Manager F200 (version 1.4.27.41944 or later)
- Oculus SDK for Windows* (version 0.7.0.0-beta or later)
- Oculus runtime for Windows* (version 0.7.0.0-beta or later)
- Oculus Utilities for Unity 5 (version 0.1.2.0-beta or later)
To get familiar with Unity, you can use the free Personal edition (5.2.2 or later), available here.
Oculus Rift* DK2 Kit
Oculus Rift DK2 is a set of hardware and software components that lets developers create games and virtual reality interfaces. In addition to the headset, the kit includes a low-latency positional tracking camera that monitors the movement of the user's head. This tracking camera is essentially an ordinary webcam with an infrared filter mounted on its lens. The headset is equipped with several hidden infrared LEDs, positioned so that the tracking camera can use them to determine the position of the user's head in three-dimensional space. Interestingly, these hidden IR sources are picked up by the Intel RealSense camera when you view its infrared stream in the SDK's Raw Streams sample.
Oculus Rift* IR LEDs visible to the Intel RealSense camera
The Oculus Rift headset includes a gyroscope, an accelerometer, and a magnetometer. Using the fused data from these sensors, the headset determines the orientation of the user's head and provides the corresponding rotation around the pitch, yaw, and roll axes. The tracking camera provides additional information about the position of the user's head (i.e., spatial coordinates along the X, Y, and Z axes).
To better understand what the DK2 tracking camera adds to the virtual reality picture, run the demo scene from the Oculus setup program.
Oculus setup program
When viewing the demo scene with the tracking camera connected to a USB port, you will see that the objects on the virtual table move closer or farther away as your head approaches or moves away from the camera.
Oculus Rift demo scene
If you then restart the demo scene with the tracking camera's USB connector unplugged, you will see that the orientation data from the headset's sensors still drives rotation around the pitch, yaw, and roll axes, but the sense of depth disappears when you move your head along the Z axis.
Interference when used together with a front-facing camera
Developers who use the Intel RealSense SDK and are interested in building virtual reality applications around the Intel RealSense F200 front-facing camera should be aware of potential interference between the 3D depth camera and the Oculus Rift tracking camera. The figure below shows a Lenovo ThinkPad* Yoga 15 convertible ultrabook with an integrated Intel RealSense F200 camera and an Oculus Rift tracking camera mounted side by side on the edge of the screen.
Front-facing cameras
The Intel RealSense F200 camera uses coded-light technology: it projects infrared light onto the user and captures the invisible reflected image with an IR camera. The Oculus Rift headset uses a set of IR LEDs for positional head tracking; their glow is recorded by the headset's own passive IR camera, which is fitted with an optical filter that passes only the infrared part of the spectrum. The effect of the headset's IR LEDs on the Intel RealSense camera can be seen as shifting noise in the depth stream.
IR noise in the depth data
Angling the DK2 tracking camera away from the Intel RealSense camera may slightly reduce the IR interference, but because the Oculus headset itself is a source of IR radiation, it will inevitably affect the front-facing Intel RealSense camera whenever the DK2 tracking camera is in use.
To better understand the effects of this interference, we launched the SDK's Hands Viewer sample application with the DK2 tracking camera connected to a USB port and the Oculus Rift demo application running. As the figure below shows, the frame rate drops significantly. This can be attributed to various factors (for example, this test used an NVIDIA GeForce* 840M graphics adapter, which falls below the minimum requirements). Still, it is interesting to see that hand tracking and gesture recognition with the Intel RealSense camera work quite well.
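The depth-stream noise described above can also be observed programmatically. The sketch below uses the Intel RealSense SDK's C# interface to grab depth frames, much as the Raw Streams sample does; it assumes the SDK's standard PXCM* class names and a 640x480/30 fps depth profile supported by the F200.

```csharp
// Sketch: grabbing F200 depth frames to observe the DK2's IR interference.
// Assumes the Intel RealSense SDK C# interface (libpxcclr.cs.dll).
PXCMSenseManager sm = PXCMSenseManager.CreateInstance();
sm.EnableStream(PXCMCapture.StreamType.STREAM_TYPE_DEPTH, 640, 480, 30);

if (sm.Init() == pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    for (int i = 0; i < 300; i++) // roughly 10 seconds at 30 fps
    {
        if (sm.AcquireFrame(true) < pxcmStatus.PXCM_STATUS_NO_ERROR)
            break;

        PXCMCapture.Sample sample = sm.QuerySample();
        // sample.depth holds the current depth frame. With the DK2 headset
        // in view, its IR LEDs appear as shifting noise in this image.
        sm.ReleaseFrame();
    }
}
sm.Dispose();
```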
Gesture recognition with the Oculus Rift infrared LEDs in view
Using Unity* 5
Earlier we mentioned possible interference between the Intel RealSense camera and the DK2 camera, but what happens if you combine these technologies in a real project? In the following sections, we briefly walk through creating a simple Unity 5 project with Intel RealSense camera support, and then enable virtual reality.
Create a new Unity project
Launch a new Unity project by double-clicking the Unity icon on your desktop. Select New, then specify the name and location of the project. If the Unity editor is already open, create a new project by selecting File, New Project from the menu, then specify the name and location of the project.
Import the RSSDK Unity Toolkit
Import the RSSDK Unity Toolkit by selecting Assets, Import Package, Custom Package... from the menu. On the Import Package screen, go to the SDK folder that contains the Unity Toolkit. This location may vary depending on where the SDK was installed; in this example, the toolkit is located in C:\Program Files (x86)\Intel\RSSDK\framework\Unity. Select UnityToolkit, then click Open.
(Note: This folder contains two Unity packages: UnityToolkit and UnityCSharp. Importing UnityCSharp adds only the managed and unmanaged DLLs needed to support the Intel RealSense SDK in a Unity application. Importing UnityToolkit adds those DLLs to the project along with many other assets that streamline development of a project with Intel RealSense SDK support.)
The Importing Package screen appears, with all plug-ins, actions, and prefabs selected. Leave the checkboxes selected and click the Import button.
On the Project screen, note that a number of assets have been added to the project in the following folders.
- Plugins. Contains libpxccpp2c.dll, the unmanaged C++ P/Invoke DLL.
- Plugins.Managed. Contains libpxcclr.unity.dll, the managed C# interface library.
- RSUnityToolkit. Contains the Actions, Internals, Prefabs, and Samples folders.
Add a game object
The project initially contains a main camera and a directional light. On the Hierarchy screen, click Create, then select 3D Object, Cube. A cube game object is added to the scene.
In the Project folder, under Assets, expand the RSUnityToolkit folder and select Actions. The Actions folder contains scripts that can be applied to game objects. Click the TrackingAction script, then drag it onto the Cube game object on the Hierarchy screen.
Select Cube on the Hierarchy screen, and you will see that Tracking Action (Script) appears on the Inspector screen.
Hand tracking setup
The default tracking action is HandTracking, and the Virtual World Box Dimensions are set to 100 along the X, Y, and Z axes. If you run the game at this point, you will see the 3D depth camera's LED turn on, indicating that the camera is active.
If you raise your hand in front of the camera, you will (most likely) see the cube fly off the screen. The reason is that the Virtual World Box Dimensions parameter is set too large. Change the Virtual World Box Dimensions to 10 for the X, Y, and Z axes.
Notice that in the Scene view, the virtual world box, outlined in red, is now close to the game object. Run the game again. The cube should now track the movement of your hand within this smaller virtual space.
You may also notice that the cube moves jerkily. You can make the movement smoother by setting the Smoothing Factor parameter to 10.
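The same two parameters can also be set from a script attached to the cube. This is only a sketch: the field names VirtualWorldBoxDimensions and SmoothingFactor are assumptions inferred from the Inspector labels, so verify them against the TrackingAction source in your version of the toolkit.

```csharp
using UnityEngine;
using RSUnityToolkit;

public class CubeTrackingSetup : MonoBehaviour
{
    void Start()
    {
        // Field names are assumptions inferred from the Inspector labels.
        TrackingAction tracking = GetComponent<TrackingAction>();
        tracking.VirtualWorldBoxDimensions = new Vector3(10f, 10f, 10f);
        tracking.SmoothingFactor = 10f;
    }
}
```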
Unity* Editor - TrackingAction parameters
Enabling virtual reality in a Unity project
As stated on the Oculus Rift DK2 website, in Unity 5.1+ the built-in virtual reality support replaces the main camera's rendering with a stereoscopic VR camera that tracks the headset's orientation and position. To use it, select the Virtual Reality Supported checkbox in the Player Settings section. Follow these steps to enable virtual reality in your project.
Select Edit - Project Settings from the menu, then click Player.
In the Inspector window, select the Virtual Reality Supported checkbox.
Click the Play button and put on the Oculus Rift headset. You will immediately see that the main camera's transform is now driven by the orientation tracking of the Oculus Rift headset. The cube still tracks the movement of your hand via the TrackingAction script from the Intel RealSense SDK Toolkit for Unity, but the Unity camera now follows the movement of the headset.
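If you want to confirm from code that the built-in VR mode is active, Unity 5.1+ exposes it through the UnityEngine.VR namespace. A minimal sketch, reading the same head pose that Unity applies to the main camera:

```csharp
using UnityEngine;
using UnityEngine.VR; // built-in VR support, Unity 5.1+

public class VrStatus : MonoBehaviour
{
    void Update()
    {
        // True when Virtual Reality Supported is enabled and a headset is active.
        if (!VRSettings.enabled)
            return;

        // The head pose that Unity applies to the main camera each frame.
        Quaternion headRotation = InputTracking.GetLocalRotation(VRNode.Head);
        Vector3 headPosition = InputTracking.GetLocalPosition(VRNode.Head);
        Debug.Log("Head rotation: " + headRotation.eulerAngles +
                  "  position: " + headPosition);
    }
}
```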
Unity with virtual reality enabled
Import the Oculus Utilities package for Unity
The Utilities package is an optional add-on that includes prefabs, scripts, and scenes for developing virtual reality applications. The following steps show how to import this package into your project.
Choose Assets, Import Package, Custom Package... from the menu. On the Import Package screen, navigate to the folder containing the OculusUtilities Unity package file. Select OculusUtilities, then click Open.
The Importing Package screen appears, with all the Oculus components selected. Leave all checkboxes selected and click the Import button.
Note: In this example we add the OVRCameraRig from the OculusUtilities package to the scene, so there is no need to select the Virtual Reality Supported checkbox in the Inspector window when using the OculusUtilities package.
Drag the OVRCameraRig into the scene. Disable the main camera in the scene to make sure that only the OVRCameraRig is used.
Click the Play button and put on the Oculus Rift headset. You will see that the OVRCameraRig now tracks the movement of your head, while the cube's transform is still driven by the movement of your hand.
Note: For complete details on using the Oculus Utilities package, see the documentation.
Add Virtual Reality to the Intel RealSense SDK Toolkit Sample Project for Unity*
With built-in virtual reality support in Unity 5.1+, it is very easy to enable virtual reality in the Intel RealSense SDK Toolkit samples for Unity, for example in Sample 1 - Translation. Try the following.
In the Project folder, expand RSUnityToolkit - Samples - Scenes and double-click Sample 1 - Translation (remember to save your previous work first if you need it).
Select Edit - Project Settings from the menu, then click Player.
Make sure that the Virtual Reality Supported checkbox is selected in the Inspector window.
Click Play and try it!
Conclusion
In this article, we looked at the potential interference problems that can occur when an Intel RealSense front-facing camera is used in a project alongside the Oculus Rift tracking camera. We also walked through simple scenarios that used Intel RealSense hand tracking in virtual reality. Try it yourself and see what interesting results you can achieve!