
Valve, which released the Vive virtual reality system this year together with HTC, has fulfilled last year's promise and opened a program that lets third-party developers create devices compatible with its Lighthouse 3D tracking technology. Participation in the program costs almost $3000 and requires attending training courses in person, to be held in September at the Synapse office in Seattle. During the course, developers will not only receive detailed instruction on all aspects of development (industrial design, circuit design, and software), but will also get a devkit with components for building their own prototype.
Apart from the entry fee, developers will be able to use the system freely, with no royalties owed to Valve. The ASIC chips that perform signal processing are produced by Valve's partner, Triad Semiconductor, and, judging by the official FAQ, are available for order.

The development kit includes:
- A ready-made module with an INS for attaching to devices and accessories
- A set of embeddable circuit boards for creating new devices
- 40 optical sensors (up to 32 can be connected to one module at a time)
- Some accessories for building prototypes (presumably things like a controller grip, a mock rifle, etc.)
- A set of two Lighthouse base stations
- Software for calibrating and positioning the sensors
- Technical documentation

In short, the Lighthouse technology works as follows. Each of the two fixed base stations emits a synchronization flash followed by a pair of perpendicular scanning IR laser beams, swept across the room by a rotating optical system. Using its photodiodes, the tracked device measures the time between the sync flash and the arrival of each scanning beam and, from the known rotation frequency (120 Hz), converts that time into angles relative to each base station. The position of the device relative to the base stations is then determined by triangulation.

This approach has advantages over the "LED constellation + camera" scheme used, in particular, in the Oculus Rift: the Vive base stations are passive and require no communication link with the PC, and there is no limit on the number of simultaneously tracked sensors. In addition, the spatial resolution of a video camera is replaced by the temporal resolution of the photodiode pulses, which makes it possible to achieve positioning accuracy on the order of 1 mm. Data from the optical sensors is fused with the built-in INS to improve short-term accuracy and response time, and to bridge the moments when the device loses line of sight to the stations.
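The two steps described above can be sketched in a few lines. This is a simplified 2D illustration, not Valve's actual firmware: the function names, the baseline geometry (both stations on one axis, angles measured from the line between them), and the use of the article's 120 Hz sweep rate are all my own assumptions.

```python
import math

def sweep_angle(dt_seconds, sweep_hz=120):
    """Convert the time between the sync flash and the laser hit
    into a sweep angle in radians, given the rotation frequency.
    The 120 Hz value is the figure quoted in the article."""
    return 2 * math.pi * sweep_hz * dt_seconds

def triangulate_2d(baseline, alpha, beta):
    """Toy 2D triangulation: station A at the origin, station B at
    (baseline, 0); alpha and beta are the angles to the sensor,
    each measured from the line connecting the stations.
    Returns the (x, y) position where the two rays intersect."""
    # Ray from A: y = x * tan(alpha); ray from B: y = (baseline - x) * tan(beta)
    x = baseline * math.tan(beta) / (math.tan(alpha) + math.tan(beta))
    y = x * math.tan(alpha)
    return x, y

# Example: a 90-degree sweep at 120 Hz takes 1/480 s
angle = sweep_angle(1 / 480)            # pi/2
# Symmetric case: stations 2 m apart, both see the sensor at 45 degrees
pos = triangulate_2d(2.0, math.radians(45), math.radians(45))  # (1.0, 1.0)
```

A real implementation works in 3D with two sweep planes per station, and fuses these optical fixes with the INS, but the time-to-angle-to-position chain is the core idea.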
Valve hints that the system can be used not only for VR accessories but also, for example, for indoor multicopter navigation. One can imagine vast opportunities for startups in 3D scanning, augmented reality, interactive entertainment, robotics, and so on. Moreover, extrapolating this trend, I predict that we will soon see full-fledged whole-body tracking*, and the new round of development will be dubbed VR 2.0 (by analogy with the web). See you in the Matrix.