
Eye tracker ET-1000 from The EyeTribe

Terminology: the direct transliteration of the English "eye tracker" / "gaze tracker" does not sound great in Russian, while translations like "device for tracking eye movements" or "gaze tracking device" are very cumbersome. I am not sure there is a more convenient established term in Russian; let me know if I am mistaken.


In recent years, Kinect (limbs and the body as a whole), Siri (voice), and Leap Motion (fingers) have shown that contactless control and data/text input can be very convenient and intuitive for certain tasks. But until recently, few knew that gaze could be used for the same purposes, for two main reasons: a) all the devices were quite expensive (thousands to tens of thousands of euros/dollars), and b) the accuracy of gaze direction estimation was frankly poor. And while the accuracy of the top products is now quite high (about half a degree; it still needs work, although it will never match the precision of mouse cursor positioning due to the biological peculiarities of the eye), a price tag comparable to a used car kept these devices very niche: the market was divided between marketing (usability research; there have been articles on this topic on Habr), academia (university research of all kinds), medicine (patient studies), and "accessibility" (eye trackers used by people with disabilities as a means of control).

A few years ago, a group of Ph.D. students from the IT University of Copenhagen set out to expand the "accessibility" niche to any user of computing devices. Their first eye tracker was a free program (ITU GazeTracker) that computed gaze direction from the video stream of an ordinary webcam; however, you had to attach an infrared LED yourself or buy a camera with one built in (the choice was very limited). The program worked, although the accuracy was not acceptable for everyone, and only as long as the head was kept very still (it was used by some paralyzed users in Denmark). The same was true of all free/open-source eye trackers, because ordinary consumer cameras were their main bottleneck.
The creators of ITU GazeTracker (except the leftmost one) and the founders of The EyeTribe

However, Javier San Agustin, Martin Tall, and Henrik Skovsgaard did not stop there and, together with Sune Alstrup Johansen, founded the startup The EyeTribe, which received more than $3M in investment (including from Intel). The startup's goal is to create an affordable product, priced at $99, with quality close to that of devices sold for $20,000. Having opened pre-orders in September and shipped the first batch of devices in December-January, the developers announced that the next batch would be ready by the end of March; in any case, there is no shortage of people wanting one.

ET-1000 by The EyeTribe

It should be noted that the companies dominating eye tracker production, such as Tobii Technologies and SensoMotoric Instruments, have already sensed which way the wind is blowing: for example, two years ago Tobii Technologies released the Tobii REX "exclusively for developers" (i.e., for research or business use) for $1000. Apparently, the product was not very popular: it was discontinued and the remaining stock was sold off at a third of the price. Now the same company, having learned its lesson, is trying to keep up with The EyeTribe's rapid rise by intensively developing the Tobii EyeX for $95/€70 (with a promise to ship by the end of this month), while at the same time trying itself in the new role of patent troll. Most likely, the license will remain the same as for the Tobii REX: why change it if you can get good money from rich Western universities...

So, a month ago I received this:

ET-1000

It is very small, only 20 x 2 x 2 cm, smaller than similar devices from other manufacturers. The box contains a small tripod and a very stiff, thick USB 3.0 cable. Note that USB 3.0 is required; the device will not work over USB 2.0. Nothing else is included, but you can download the installation package with the server and the SDK from the official site. And that is all: there are no ready-made programs for controlling the computer (there are some demos on GitHub, but only as tutorials for developers); the only built-in option is that the server's GUI can bind the mouse cursor to the detected gaze point. The developers write that creating such programs is left to others, while they themselves focus exclusively on the hardware and software for gaze estimation. I am one of those others, and a bit of self-promotion follows below.

Testing the device (it runs on Win7+, with promised support for Android, Mac, and Linux) left an extremely positive impression; I expected less from something that costs one hundredth of a Tobii X2-60. After a 9-point calibration (not as convenient as Tobii's 5-point calibration, but fewer points are not yet possible), the accuracy was about 1 degree (40-50 pixels, slightly worse than the top brands), but with a slight corrective head movement it was very easy to hit the close/minimize window buttons in Win7/8. I did not test for long, but during those 3-4 minutes the accuracy did not degrade, and the eyes and pupils remained detectable almost all the time. In another location with different lighting, the eyes were occasionally "lost" for a short time, but recovery was quick (roughly 100-200 ms). Overall, this is a very decent result for the money; in my opinion, in accuracy and reliability it is only slightly worse than Tobii's products and very close to the SMI products I have happened to work with. I have not yet run formal accuracy tests; perhaps that is a topic for the next article.
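For those wondering how "about 1 degree" turns into "40-50 pixels": a back-of-the-envelope conversion, assuming a typical viewing distance of about 60 cm and a screen density of roughly 38 pixels per cm (a common ~96 DPI monitor). Both numbers are my assumptions, not part of the device specification:

```python
import math

def visual_angle_to_pixels(angle_deg, distance_cm, px_per_cm):
    """Size on screen, in pixels, of a given visual angle at a given distance."""
    size_cm = 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm * px_per_cm

# ~60 cm from the screen, ~38 px/cm: 1 degree comes out to ~40 px,
# and somewhat more on denser screens or at larger viewing distances.
print(visual_angle_to_pixels(1.0, distance_cm=60, px_per_cm=38))  # ≈ 39.8
```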

6 infrared LEDs on each side

Now about the SDK. Like the data server, it is still quite raw (not all of the declared functionality works), but it is being actively developed (version 0.9.20 at the beginning of February, 0.9.27 at the end). Interaction happens over a socket and looks like this (a minimal client sketch follows after the list):
0. The TET server is started (TET is short for The EyeTribe).
1. The client connects to 127.0.0.1:6555/tcp
2. The client sends requests in JSON format, and the server responds in the same format. The packet contents are documented here .
3. In addition, the client sends a "heartbeat" packet every 3 seconds, saying "I'm here, alive" (why this is needed for a TCP connection is a mystery).
4. The client draws the calibration points itself, only notifying the server when to start and stop processing the data used to build the calibration matrix.
5. In push mode, the client receives gaze points without issuing requests, 30 points per second (there is also a 60 Hz mode, but I have not tried it yet).
5a. Each point contains:
- gaze point coordinates: averaged and for each eye separately;
- gaze point coordinates after smoothing by some low-pass filter: averaged and for each eye separately;
- pupil size: for each eye separately (I do not yet know which units are used);
- pupil coordinates within the camera's field of view;
- a fixation flag;
- a server status code.
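To make the flow above concrete, here is a minimal client sketch in Python (the official samples are in C++ and C#; this one is my own illustration). The request and frame field names follow my reading of the packet description linked above, and the newline-based message framing is an assumption; given how raw the protocol still is, they may not match your server version:

```python
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 6555  # the TET server's default address

sock = socket.create_connection((HOST, PORT))

def send(msg):
    # One JSON object per message; newline-separated framing assumed.
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))

def heartbeat():
    # Step 3: confirm every 3 seconds that the client is alive,
    # otherwise the server drops the connection.
    while True:
        send({"category": "heartbeat"})
        time.sleep(3)

threading.Thread(target=heartbeat, daemon=True).start()

# Step 5: switch the tracker into push mode; gaze frames will now
# arrive ~30 times per second without further requests.
send({"category": "tracker", "request": "set",
      "values": {"push": True, "version": 1}})

buf = b""
while True:
    data = sock.recv(4096)
    if not data:
        break  # server closed the connection
    buf += data
    while b"\n" in buf:
        line, buf = buf.split(b"\n", 1)
        msg = json.loads(line)
        frame = msg.get("values", {}).get("frame")
        if frame:  # one gaze point: smoothed average plus fixation flag (5a)
            print(frame["avg"], frame.get("fix"))
```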

There is already a certain amount of code on GitHub for C++ and C# developers, so you do not have to write everything from scratch. However, there is a chance that the packet format and data types will change slightly in the future, making old programs for this device incompatible with new versions of the server. For example, I had to find out the hard way that the official packet description at the link above contained two inaccuracies in the data types, and some packets may carry values that are not described there.
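Given those undocumented fields and type mismatches, it pays to parse each frame defensively rather than trust the specification. A small sketch of the idea; the field names and fallback defaults are my assumptions, not guarantees:

```python
import json

def parse_frame(raw):
    """Extract the needed fields from one frame packet, tolerating
    malformed JSON, missing or extra fields, and mistyped numbers."""
    try:
        msg = json.loads(raw)
    except ValueError:
        return None  # skip malformed packets instead of crashing
    frame = msg.get("values", {}).get("frame")
    if not isinstance(frame, dict):
        return None
    avg = frame.get("avg", {})
    return {
        # coerce types explicitly: the documented type is not always the real one
        "x": float(avg.get("x", 0)),
        "y": float(avg.get("y", 0)),
        "fix": bool(frame.get("fix", False)),
        "state": int(frame.get("state", 0)),
    }
```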

At this point I could simply wish them successful development and new markets, but in conclusion I want to share one problem: there is no single driver or interface for eye trackers that would make them as interchangeable as devices like computer mice. That is, once you have developed software for a specific eye tracker, you cannot simply connect a different one to it; you have to write a new module. About 12 years ago I took up this issue and wrote a platform (middleware) with a single interface on the client side. Whenever a new eye tracker comes into my hands, I write a separate module (DLL) for it, and all my gaze-controlled software built on top of this platform immediately starts working with the device without any extra effort. Naturally, the ETU-Driver (as I called it) already supports the ET-1000 (an accuracy comparison with the Tobii T60 and X2-60 is coming in the near future). But over the years of use its shortcomings have become apparent: first of all, the rigid format of the transmitted data and the COM-based implementation. There has already been one attempt to rewrite everything from scratch, but the feeling remains that the implementation should be different, perhaps more flexible and platform-independent. If anyone has thoughts on how to properly implement a device-independent platform for such devices, please share; a sketch of one possible starting point follows.
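Purely to seed the discussion, here is one shape such a device-independent layer could take: a minimal plugin interface that every tracker module implements, with clients consuming normalized gaze samples. This is my illustration of the idea, not the actual ETU-Driver API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class GazeSample:
    """A normalized gaze sample, identical for every tracker model."""
    timestamp_ms: int
    x: float            # screen coordinates, pixels
    y: float
    pupil_size: float   # device units; cross-device normalization is an open question
    is_fixation: bool

class Tracker(ABC):
    """One implementation per device: ET-1000, Tobii X2-60, SMI, ..."""

    @abstractmethod
    def connect(self) -> None: ...

    @abstractmethod
    def calibrate(self, points: List[Tuple[float, float]]) -> None: ...

    @abstractmethod
    def start(self, on_sample: Callable[[GazeSample], None]) -> None: ...

    @abstractmethod
    def stop(self) -> None: ...
```

Client software would then depend only on Tracker and GazeSample, while device quirks (protocols, units, calibration flows) stay inside the modules, which is essentially what the COM-based ETU-Driver does today, just in a form that would be easier to port.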

Source: https://habr.com/ru/post/214503/

