The study of eye movements - saccades and fixations - is one of the most interesting areas of analysis in neuroscience, including the study of emotion. The eyes are a rich channel for collecting data on a person's current state and their responses to environmental stimuli, and an important source of information about physiology, emotions, and cognition in natural, everyday conditions, including the various kinds of communication that occur between people. Without video-oculography data, it would be difficult to talk about multimodality in emotion recognition at all.

Let's not forget that eye tracking is a serious market segment:
According to estimates by the influential agency Markets & Markets, the global eye-tracking market will reach $1,376.5 million by 2023, with annual growth of 27.4% and a stable forecast; shipments of finished eye-tracking units will amount to about 756 thousand per year. Of course, the numbers grow several times over when eye tracking is considered as part of larger industries (for example, neuromarketing, AR/VR, the gaming industry, digital medicine, etc.). In general, the eye-tracking industry has seen a steady reduction in the number of independent players over the past few years. However, the largest manufacturers of oculographic systems - such as the Swedish company Tobii and the Canadian SR Research (EyeLink) - successfully draw additional resources from external sources and strengthen their semi-monopoly positions.
In parallel, corporations are buying up companies and medium-sized startups. For example:
- Google acquired Eyefluence,
- Facebook acquired EyeTribe,
- and Apple, in June 2017, acquired the German company SMI, with its proprietary technology for capturing and recording gaze in real time at a sampling frequency of up to 120 Hz.
There are also unsettling developments. Quite recently, Tobii's market value soared significantly in just 48 hours... Analysts are left guessing and putting forward conspiracy theories.
At Neurodata Lab, we not only monitor the market regularly and develop our own software tracker, but have also accumulated considerable experience working with third-party solutions. Let's discuss them in more detail.
Nowadays, eye tracking (also known as video-oculography) is a popular tool for studying human visual attention. Many psycho-physiological processes (fatigue, cognitive load, emotional reactions, etc.) are reflected in the parameters of eye movements, in the dynamics of blinks, and in changes in pupil width. For practical purposes, eye tracking is currently used mainly for usability research and in neuromarketing. In addition, video-oculography has found use in gaming and in assistive controllers for eye-based control (for example, the Tobii 4C or the earlier Tobii EyeX). Eye tracking also underlies systems for monitoring the attention of drivers and dispatchers (see Sampei et al., 2016; Dongare, Shah, 2016; Aguilar et al., 2017), elements of a “smart home”, and the Eye of Horus, a project to create glasses for controlling objects with one's gaze.
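As an illustration of what working with such parameters involves, here is a minimal sketch of a dispersion-based (I-DT) fixation detector, one of the standard ways to split a raw gaze stream into the fixations and saccades mentioned above; the thresholds are illustrative, not prescriptive:

```python
# Minimal dispersion-based (I-DT) fixation detector: splits raw gaze samples
# into fixations; everything between fixations is treated as saccades.
# Thresholds are illustrative and depend on the tracker and screen geometry.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: list of (timestamp_sec, x, y) gaze points, in order.
    Returns fixations as (start_t, end_t, centroid_x, centroid_y)."""
    fixations = []
    window = []
    for t, x, y in samples:
        window.append((t, x, y))
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        # Dispersion = (max_x - min_x) + (max_y - min_y), in pixels
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # Window is no longer compact: close the fixation if long enough
            done = window[:-1]
            if done and done[-1][0] - done[0][0] >= min_duration:
                cx = sum(p[1] for p in done) / len(done)
                cy = sum(p[2] for p in done) / len(done)
                fixations.append((done[0][0], done[-1][0], cx, cy))
            window = [window[-1]]
    # Flush the trailing window
    if window and window[-1][0] - window[0][0] >= min_duration:
        cx = sum(p[1] for p in window) / len(window)
        cy = sum(p[2] for p in window) / len(window)
        fixations.append((window[0][0], window[-1][0], cx, cy))
    return fixations
```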
Laboratory eye trackers are represented by a limited number of brands (the best known being EyeLink and Tobii) and are not accessible for wide use because of their exorbitant cost. A relatively low-cost commercial option is the GazePoint tracker mounted under the monitor (from $675), but it has a number of drawbacks: a small allowable range of head movement for the subject - only 25x11x15 cm - and rather “raw” software.
Taking into account the current market situation and the growing interest in video-oculography, it can be stated that there is a wide variety of home-made solutions (hardware and software) for designing studies of human oculomotor behavior and analyzing the data, as well as developments in webcam-based tracking.
Eye trackers from scrap materials

An almost complete list of open-source software and self-assembly trackers can be found here. To it, one should add Open Eyes, PupilLabs, and PyGaze. Incidentally, PyGaze's creator Edwin Dalmaijer has published the book “Python for experimental psychologists” with detailed guidance; we recommend adding it to your bookmarks.
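To give a flavor of what PyGaze looks like in use, here is a minimal recording loop built on its Display and EyeTracker classes; which tracker backend it talks to depends on your PyGaze configuration, so treat this as a sketch rather than a ready-made experiment:

```python
# Minimal PyGaze session: calibrate, record one second of gaze samples, quit.
# Assumes PyGaze is installed and configured for your tracker (see its docs).
import time
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker

disp = Display()              # opens the experiment window
tracker = EyeTracker(disp)    # backend is chosen via PyGaze's configuration
tracker.calibrate()           # runs the built-in calibration routine

tracker.start_recording()
t0 = time.time()
while time.time() - t0 < 1.0:
    x, y = tracker.sample()   # current gaze position in display coordinates
    print(x, y)
tracker.stop_recording()

tracker.close()
disp.close()
```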
Webcam-based tracking

Eye-tracking solutions based on an ordinary webcam can be divided into two categories: online platforms (geared mainly toward usability testing) and amateur or commercial SDKs.
Online platforms offer to create an account, set up an experiment (for example, upload a set of images), and send a link to the study participants. The subject is required to grant access to their webcam and to prepare for the experiment (take off glasses, move bright light sources away from the camera, calibrate, and move as little as possible). Obviously, with such an experimental design it is impossible to control the subject's behavior and conditions, so accuracy varies and sometimes leaves much to be desired.
So, taking them in order:
- EyesDecide (Xlabs): a platform with acceptable coarse gaze localization (provided the subject does not move). There is face detection (a 3D model is built) and calibration on 30 points, each presented several times, plus an additional calibration at the end of the test.
- WebGazer: there is face detection. Calibration is performed by the subject, who moves the cursor around the screen and fixes their gaze on it. The tracker cannot be called accurate. Moreover, if you look at one part of the screen while moving the cursor to another, the tracker, all else being equal, prefers to place the gaze on the cursor.
- Eyezag: on this platform you can set up a brief experiment. Testing begins with a calibration (16 points) and ends with one (9 points). The platform has no head-movement tracking, so a possible experiment is limited to a few minutes and the usual request that the subject not move. The results of the demo testing are sent on request. It is quite suitable for approximate gaze localization with a large number of subjects and stream testing.
- UserZoom and Sticky: two more platforms for usability testing with a webcam, but we have not yet managed to see test results. UserZoom sent us examples of their usability studies but did not share a demo version of the software. In the Sticky demo you can set up the scope of an experiment, highlight areas of interest in the images, and run it, but you cannot evaluate the final result. The testing procedure begins with questions about the user's position at the computer, lighting, etc., after which calibration - at least in the demo version offered - does not follow.
Freely available amateur projects and commercial SDKs do not work satisfactorily either, but they are interesting to look at. We mention some of them:
- GazeRecorder: includes a face recognition system and calibration (from 5 to 17 points). A 17-point calibration took rather long to process (almost 2 minutes) and “drifted” after 3-5 seconds.
- TrackEye: a tracker based on a camera connected via USB 2.0; there is also an option to analyze uploaded video. During tracking, several windows are launched alongside the main video showing how the algorithm operates, and they clearly show that the pupil is not tracked correctly.
- GazeTracker: there are settings for detection (pupil, glints), video contrast, and calibration (9, 12, and 16 points; different speeds can be set). Calibration does not adjust to the size of the monitor, even though its resolution can be specified in the options. The pupil detection algorithm is inaccurate even after tinkering with the settings: it sometimes recognizes something else dark and round that resembles a pupil (for example, a fire alarm on the ceiling, or nostrils from a certain angle). The tracker does not take the head position into account and “loses” the eyes at small turns.
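To see why a detector of this kind fires on fire alarms and nostrils, it helps to spell out what a naive “dark and round” search amounts to. The OpenCV sketch below is our own illustration of that failure mode, not GazeTracker's actual algorithm:

```python
# Naive "dark and round" pupil detector (OpenCV 4), to illustrate why such
# trackers fire on anything dark and circular. Illustrative sketch only.
import cv2
import numpy as np

def find_pupil_candidates(gray, dark_thresh=40, min_area=20, max_area=2000):
    # Keep only dark pixels, then look for compact round contours among them
    _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        area = cv2.contourArea(c)
        if not (min_area < area < max_area):
            continue
        perimeter = cv2.arcLength(c, True)
        if perimeter == 0:
            continue
        circularity = 4 * np.pi * area / perimeter ** 2  # 1.0 = perfect circle
        if circularity > 0.7:
            (x, y), r = cv2.minEnclosingCircle(c)
            candidates.append((x, y, r))
    # Anything dark and round passes: nothing here knows it is looking at an eye
    return candidates
```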
- SentiGaze SDK: does not take the head position into account; face detection is below average. During blinks, the SDK throws an error saying it cannot detect a face. It also fails during sharp head turns.
- InSight SDK (Sightcorp): works with uploaded video. It detects a face in frontally shot video, but detection fails when the head turns (loading a video in which the person's head is initially turned to the side produces an error). Eye detection is also of poor quality: on an uploaded 18-second video in which the subject was recorded frontally, 77.2% of the data was lost on the X coordinate and 33.18% on the Y coordinate.
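Loss figures of this kind are straightforward to compute: count the samples for which the SDK returned no valid coordinate. A minimal sketch, assuming gaze samples in which a failed detection comes back as None:

```python
# Share of lost samples per coordinate, as in the InSight figures above.
# Assumes a list of (x, y) samples where a failed detection is None.
def loss_percent(samples):
    if not samples:
        return 0.0, 0.0
    lost_x = sum(1 for x, y in samples if x is None)
    lost_y = sum(1 for x, y in samples if y is None)
    n = len(samples)
    return 100.0 * lost_x / n, 100.0 * lost_y / n

gaze = [(312.0, 240.5), (None, 231.0), (None, None), (305.2, None)]
print(loss_percent(gaze))  # -> (50.0, 50.0)
```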
Undoubtedly, there are many more such projects than listed in our article. So far these developments certainly cannot replace or surpass laboratory eye trackers, but the problem is well known - and an adequate solution (in terms of price-quality ratio) is not far off. This is a complex and interesting task, and a market opportunity that should not be neglected.
Analyzing eye movements on ordinary video requires, at a minimum, additional tracking of head movements and near-perfect face detection, and is complicated by the fact that the pupil occupies an extremely small area of the frame. All these nuances will undoubtedly be taken into account. Summing up, we note that such a technology would make it possible to study behavior thoroughly and to build a detailed “map of emotional reactions” of a person under ordinary conditions, especially in episodes of two-way and multi-party communication - something that eye-tracking glasses are unlikely to achieve.
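To make the “extremely small area” point concrete, a back-of-the-envelope estimate with assumed but typical numbers (a 640x480 webcam with a ~60° horizontal field of view, a subject at 60 cm, a ~4 mm pupil) gives a pupil of only a few pixels across:

```python
# Rough estimate of pupil size in a webcam frame (all parameters assumed).
import math

frame_width_px = 640   # typical webcam resolution
fov_deg = 60.0         # assumed horizontal field of view
distance_mm = 600.0    # subject ~60 cm from the camera
pupil_mm = 4.0         # typical pupil diameter

# Width of the scene visible at the subject's distance
scene_width_mm = 2 * distance_mm * math.tan(math.radians(fov_deg / 2))
px_per_mm = frame_width_px / scene_width_mm
print(round(pupil_mm * px_per_mm, 1), "px")  # ~3.7 px across
```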
Worked on the material:
Maria Konstantinova, researcher at Neurodata Lab, biologist, physiologist, specialist in the visual sensory system, oculography, and oculomotor behavior.
Literature:
Aguilar W.G., Estrella J.I., Lopez W., Abad V. Real-time eye gaze pattern analysis // International Conference on Intelligent Robotics and Applications. 2017. P. 683-694.
Dongare H., Shah S. 2016. V. 4(6). P. 154-157.
Sampei K., Ogawa M., Torres C.C.C., Sato M., Miki N. Mental fatigue monitoring using a wearable transparent eye detection system // Micromachines. 2016. V. 7(2). P. 20.