Have you ever considered that thousands of companies build their products and technologies in isolation from user needs, solving invented problems that are only weakly related to real ones?
While building one of our technologies, we, the Macroscop developers, were one of those companies: for six years we worked on a feature that, in our opinion, was supposed to make life easier and more convenient for thousands of people.
In 2008, we had the idea of making archive search in video surveillance systems as simple as possible. Imagine a medium-sized system of 100 cameras producing roughly 1,000 hours of footage per day (video is usually recorded only when there is activity in the frame). You need to find something in these recordings, but you do not know where or when it happened. You scroll through the footage for an hour, a second, a third, and by the time you find what you need, you are cursing everything.
We decided to create a tool that lets you search video the way Google searches text: describe a person by a set of attributes, for example, wearing a green T-shirt and black jeans, and get everyone who matches those parameters.
We built such a tool and called it the indexer (object indexing technology). It works with the color combinations of the search sample: the object is clustered (regions of the same color are separated out), characteristics are computed for each cluster, and together they form an index. Indices are computed in the same way for every object in the archive, and by comparing them the program offers the operator a set of results: all objects whose indices are close to the index of the sample.
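To make the idea concrete, here is a minimal sketch of that scheme: cluster an object's pixels by color, describe each cluster, and rank archive objects by the distance between indices. This is an illustration only, not Macroscop's actual code; the descriptor (per-cluster mean color plus relative size) and all names are assumptions.

```python
# Illustrative sketch of a color-cluster index, NOT the real Macroscop indexer.
import numpy as np
from sklearn.cluster import KMeans

def build_index(object_pixels: np.ndarray, n_clusters: int = 4) -> np.ndarray:
    """object_pixels: (N, 3) RGB pixels of a detected object.
    Returns an index: per-cluster mean color and relative size, flattened."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(object_pixels)
    clusters = []
    for c in range(n_clusters):
        members = object_pixels[km.labels_ == c]
        mean_color = members.mean(axis=0) / 255.0      # dominant color of the cluster
        weight = len(members) / len(object_pixels)     # fraction of the object it covers
        clusters.append(np.append(mean_color, weight))
    # Sort clusters by size so indices of different objects line up for comparison.
    clusters.sort(key=lambda row: -row[-1])
    return np.concatenate(clusters)

def distance(index_a: np.ndarray, index_b: np.ndarray) -> float:
    """Smaller distance = more similar color combination."""
    return float(np.linalg.norm(index_a - index_b))

# Search: rank all archived objects by distance to the sample's index, e.g.
# results = sorted(archive, key=lambda obj: distance(sample_index, obj.index))
```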
This is how a search pattern is formed in the indexer: the operator can manually paint a human silhouette in the appropriate colors.
Indexer search results are returned as a set of images. The operator picks the right one and then views the corresponding video segment from the archive.

The indexer became our brainchild, one we were to a certain extent obsessed with. We were absorbed in the idea and spent a tremendous amount of effort, time and money on development. We even hired two teams of top-class developers: a team of "physicists", graduates of the Faculty of Physics, and a team of "mathematicians", graduates of the Faculty of Mechanics and Mathematics, who "competed" for a whole year, solving the indexing problem with different methods. By the way, the "physicists" won :)
We ran a lot of amusing experiments to check how the indexer responded to particular lighting and to clothing of various colors. For example, in those days, walking down the corridor of the business center where we worked, you could easily run into men wearing colorful boxer shorts over their suits or jeans. These were our developers testing how the indexer responded to different patterns and textures.
Work on the indexer continued until 2014. We made significant progress and created a genuinely working tool, but recognizing color combinations is a very hard problem, so even after six years the quality of attribute search was far from perfect. All the while, the indexer and the interactive search module built on top of it were available to users: it was sold as a plug-in or provided free of charge as part of the top-tier software edition. We periodically released updates in which something was improved and something else broke. Often the broken part was the indexer, yet almost no one ever contacted the company to report that attribute search did not work. At some point we realized that the inquiries and complaints were not coming simply because no one used it and no one EVEN TRIED to use it. We were doing abstract development, implementing ideas completely divorced from reality.
In 2014, we admitted that our idea of attribute search had failed and that we could not keep going the same way. We decided to make a U-turn.
The plan was to talk in depth with 50 real users of video surveillance systems and find out what they search for, how they search, and what they need; to understand whether they need search at all, or whether intelligent real-time functions matter more to them.
We started meeting and talking with them. During one such meeting we were told: "Your attribute search is interesting in theory, but in practice we often need not just to find a person, but to understand how he moved around the site: where he came from, where he went, when and where he left." Soon another five or seven users independently voiced the same need.
We thought about how to solve this problem within our software and give users data on a person's movements across the cameras of the entire video surveillance system. We started developing inter-camera tracking.
Inter-camera tracking lets you follow the movement of objects (in the current implementation, people) across the fields of view of all cameras and obtain the trajectory of that movement: to understand where a person of interest came from, where he went, and how he moved within the video surveillance system. The technology is built on the same indexer and attribute search. The user selects any person in the frame and sets him as the sample for searching on other cameras. Macroscop then looks for all visually similar objects on the nearest cameras in adjacent time intervals, and the user only needs to confirm the right person in the results, step by step.
The result of inter-camera tracking is the person's path on the site plan, plus a video assembled from movement fragments from different cameras, or an image slideshow, which lets you reconstruct the full picture of his actions: when he appeared at the site, where he was, and when he left.
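The step-by-step loop described above can be sketched roughly as follows. This is a simplified illustration under assumed data structures (sightings with a color index, a camera adjacency map, an operator confirmation callback), not Macroscop's internal implementation.

```python
# Simplified sketch of the inter-camera tracking loop; all names are assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional
import numpy as np

@dataclass
class Sighting:
    camera_id: str
    timestamp: float        # seconds since the start of the archive
    index: np.ndarray       # color index produced by the indexer (see sketch above)

def track(
    start: Sighting,
    archive: List[Sighting],                       # all indexed objects in the archive
    neighbours: Dict[str, List[str]],              # camera adjacency on the site plan
    confirm: Callable[[List[Sighting]], Optional[Sighting]],  # operator picks the true match
    window: float = 120.0,                         # search +/- 2 minutes around the sighting
    max_candidates: int = 10,
) -> List[Sighting]:
    """Extend the trajectory one confirmed sighting at a time."""
    trajectory = [start]
    current = start
    while True:
        # Candidates: objects seen on adjacent cameras within the time window.
        candidates = [
            s for s in archive
            if s.camera_id in neighbours.get(current.camera_id, [])
            and abs(s.timestamp - current.timestamp) <= window
        ]
        # Rank by similarity of color indices to the current sample.
        candidates.sort(key=lambda s: float(np.linalg.norm(s.index - current.index)))
        chosen = confirm(candidates[:max_candidates])  # operator confirms or stops
        if chosen is None:
            break
        trajectory.append(chosen)
        current = chosen
    return trajectory
```

The resulting list of confirmed sightings is what would then be rendered as a path on the site plan or stitched into a video of movement fragments.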
We keep validating this idea at every opportunity. For example, at the MIPS / Securika exhibition last month we presented inter-camera tracking and asked more than a dozen experts from leading companies in the industry about its usefulness. Here is what the in-depth, detailed interviews showed:

Our many years of experience developing the indexer confirmed well-known truths that the books write about:
1. Generating ideas in isolation from reality and then selflessly implementing them is a very risky business. Invented something? Ask 10, 50, 100 users. Better still, do not engage in abstract invention at all, but uncover a real pain point.
2. Once you have found the need and started implementing the solution, start with prototypes. And keep testing your development, again and again, on those same real users. The closer developers stay to the real world, the better the chances that the effort, money and time spent implementing the idea will not be wasted.
3. If people do not criticize your product, most likely they simply do not use it.
4. And finally, the main thing: recognize in time that your idea has failed, and be able to admit it to yourself and your team. Do not be afraid to pivot, analyze past experience, and be prepared for the possibility that the new idea may also fail. But sooner or later, by trying and making mistakes, you will get to success.
We do not yet know whether inter-camera tracking will prove a useful and sought-after feature of our product in practice, so we are approaching its development less fanatically than we did the indexer. Nevertheless, the developers have already done a lot, and inter-camera tracking is included in the current Macroscop release, available both in the full version and in the demo:
macroscop.com/download.html