Autonomous robot of the NAMT team at Robocross 2013 and Eurathlon 2013
Good afternoon! I want to share a report on the autonomous robot of the NAMT team that took part in the Robocross 2013 competition and the European Eurathlon 2013. This time we robotized not an electric car but an electric quad bike, since the system was built with an eye on Eurathlon, the ticket to which was secured by first place at Robocross 2012. A GAZelle van would be too large and hard to control on mountain roads, and its manual transmission alone adds a lot of difficulty.
Briefly about the competition
The Robocross task is described in detail in the article by the AVRORA team, which deservedly took first place in the "Mule" event. At Robocross, the Mule mission was borrowed from last year's Eurathlon: the robot must autonomously follow a marker (not a beacon) and then return to the starting point, avoiding static and dynamic obstacles on the track. At Eurathlon the task was "Autonomous navigation": an unknown road in a mountain forest, with the coordinates of several waypoints that had to be reached. Slopes and ravines included.
I apologize for the size of the photos; on my internet connection they took half a day to upload, with constant dropouts. Thanks a lot, ex-Yota!
Mechanics
As a chassis we chose a Chinese Razor quad bike, which presented a number of problems. The first: it was crooked, in the literal sense of the word. When the wheels were turned past a certain angle, one of them lifted off the ground. The second: the stock wheel drive was not designed for reversing, either mechanically or electrically. This came as a big surprise (of course, we could not find this out before the purchase; the documentation listed weight, dimensions, top speed and marketing fluff). Third: the drive offered no rolling resistance when stopped, and no brake actuator had been budgeted for in the technical design. This brought us much joy and creativity, but much later.

As the steering unit we used a linear stepper actuator from the NPF "Electroprivod" company. At the time it seemed an excellent solution: it reports the absolute position of the rod, which (in theory) uniquely defines the position of the steered wheels. In practice it turned out that under load the drive skips steps and does not always hold the commanded position. This got worse after one strong fellow turned the wheels by hand, under load, at Robocross.

The entire mechanical part, the mounts for all the equipment and so on were made without any industrial machinery (we simply had none), using hand tools and a large number of unprintable expressions, in about two weeks.
Electronics
The structure of the robot included the following "electrical appliances":
National Instruments cRIO-9118 real-time controller with a set of I/O modules;
Advantech UNO-2184G industrial controller running Windows;
HOKUYO UTM-30LX-EW lidar with a viewing angle of 270 °;
HOKUYO UBG-04LX-F01 short-range lidar with a viewing angle of 240 °;
Ubiquiti Rocket M2 Wi-Fi access point with a Ubiquiti AirMax Omni 2G13 antenna;
dome network camera Axis M3114-R;
a strapdown inertial navigation system (SINS);
an industrial 3G router with external antennas;
DC-DC converters;
52 Ah lead-acid batteries;
wires, buttons, toggle switches, twisted pair, terminal blocks and other small things.
Pre-alpha wiring version
This part was not without surprises either: the short-range lidar worked for a few hours and then switched off for good. It was supposed to capture the relief of the road, so its absence was not critical for the "Mule" at Robocross, and a repair was promised by autumn. It was not repaired, but more on that later. Another amusing detail: all the network equipment was powered by 24 V PoE, while the dome camera needed 48 V.
Programs and Algorithms
All software was implemented in the National Instruments LabVIEW graphical programming language. The control system was divided into an "upper" and a "lower" part. The upper level ran on the Advantech UNO-2184G: it processed data from the lidars and the camera and solved the global navigation problem, producing the required trajectory. The lower level ran on the cRIO-9118 and solved local navigation and motion control: receiving and processing data from the SINS, determining the robot's position in space, storing the desired trajectory and the algorithms for following it, and controlling the drives. In the most basic case the robot could operate in teleoperation mode, or follow coordinates without involving the top-level controller at all.
The lower level is simple; it is hard to find anything new or interesting there, although if there are questions, I can elaborate.
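To give a flavor of what the lower level does, here is a minimal sketch of a waypoint-following loop of this general kind. The real system was written in LabVIEW on the cRIO; the function, the "close enough" radius and the steering limit below are my own illustrative assumptions, not the team's code.

```python
import math

# Assumed parameters for this sketch, not the team's actual values.
WAYPOINT_RADIUS = 1.5          # m, "close enough" distance to a waypoint
MAX_STEER = math.radians(25)   # assumed steering limit of the quad bike

def follow_route(pose, route, idx):
    """pose = (x, y, heading_rad); route = list of (x, y) waypoints.
    Returns (steering_angle, new_idx); steering is clamped to the limit."""
    x, y, heading = pose
    tx, ty = route[idx]
    # Advance to the next waypoint once the current one is reached.
    if math.hypot(tx - x, ty - y) < WAYPOINT_RADIUS and idx < len(route) - 1:
        idx += 1
        tx, ty = route[idx]
    # Bearing to the target relative to the robot's heading.
    bearing = math.atan2(ty - y, tx - x) - heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
    steer = max(-MAX_STEER, min(MAX_STEER, bearing))
    return steer, idx
```

In the real controller the steering command would go to the linear actuator and the pose would come from the SINS; this loop only shows the geometry of the "storing the trajectory and following it" part.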
At the "Mule" event, the upper level solved two separate tasks: following the marker, and returning the same way while avoiding obstacles.
After a number of experiments we abandoned the idea of using a color marker: problems with white balance, backlighting and other fun could hurt us despite normalization and exposure compensation. Instead we used a high-contrast geometric marker, with a pattern that should not occur either at the range or in nature.
After preprocessing the image, a marker search was performed, and the PTZ camera was then centered on the marker: the principle of a tracking homing head, implemented in its most classic form. The system proved highly robust to changes in lighting conditions. The output of this module was the camera's pointing angles and the scale of the found object relative to the reference pattern, which made it possible to uniquely determine the distance to the marker. Next, the horizontal pointing angle was computed via the rotation matrix and passed to the local navigation level, which followed the marker at a set distance. If the marker was too close, too far, or lost, the robot stopped and the siren changed its tone (the rules required light and sound signaling while moving), so the guide could tell what was happening to the robot without turning around.

While moving, the robot recorded the coordinates of the points it passed, to be used for the return trajectory. In the second stage, the robot drove back along the recorded trajectory, avoiding the obstacles that the refereeing team had placed on the field. The principle was as follows: the algorithm tried to stay as close as possible to the current waypoint while a picture of the obstacles was built from the lidar data; the free sectors were weighted by their size and by their proximity to the required motion vector. The result was a commanded course as close as possible to the required one, but passing through free space. On reaching the closest feasible approach to the current waypoint, the robot switched to the next one.

An amusing incident occurred in the qualifying runs: a crowd gathered in the finish area, and the robot did not consider it possible to drive around the people, so it turned around and followed the route from the very beginning, re-passing the track in both directions.
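The free-sector weighting can be illustrated with a toy version. This is only a sketch under assumptions: the original weights and sector logic lived in LabVIEW and are not described in detail, so the robot width, "free" range and scoring coefficients below are invented for illustration.

```python
import math

# Invented parameters for this sketch.
ROBOT_WIDTH = 1.0                 # m, assumed vehicle width with margin
SAFE_RANGE = 4.0                  # m, readings beyond this count as "free"
ANGULAR_STEP = math.radians(5)    # assumed scan resolution for this sketch

def pick_course(scan, desired_bearing):
    """scan = list of (angle_rad, range_m) tuples, sorted by angle.
    Returns the bearing of the best free sector, or None if fully blocked."""
    free = [a for a, r in scan if r > SAFE_RANGE]
    # Group consecutive free rays into contiguous sectors.
    sectors, current = [], []
    for a in free:
        if current and a - current[-1] > 1.5 * ANGULAR_STEP:
            sectors.append(current)
            current = []
        current.append(a)
    if current:
        sectors.append(current)
    best, best_score = None, -math.inf
    for sec in sectors:
        width = SAFE_RANGE * (sec[-1] - sec[0])  # arc length at SAFE_RANGE
        if width < ROBOT_WIDTH:
            continue  # too narrow for the robot to fit through
        center = 0.5 * (sec[0] + sec[-1])
        # Weight: prefer wide sectors whose center is near the desired course.
        score = width - 2.0 * abs(center - desired_bearing)
        if score > best_score:
            best, best_score = center, score
    return best
```

With an obstacle dead ahead and open space to the sides, this picks the center of one of the side gaps; when everything within the safe range is blocked it returns None, which matches the crowd incident: no free sector, so the robot turns back.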
Later it turned out that this was caused by a bug in computing the index of the current waypoint in the route array; it only surfaced when the end point could not be reached, but it did look interesting. "Better to leave than to drive over people! Glory to the great Isaac!"
During the competition there was one unpleasant surprise caused by an aspect of working with the lidar that we had not foreseen, and the first attempt was not counted even though the robot completed half the route (the following stage). In the intermediate finish zone (from which the robot has to return autonomously), it simply refused to go; it did not even try to move. The debriefing showed that the people in that area had raised so much dust (very fine and sticky) that it settled on the lidar and looked like an insurmountable wall. Naturally, the collision protection did not let the robot move! As a result, we added a filter to the lidar processing that ignores all obstacles lying within the robot's own outline, and we wiped the lidar with a cloth before each run. During tests the dust had risen behind the robot, and there had been far fewer people in the finish area, so this had never happened before. You cannot foresee everything...
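The dust fix amounts to rejecting any lidar return that falls inside the robot's own outline. A minimal sketch, assuming a simple rectangular footprint centered on the lidar (the real dimensions and the LabVIEW implementation are my assumptions):

```python
import math

# Assumed half-dimensions of the quad bike around the lidar mount, in meters.
HALF_LENGTH = 1.0
HALF_WIDTH = 0.6

def filter_scan(scan):
    """scan = list of (angle_rad, range_m). Drops returns inside the footprint."""
    kept = []
    for angle, rng in scan:
        x = rng * math.cos(angle)   # forward
        y = rng * math.sin(angle)   # left
        if abs(x) <= HALF_LENGTH and abs(y) <= HALF_WIDTH:
            continue  # inside the robot's own outline: dust on the window or a self-hit
        kept.append((angle, rng))
    return kept
```

Dust settling on the sensor window shows up as impossibly close returns, so everything inside the vehicle outline can be discarded safely: a genuine obstacle can never be there.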
The whole system was designed so that after power-up the robot required practically no setup, apart from checking that everything was in order and aligning the SINS. It could align itself automatically, but that required driving twenty to thirty meters in a straight line with an uncertain heading. On power-up, the robot automatically calibrated the steering drive and switched to remote control mode. After that, pressing one of the control buttons on the body started marker following: the robot began searching for the marker and following it. Pressing the next button enabled the return to the starting point. The result was a user-friendly interface and a fully autonomous robot, with no need for any communication with a human at all. Among other things, the robot showed impressive endurance: four to six hours of continuous tests did not discharge the batteries even to half. It could also quite happily tow a person along the asphalt (we had that option early in the shakedown runs).
In the end we took second place, which impressed us very much (especially after the lidar failure in the first run, since placement was decided by the sum of two runs). Our robot had a significant time advantage when turning around in the intermediate finish zone, where the big GAZelles needed a lot of time. It turned out that in just one run the track was covered so well that the robot beat almost all the teams that had scored points over the sum of two runs.
Eurathlon
This year, 13 teams from the European Union and one from Russia participated in the competition.
The news that there would be no second lidar after all was a little discouraging; without it, the original idea made almost no sense. Attempts to use a Kinect showed that it does not register as an obstacle anything lit by bright sunlight. But the gear had been packed and the trip had been paid for, so all that remained was to strive, to seek, to find... and to hide it somewhere new.
I set off on the trip sick, so the first two days I devoted almost exclusively to a fever of 39 °C, paracetamol and tea. Well, I walked around a little, of course, and hung about the site of the future competition, but I was cut out of the work process. I did not miss anything, though: we were waiting for the driver who was bringing the crate with the robot to call back. The driver never did, although he had in fact delivered the robot even earlier than expected. As a result of this long-distance coordination, on-site work began two days late, once the organizers found the crate. But these, as they say, are working moments. Another working moment: on the first day the organizers said there would be electricity at the start; much later it turned out that the teams would not get any and had to bring everything themselves. This added creativity to our setup, improvised from two spare batteries plus red electrical tape instead of the canonical blue. We do not give up!
Berchtesgaden was beautiful, the beer was German and tasty, and almost everyone understood English, so the trip was a success. But that is not the point. The town turned out to be quite small, so I failed to buy an inverter to power the laptops from a battery, although it was an interesting quest: with the help of Google Translate on a phone we translated the required phrase into German, not trusting the locals' knowledge of English, and showed the magic words to every shopkeeper we found, but in vain. The laptops had to run on their own batteries. We also spent a long time looking for local SIM cards for 3G internet; the widest choice turned out to be at gas stations. Keep that in mind!
At the first opportunity, having walked a dozen mountain kilometers on foot, we studied the test track and found that it was a forest road, either asphalted or covered with gravel. An idea of salvation was born immediately: detect vegetation in the camera image and try to go where it is not, without forgetting the direction to the waypoints. Fortunately, the pits and drop-offs were also overgrown with grass, so there was a chance the robot would steer around them. Tall grass also grew along the sides of the road, and the one remaining lidar had not been cancelled.

A period of intensive experiments with vegetation recognition began. We had never played with stereo vision before, so we decided not to try to bolt it on in a couple of days. After long experiments and brainstorming (with swearing; we could not do without it), many options were screened out and a more or less working solution was found. Regardless of illumination and the state of the sky, the ratios of the color components differed significantly between the forest and the road, especially blue and green. The resulting harsh crutch of an algorithm: after a coarse blockwise blur, the ratios of the red and blue components to the green were taken with certain coefficients and thresholded at a computed average border, yielding "forest" and "non-forest" zones. The result was blurred again to kill stray fallen leaves, and from it the zones were built toward which the robot should try to move relative to its current position. And then: head for the next waypoint, trying not to drive off the road. Naturally, the limitations of the quad's drive made it impossible to back out of dead ends in reverse, so we could only rely on luck.
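As a rough reconstruction of the classifier described above, here is a toy version of the color-ratio threshold. The coefficients, the per-pixel ratio and the adaptive mean border are my own guesses at the idea; the blurring steps before and after are omitted, and the original was built from LabVIEW image-processing blocks, not Python.

```python
def vegetation_mask(img, k_r=1.0, k_b=1.0):
    """img: 2D list of (r, g, b) pixel tuples.
    Returns a 2D boolean mask, True where a pixel looks like vegetation
    (red and blue are small relative to green)."""
    # Per-pixel score: weighted red+blue over green (+1 avoids division by zero).
    scores = [[(k_r * p[0] + k_b * p[2]) / (p[1] + 1) for p in row] for row in img]
    flat = [s for row in scores for s in row]
    border = sum(flat) / len(flat)  # the "computed average border" from the text
    return [[s < border for s in row] for row in scores]
```

The appeal of a ratio against green with a per-frame average threshold is exactly what the text describes: it is largely insensitive to overall brightness, so sun or overcast shifts both forest and road scores together without moving them across the border.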
On the last evening before the start we implemented a digital map with the robot's current position: a Google Maps snapshot of the area with a grid overlaid on top and the robot's position plotted on it. It looked just great! We had also, while still at home, set up the ability to monitor and even control the robot over the internet through a server we had left running. Two servers, in fact, in different places (you never know). When we checked from Germany, the latency was 100-150 ms; not bad at all. And anyway, they say Wi-Fi in the forest is not great. But everything turned out to be much more fun than that.

Another interesting discovery was the motor's complete inability to brake when driving downhill (and there were plenty of slopes there). The decision (we still do not give up!) was to tighten the brakes with a pair of condoms. Excellent rubber bands: they do not tear and have good elasticity! True, the brake disc and the electric drive got very, very hot, but we did not need them for long.
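Plotting the robot on a static map snapshot like the one described reduces, for a small area, to linear interpolation between the known corner coordinates of the image. A sketch under assumptions: the function name, corner values and image size are invented, and the team's actual implementation may well have differed.

```python
def latlon_to_pixel(lat, lon, top_left, bottom_right, width, height):
    """Map a GPS fix to pixel coordinates on a north-up map image whose
    corner coordinates are known. Equirectangular approximation: fine for
    an area a few kilometers across, such as a test track."""
    lat0, lon0 = top_left          # (lat, lon) of the image's top-left corner
    lat1, lon1 = bottom_right      # (lat, lon) of the bottom-right corner
    px = (lon - lon0) / (lon1 - lon0) * width    # x grows eastward
    py = (lat0 - lat) / (lat0 - lat1) * height   # y grows southward
    return px, py
```

A grid overlay then only needs the same transform applied to round-number coordinate lines.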
A few words about the competitors
The winning robot was a Volkswagen Touareg fitted with a 3D lidar, a rotating stereo camera, two professional GPS receivers with enormous antennas, and much more inside. The cost of the equipment alone exceeded our budget by an order of magnitude. On top of that, its development had been going on since 2008 at an institute under the auspices of the military. So it was no shame to lose to them with our hand-made 25-kiloeuro contraption put together over a summer. And the others were no worse: one tracked chassis built at a tank factory (no joke) was worth something by itself! In general, we were counting on honorable participation rather than on real competition with these monsters.
Competition
On the day of the competition the weather did not cooperate: it was raining and cold. We had to improvise waterproofing for the steering drive (hello again, red tape). In the morning we checked all the systems and moved to the start. We hooked the Wi-Fi antenna to the board, checked 3G, and were somewhat dismayed: there was no coverage in the launch area. When we mentioned this to an organizer, he happily announced that it had been one of the criteria for choosing the starting point. A high mountain area, slopes all around, and right on the border of Germany and Austria, where whatever cellular signal remained could not decide which country it was in. Wonderful.

And here is our turn. Power on... No communication with the top-level controller! We withdraw from the start and urgently look for a monitor. Fortunately, we found one. It turned out that, having smelled the internet during the morning 3G check, Windows had happily updated itself, and the update conflicted with some drivers. Epic fail. We quickly rolled back the updates, disabled them for good (we had forgotten to earlier, alas), and settled into gloomy waiting for our second chance at the start.

Our turn again. Checks, steering calibration... The mechanism jams in one of the extreme positions (remember the trouble with the wheels being turned under load at Robocross?). It seems we can shake it loose, but everything is shaky and sad. Seeing that the road went mostly uphill, we decided to loosen the condom-brake system. We roll out to the starting point... Go! The robot launched as if it meant to take off: the brake had been loosened, while the speed controller was still tuned for significant braking resistance. Oops. Nevertheless, carving out dangerous millimeters past the grass-covered holes and potholes, the robot rushed forward. The foliage color recognition algorithm, tuned in sunny weather, coped perfectly with the overcast forest. At this point I want to praise Ubiquiti in the most beautiful words.
The Wi-Fi carried perfectly through 100, 200 and more meters of forest; it helped that the road from the start was not too overgrown. Only a couple of meters of earthen bank could stop that miracle of transmit power. Nevertheless, while skirting the edge of yet another thicket, the robot maneuvered unsuccessfully and the steering finally jammed. It was visible in the telemetry: the commanded wheel angle was at maximum, yet we were moving dead straight. It was the beginning of the end. Having somehow covered part of the route, or rather flown through it like a bullet, we withdrew from the race, and I brought the toy back to the start in teleoperation mode (Ubiquiti!), rocking the drive back and forth to set an approximate center position.

The team assembled. From left to right: Andrey Lunev, Bauman MSTU; Sergey Polyakov, NAMT; Vladimir Dubovitsky, Bauman MSTU (that's me); Igor Kudryashov, NAMT, team captain.
After the race we showed the judges the live video, the digital map and other goodies, for which bonus points were awarded. We shut down, packed up and went back to base. We crated and shipped the robot home, and settled in to wait for the judges' decision, not hoping for anything.
The judges deliberated for a long time, but after dark the results were posted, and we went to see how many points our pet had scored. From the side it looked like this: we walked behind the tent to the announcement board, took a look at what was what, and began laughing loudly enough for the whole tent to hear. Third! Third place on points! (Hello to the map and the live video!) To say this was unexpected is to say nothing. All evening we indulged in a dull, tired and joyful amazement. Well, and German beer; where would we be without it. The next day was the award ceremony. Shining like polished samovars, we went off to wait. We received a certificate for third place and, another surprise, an award for the most creative development, based on a vote among the teams. It was called "perfectly simple technology". Apparently we made an impression. And they probably found out about the brake somehow. In general, it