
A robot that follows your smile. Building a cheap cart for learning ROS. Part 2: software

Moving on to the smile


Having assembled the “burger” according to the scheme from the previous post, let's move on to the software.

Since we are building on an already completed project, it makes sense to follow its instructions. They are here.

Everything is very convenient: from the same place you can download a ready-made image with Raspbian Stretch + ROS + OpenCV and write it to an SD card for the Raspberry Pi. (ROS Kinetic, OpenCV 3.4.1. Yes, newer versions exist, but sometimes it's better to take what works than to build everything from source.)

Nevertheless, despite the convenience, I still had to correct the image slightly, since the original turned out to have a few inconvenient details.

The finished image is here, and further work will be based on it.

Configuring the network (Wi-Fi) and the ROS master on the Raspberry Pi

I strongly recommend using a separate router with its own Wi-Fi for experiments; you can simply create an access point on your phone for this purpose. A lot of packets will fly over Wi-Fi, and it's best that they don't drown in general traffic.

After writing the image to the Raspberry Pi's SD card, configure the network. The initial network settings are as follows:

 interface wlan0
 static ip_address=192.168.43.174/24
 static routers=192.168.43.1
 static domain_name_servers=192.168.43.1

These are stored in /etc/dhcpcd.conf.

So, instead of plugging cables into the Raspberry Pi to change everything, you can simply create an access point with the pretentious name boss and password 1234554321. The Raspberry Pi's address will be 192.168.43.174. Besides SSH, you can reach this address via VNC: login pi, password 123qweasdzxcV.

Set up the ROS master

A small remark for those who have not encountered ROS (Robot Operating System). The ROS master is an intermediary through which the various ROS entities (nodes, services, etc.) communicate. If the ROS master is not running, or is running at the wrong address, the nodes will not see each other.

In our system the ROS master starts automatically at OS boot, and all that is required of us is to specify the IP address for the ROS master in the corresponding system file.

If you have not changed the network settings that are listed above, then you do not need to configure anything.

Otherwise, edit bashrc:

 nano ~/.bashrc 

At the very end of the file, correct both IP addresses for your setup:

 export ROS_MASTER_URI=http://192.168.43.174:11311
 export ROS_HOSTNAME=192.168.43.174

Reboot.

Now, when you open a terminal on the cart, the output will look like this (or with whatever you specified in the settings):

 For all slaves, "export ROS_MASTER_URI=http://192.168.43.174:11311" 

This means that the ROS master is running at the specified IP address.

Controlling the cart over Wi-Fi


First, let's check that the nodes are working.

In terminal:

 rosnode list 

The output will be like this:

/rosout
/uno_serial_node

If nothing appeared, check whether you set the ROS master in the settings as described above and whether the USB cable to the Arduino is connected, then reboot.

After checking, run the first node, which is responsible for movement:

 rosrun rosbots_driver part2_cmr.py 

* This ROS command launches the part2_cmr.py file from the rosbots_driver Python package.

The system will report that the node is running:



Here you can see that the wheel radius and the distance between the wheels are defined. You can adjust these values, and other movement-related ones, in the robot.py file at the path

 /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/examples/coursera_control_of_mobile_robots/part2/full/controller 

since part2_cmr.py itself does not contain these parameters. Open a second terminal and enter rostopic list:



Here you can see that the topic /part2_cmr/cmd_vel has appeared. In this topic, /part2_cmr “listens” to what other nodes tell it and, depending on what they say, controls the movement. That it “listens” rather than “speaks” can be seen with the command:

 rostopic info /part2_cmr/cmd_vel 



Here you can see that /part2_cmr is subscribed to the topic and listens.
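For intuition, here is what the listener does with each message it hears: a velocity command (linear v, angular w) is converted into left and right wheel speeds using the wheel radius and wheel separation mentioned earlier. This is a generic sketch of standard differential-drive kinematics, not the actual robot.py code, and the parameter values in the comment are invented:

```python
def twist_to_wheels(v, w, wheel_radius, wheel_base):
    """Convert a unicycle command into wheel angular velocities.

    v: linear velocity (m/s), w: angular velocity (rad/s),
    wheel_radius and wheel_base in meters (the real values are
    printed by part2_cmr.py at startup).
    Returns (left, right) wheel angular velocities in rad/s.
    """
    v_left = v - w * wheel_base / 2.0   # linear speed at the left wheel
    v_right = v + w * wheel_base / 2.0  # linear speed at the right wheel
    return v_left / wheel_radius, v_right / wheel_radius
```

For example, a pure rotation (v = 0) produces equal and opposite wheel speeds, which is how the cart turns in place.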

* You can "say" something on the topic yourself, without nodes.

For example:

 rostopic pub -1 /wheel_power_left std_msgs/Float32 '{data: 1.0}' 

spin the left wheel forward

 rostopic pub -1 /wheel_power_left std_msgs/Float32 '{data: 0.0}' 

stop the left wheel

 rostopic pub -1 /wheel_power_left std_msgs/Float32 '{data: -1.0}' 

spin the left wheel backward

 rostopic pub -1 /wheel_power_left std_msgs/Float32 '{data: -0.5}' 

spin the left wheel backward, more slowly

The syntax: rostopic pub means we want to publish to a topic; -1 means publish once; /wheel_power_left is the topic we speak into; std_msgs/Float32 is the language (message type); '{data: -0.5}' is what we say.

Now let's run something that will talk in the topic /part2_cmr/cmd_vel. This will be the node that sends keyboard commands.

Without closing the previous terminal with the running node, open another one and enter:

 rosrun teleop_twist_keyboard teleop_twist_keyboard.py /cmd_vel:=/part2_cmr/cmd_vel 

* Since it publishes to the topic /cmd_vel by default, we remap it with /cmd_vel:=/part2_cmr/cmd_vel so that messages go to /part2_cmr/cmd_vel.

The control node has started, and you can practice by pressing keys on the keyboard:



If the robot cannot move, or there is a faint squeak from under the wheels, increase the speed by pressing “w” in the terminal with the running node. The same (increase or decrease) applies to the rotation speed, controlled with the “e” key. Note that the control keys only work while you are in the terminal with the running node; if you switch to another terminal they stop responding. The “k” key in the control terminal stops the robot.

In a separate terminal, let's look at the topic /part2_cmr/cmd_vel:



Now the topic /part2_cmr/cmd_vel has both a speaker and a listener.

Following the line with OpenCV


Before driving anywhere, make sure the robot responds to keyboard control. An important remark here: with the keyboard control from the example above, a left turn should correspond to pressing j, a right turn to l (Latin l), forward to i, and backward to , (comma). If that is not the case for you, there may be problems driving. To bring everything back to normal, swap the wire pairs going from the motor driver to the Arduino pins (4, 5, 6, 7 in our burger): swap 4, 5 with 6, 7, or swap 4 with 5 and 6 with 7, depending on which way the wheels spin. You can also do this in software by adjusting the Arduino code at /home/pi/gitspace/rosbots_driver/platformio/rosbots_firmware/examples/motor_driver/src/main.cpp:

 #define M_LEFT_PWM 6
 #define M_LEFT_FR 7
 #define M_RIGHT_PWM 5
 #define M_RIGHT_FR 4

and re-uploading it to the Arduino with the command:

 upload_firmware ~/gitspace/rosbots_driver/platformio/rosbots_firmware/examples/motor_driver 
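For reference, the firmware drives each motor through a PWM pin (speed) and an FR pin (direction), which is why swapping pin numbers in the defines above flips a wheel. A hypothetical Python sketch of that power-to-pins mapping; the real logic lives in main.cpp and may differ in details:

```python
def wheel_power_to_pins(power):
    """Map a wheel power command in [-1.0, 1.0] to hardware pin values.

    Returns (pwm_duty, forward):
      pwm_duty: 0..255 duty cycle for the M_*_PWM pin,
      forward:  level for the M_*_FR (forward/reverse) pin.
    This is an illustrative sketch of what the firmware does,
    not a copy of the main.cpp code.
    """
    power = max(-1.0, min(1.0, power))   # clamp to the valid range
    forward = power >= 0.0               # direction pin level
    duty = int(abs(power) * 255)         # magnitude -> 8-bit PWM duty
    return duty, forward
```

So the rostopic pub examples above with data 1.0, 0.0, and -0.5 would correspond to full forward, stop, and half-speed reverse on that wheel.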

Working with colors

Our color experiment will be to pick out, by its color, the line on the floor that the robot will follow. By default the robot does not see it. As the line you can use yellow adhesive tape, electrical tape, or anything else with a distinctive color that is reasonably wide. * Transparent tape is unlikely to work, as it is hard to distinguish from the background.

Let's go into the folder and run the script:

 cd /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/rosbots_driver
 python bgr-to-hsv.py

* Attention! If you use the original rosbots image rather than mine, this program is not in it.

Two windows will open:



These are color ranges in HSV. What HSV is and why not RGB, please google yourself.

h1, s1, v1 set the lower bound of the range and h2, s2, v2 the upper bound.
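To make the sliders less mysterious, here is a pure-Python sketch of the per-pixel test that OpenCV's inRange performs: convert a BGR pixel to OpenCV-style HSV (H ranges over 0..180 for 8-bit images) and compare it against the lower and upper bounds. The script itself uses cv2 for this; the code below is only an illustration:

```python
def bgr_to_hsv(b, g, r):
    """Convert one 8-bit BGR pixel to OpenCV-style HSV (H in 0..180)."""
    b, g, r = b / 255.0, g / 255.0, r / 255.0
    mx, mn = max(b, g, r), min(b, g, r)
    d = mx - mn
    if d == 0:
        h = 0.0                              # gray pixel: hue undefined
    elif mx == r:
        h = (60 * (g - b) / d) % 360
    elif mx == g:
        h = 120 + 60 * (b - r) / d
    else:
        h = 240 + 60 * (r - g) / d
    s = 0.0 if mx == 0 else d / mx
    return int(h / 2), int(s * 255), int(mx * 255)

def in_range(hsv, lower, upper):
    """The per-pixel check done by cv2.inRange."""
    return all(lo <= c <= hi for c, lo, hi in zip(hsv, lower, upper))
```

Pixels for which in_range is true become white in the result window; everything else becomes black.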

Now, by moving the sliders in the window, select the tape line on the floor (it may be adhesive tape rather than electrical tape). Only the tape line should remain in the result window:



The tape line is unnaturally white, everything else is black. This is the result we need.
Write down the HSV range numbers; in my case they are 56,155,40 and 136,255,255. The ranges will differ under different lighting conditions around the robot's camera.

Close the windows with Ctrl+C in the terminal and add the HSV ranges to the follow_line_step_hsv.py file:

 cd /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/rosbots_driver
 nano follow_line_step_hsv.py

In the lines:

 lower_yellow = np.array([21,80,160])
 upper_yellow = np.array([255,255,255])

Put your HSV range numbers there.

Time to ride the line

We start the motor node in terminal 1:

 rosrun rosbots_driver part2_cmr.py 

Launch the camera node in the second terminal:

 sudo modprobe bcm2835-v4l2
 roslaunch usb_cam usb_cam-test.launch

Run the opencv node in the third terminal:

 cd /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/rosbots_driver
 python follow_line_step_hsv.py

If everything went well, then the robot will go along the line, and an additional window will appear:



In this window, the electrical tape will be marked with a red circle.

The general idea of the code: select a color segment at a certain distance from the camera, draw a red circle on it, and drive toward the circle, trying to keep it in the center.
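The “keep it in the center” part boils down to a proportional controller on the circle's horizontal offset. A simplified sketch; the gain and speed values below are invented, not taken from follow_line_step_hsv.py:

```python
def steer_to_target(cx, image_width, k_turn=2.0, forward_speed=0.1):
    """Return (linear, angular) velocities that push a detected target
    toward the image center.

    cx: x coordinate of the detected blob's center, in pixels.
    Sign convention per ROS REP 103: positive angular z turns left.
    """
    # Normalized offset: -1.0 at the left image edge, +1.0 at the right.
    err = (cx - image_width / 2.0) / (image_width / 2.0)
    angular = -k_turn * err  # target right of center -> turn right
    return forward_speed, angular
```

The resulting pair would be published as a Twist message into /part2_cmr/cmd_vel, just like the keyboard node does.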

Finally, the important part: cats and smiles


Since our goal is to drive toward a cat or a smiling person, we will need something more complex in our code. We will also need cats and smiling people. The latter is harder these days: few people smile in such difficult, anxious times. So let's start with cats.

For the experiments, frontal photos of cats will do.

Run the camera node in the 1st terminal:

 cd /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/rosbots_driver
 python pi_camera_driver.py

In the 2nd terminal, the motor node:

 rosrun rosbots_driver part2_cmr.py 

In the 3rd terminal, the cat search node:

 cd /home/pi/rosbots_catkin_ws/src/rosbots_driver/scripts/rosbots_driver
 python follow_cat2.py

The cart will gradually move to the cat:



Now we need a volunteer who knows how to smile. Let's take a portrait of a little-known public figure from a small country.

Close the cat search node in the 3rd terminal with Ctrl+C and instead launch the search for a smile on the face of the little-known public figure:

 python follow_smile.py 

The cart should slowly and warily drive toward the smile of the little-known figure:



As many have probably guessed, the scripts we ran use Haar cascades. On the same principle as line following, a rectangle around the detected area is highlighted, and the program tries to keep it in the center by moving the robot.
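When a cascade returns several detections in one frame, a simple strategy is to follow the largest rectangle (presumably the closest cat). A sketch, assuming detections come as (x, y, w, h) tuples like those returned by OpenCV's detectMultiScale; the actual scripts may pick their target differently:

```python
def pick_target(detections):
    """Choose the largest detection by area, or None if nothing was found."""
    if not detections:
        return None
    return max(detections, key=lambda box: box[2] * box[3])

def box_center_x(box):
    """Horizontal center of an (x, y, w, h) detection, in pixels."""
    x, y, w, h = box
    return x + w / 2.0
```

The chosen box's center is then fed to the same kind of proportional steering as in the line-following example.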

Unfortunately, performance on the Raspberry Pi 3B leaves much to be desired, even with the camera set to 320x240 at 15 fps. Delays grow noticeably over time. Not every cat can stand it.

How can this be improved?

Try rebuilding an optimized OpenCV, as Adrian recommends (https://www.pyimagesearch.com/2017/10/09/optimizing-opencv-on-the-raspberry-pi/)? Offload image processing to an external PC? Try not to compress the images to JPEG before they fly off to the Haar handler? Another big minus: cats must be large and facing the camera, about 15 cm across on an A4 sheet. Farther from the camera, the cat becomes unrecognizable and invulnerable. Put an 8x monocle on the Raspberry camera?
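One more cheap mitigation that requires no rebuilding: run the expensive detector only on every Nth frame and reuse the last result in between, at the cost of slightly stale detections. A hypothetical sketch, not present in the article's scripts:

```python
class FrameSkipper(object):
    """Run an expensive per-frame function only every `n` frames,
    returning the cached result for the frames in between."""

    def __init__(self, detect_fn, n=3):
        self.detect_fn = detect_fn  # e.g. a Haar cascade call
        self.n = n
        self.count = 0
        self.last = None

    def process(self, frame):
        if self.count % self.n == 0:
            self.last = self.detect_fn(frame)  # the expensive call
        self.count += 1
        return self.last
```

With n=3 the cascade runs at 5 Hz instead of 15 Hz, which may be enough when the cat moves slowly.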

P.S. If you get around to experimenting with the image provided in the article, you can also chase various body parts by launching, instead of the cat node:

 python follow_fullbody.py
 python follow_upperbody.py
 python follow_lowerbody.py

face or eye:

 python follow_face.py
 python follow_right_eye.py

If you are interested in how to decelerate smoothly so the robot does not spill its tea, or how to control it from something other than the Raspberry Pi itself, write in the comments.

Source: https://habr.com/ru/post/461131/

