
Programming robots with the free V-REP simulator: first steps



Robot programming is interesting.

Many have probably seen Japanese humanoid robots, the French educational robot NAO, or the interesting Baxter trainable manipulator project. KUKA industrial manipulators from Germany are a classic. Some people program conveyor systems (filtering, sorting), or delta robots. There is a whole field of quadcopter control and stabilization algorithms. And, of course, the simple workhorse of competitions: the line follower.
But, as a rule, none of these are cheap toys, so access to robots exists mainly in specialized laboratories, or in schools and universities that have received funding and work in these areas. Everyone else who is interested in robotics can only look on enviously.

Some time ago I came across a rather interesting system: the V-REP 3D robot simulator from the Swiss company Coppelia Robotics.

To my pleasant surprise, I discovered that in this system all programmed objects "live" in a world governed by physical laws: there is gravity, objects can be grasped, there are collisions, distance sensors, vision sensors, and so on.

Having worked for some time with this system, I decided to tell Habr readers about it.

The picture above is a screenshot from V-REP, showing robot models that you can program and whose behavior you can watch right on your computer.

Installation

Install the system on your computer; it can be downloaded from the Download section of the site:


We see three options: educational (EDU), trial (EVAL), and player (player).

The player is a program with which you can play back scenes created in the full version (that is, there is no editing capability). It is free.

The trial version is a fully functional version in which there is no ability to save. It has no license restrictions.

Educational is a full-featured package with licensing restrictions; the text of the license can be read here. Its essence is that institutions, schools, and hobbyists can use this software for free, provided the use is educational and non-commercial.

We fit the definition of hobbyists quite well (since we want to program robots), so feel free to download the PRO EDU version for your operating system.

At the time of writing, the current version is 3.2.0; here is a direct link to the Windows build: V-REP_PRO_EDU_V3_2_0_Setup (98 MB)

Start

After installation and launch, we will see this screen:


Here we see the following objects:

- the scene - this is where all the action takes place; at the moment it is empty (there is only the floor)
- on the left is the model library: folders on top, and below them the contents of the selected folder (here robots / non-mobile is selected - that is, stationary robots, i.e. manipulators)
- next to it, the scene hierarchy is displayed

The hierarchy starts with the root object (the world), which contains all other objects.

In our example, this is:


We see light sources, an object that implements the floor (a solid surface with a texture), and a group of cameras.

There is a main script that controls the scene and all objects on it, and each object can also have its own child script; internal scripts are written in the Lua language.
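The Asti script shown later in this article follows exactly this pattern: one script body that V-REP calls repeatedly, dispatching on the call type. Here is a minimal self-contained sketch of that structure. The constant values and the phases table are stand-ins added purely so the dispatch logic can run outside the simulator, where the real sim* API does not exist:

```lua
-- Sketch of a V-REP child script's three-phase structure.
-- The sim_childscriptcall_* constants below are stand-ins: inside the
-- simulator V-REP defines them itself.
sim_childscriptcall_initialization = 1  -- stand-in value
sim_childscriptcall_actuation = 2       -- stand-in value
sim_childscriptcall_cleanup = 3         -- stand-in value

phases = {}  -- records which phases ran, for illustration only

function childScript(sim_call_type)
  if (sim_call_type == sim_childscriptcall_initialization) then
    -- runs once when the simulation starts: resolve object handles, set up UI
    phases[#phases + 1] = "init"
  end
  if (sim_call_type == sim_childscriptcall_actuation) then
    -- runs on every simulation step: read inputs, drive joints
    phases[#phases + 1] = "actuation"
  end
  if (sim_call_type == sim_childscriptcall_cleanup) then
    -- runs once when the simulation stops
    phases[#phases + 1] = "cleanup"
  end
end

-- Emulate one short simulation run: init, three steps, cleanup.
childScript(sim_childscriptcall_initialization)
for _ = 1, 3 do childScript(sim_childscriptcall_actuation) end
childScript(sim_childscriptcall_cleanup)
```

Inside the simulator you never call the script yourself: V-REP invokes it with the appropriate call type at each stage of the simulation.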

At the top and on the left are the toolbars and menus. The most important button is Play (Start Simulation), which starts the scene simulation:


The typical workflow is the following:
- drag objects from the model library onto the scene (drag and drop)
- adjust their positions
- configure the scripts
- start the simulation
- stop the simulation

Let's try something in practice.

Fast start

Let's try to bring a robot to life.

To do this, select the robots / mobile folder on the left, pick Asti in the list, drag it onto the scene, and release it. The robot appears on our scene, along with information about its author:


Now click Start Simulation and watch the robot walk. We can also control the position of its head and arms (implemented through a Custom User Interface). Here is a video:



Next, stop the simulation:


Control script

We can open and examine the code that makes the robot walk (it controls the robot's autonomous movement). To do this, in the object hierarchy, double-click the "file" icon next to the Asti model:


Here is the Lua program that drives the robot's movement:
Asti robot motion control script
if (sim_call_type==sim_childscriptcall_initialization) then
    asti=simGetObjectHandle("Asti")
    lFoot=simGetObjectHandle("leftFootTarget")
    rFoot=simGetObjectHandle("rightFootTarget")
    lPath=simGetObjectHandle("leftFootPath")
    rPath=simGetObjectHandle("rightFootPath")
    lPathLength=simGetPathLength(lPath)
    rPathLength=simGetPathLength(rPath)
    ui=simGetUIHandle("astiUserInterface")
    simSetUIButtonLabel(ui,0,simGetObjectName(asti).." user interface")
    dist=0
    correction=0.0305
    minVal={0,          -- Step size
            0,          -- Walking speed
            -math.pi/2, -- Neck 1
            -math.pi/8, -- Neck 2
            -math.pi/2, -- Left shoulder 1
            0,          -- Left shoulder 2
            -math.pi/2, -- Left forearm
            -math.pi/2, -- Right shoulder 1
            0,          -- Right shoulder 2
            -math.pi/2} -- Right forearm
    rangeVal={2,         -- Step size
              0.8,       -- Walking speed
              math.pi,   -- Neck 1
              math.pi/4, -- Neck 2
              math.pi/2, -- Left shoulder 1
              math.pi/2, -- Left shoulder 2
              math.pi/2, -- Left forearm
              math.pi/2, -- Right shoulder 1
              math.pi/2, -- Right shoulder 2
              math.pi/2} -- Right forearm
    uiSliderIDs={3,4,5,6,7,8,9,10,11,12}
    relativeStepSize=1
    nominalVelocity=0.4
    neckJoints={simGetObjectHandle("neckJoint0"),simGetObjectHandle("neckJoint1")}
    leftArmJoints={simGetObjectHandle("leftArmJoint0"),simGetObjectHandle("leftArmJoint1"),simGetObjectHandle("leftArmJoint2")}
    rightArmJoints={simGetObjectHandle("rightArmJoint0"),simGetObjectHandle("rightArmJoint1"),simGetObjectHandle("rightArmJoint2")}
    -- Now apply current values to the user interface:
    simSetUISlider(ui,uiSliderIDs[1],(relativeStepSize-minVal[1])*1000/rangeVal[1])
    simSetUISlider(ui,uiSliderIDs[2],(nominalVelocity-minVal[2])*1000/rangeVal[2])
    simSetUISlider(ui,uiSliderIDs[3],(simGetJointPosition(neckJoints[1])-minVal[3])*1000/rangeVal[3])
    simSetUISlider(ui,uiSliderIDs[4],(simGetJointPosition(neckJoints[2])-minVal[4])*1000/rangeVal[4])
    simSetUISlider(ui,uiSliderIDs[5],(simGetJointPosition(leftArmJoints[1])-minVal[5])*1000/rangeVal[5])
    simSetUISlider(ui,uiSliderIDs[6],(simGetJointPosition(leftArmJoints[2])-minVal[6])*1000/rangeVal[6])
    simSetUISlider(ui,uiSliderIDs[7],(simGetJointPosition(leftArmJoints[3])-minVal[7])*1000/rangeVal[7])
    simSetUISlider(ui,uiSliderIDs[8],(simGetJointPosition(rightArmJoints[1])-minVal[8])*1000/rangeVal[8])
    simSetUISlider(ui,uiSliderIDs[9],(simGetJointPosition(rightArmJoints[2])-minVal[9])*1000/rangeVal[9])
    simSetUISlider(ui,uiSliderIDs[10],(simGetJointPosition(rightArmJoints[3])-minVal[10])*1000/rangeVal[10])
end

if (sim_call_type==sim_childscriptcall_cleanup) then
end

if (sim_call_type==sim_childscriptcall_actuation) then
    -- Read desired values from the user interface:
    relativeStepSize=minVal[1]+simGetUISlider(ui,uiSliderIDs[1])*rangeVal[1]/1000
    nominalVelocity=minVal[2]+simGetUISlider(ui,uiSliderIDs[2])*rangeVal[2]/1000
    simSetJointTargetPosition(neckJoints[1],minVal[3]+simGetUISlider(ui,uiSliderIDs[3])*rangeVal[3]/1000)
    simSetJointTargetPosition(neckJoints[2],minVal[4]+simGetUISlider(ui,uiSliderIDs[4])*rangeVal[4]/1000)
    simSetJointTargetPosition(leftArmJoints[1],minVal[5]+simGetUISlider(ui,uiSliderIDs[5])*rangeVal[5]/1000)
    simSetJointTargetPosition(leftArmJoints[2],minVal[6]+simGetUISlider(ui,uiSliderIDs[6])*rangeVal[6]/1000)
    simSetJointTargetPosition(leftArmJoints[3],minVal[7]+simGetUISlider(ui,uiSliderIDs[7])*rangeVal[7]/1000)
    simSetJointTargetPosition(rightArmJoints[1],minVal[8]+simGetUISlider(ui,uiSliderIDs[8])*rangeVal[8]/1000)
    simSetJointTargetPosition(rightArmJoints[2],minVal[9]+simGetUISlider(ui,uiSliderIDs[9])*rangeVal[9]/1000)
    simSetJointTargetPosition(rightArmJoints[3],minVal[10]+simGetUISlider(ui,uiSliderIDs[10])*rangeVal[10]/1000)
    -- Get the desired position and orientation of each foot from the paths
    -- (you can also use a table of values for that):
    t=simGetSimulationTimeStep()*nominalVelocity
    dist=dist+t
    lPos=simGetPositionOnPath(lPath,dist/lPathLength)
    lOr=simGetOrientationOnPath(lPath,dist/lPathLength)
    p=simGetPathPosition(rPath)
    rPos=simGetPositionOnPath(rPath,(dist+correction)/rPathLength)
    rOr=simGetOrientationOnPath(rPath,(dist+correction)/rPathLength)
    -- Now we have the desired absolute position and orientation for each foot.
    -- Transform the absolute position/orientation into the robot's frame,
    -- then modulate the movement forward/backward with the desired "step size",
    -- then transform back into absolute position/orientation:
    astiM=simGetObjectMatrix(asti,-1)
    astiMInverse=simGetInvertedMatrix(astiM)
    m=simMultiplyMatrices(astiMInverse,simBuildMatrix(lPos,lOr))
    m[8]=m[8]*relativeStepSize
    m=simMultiplyMatrices(astiM,m)
    lPos={m[4],m[8],m[12]}
    lOr=simGetEulerAnglesFromMatrix(m)
    m=simMultiplyMatrices(astiMInverse,simBuildMatrix(rPos,rOr))
    m[8]=m[8]*relativeStepSize
    m=simMultiplyMatrices(astiM,m)
    rPos={m[4],m[8],m[12]}
    rOr=simGetEulerAnglesFromMatrix(m)
    -- Finally apply the desired positions/orientations to each foot.
    -- We simply apply them to two dummy objects that are then handled
    -- by the IK module to automatically calculate all leg joint desired values.
    -- Since the leg joints operate in hybrid mode, the IK calculation results
    -- are then automatically applied as the desired values during dynamics calculation.
    simSetObjectPosition(lFoot,-1,lPos)
    simSetObjectOrientation(lFoot,-1,lOr)
    simSetObjectPosition(rFoot,-1,rPos)
    simSetObjectOrientation(rFoot,-1,rOr)
end
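One detail of this script worth spelling out is the slider arithmetic. V-REP custom UI sliders report integer positions from 0 to 1000, so the script maps each slider to its physical range (a joint angle, a speed) with minVal + slider * range / 1000, and uses the inverse formula when initializing the sliders from the current joint positions. A small self-contained sketch of that round trip:

```lua
-- V-REP UI sliders report positions as integers in 0..1000. The Asti script
-- converts a slider position to a physical value and back with these formulas:
function sliderToValue(slider, minVal, range)
  return minVal + slider * range / 1000
end

function valueToSlider(value, minVal, range)
  return (value - minVal) * 1000 / range
end

-- Example: the "Neck 1" joint spans -pi/2 .. pi/2
-- (minVal = -math.pi/2, range = math.pi in the script's tables).
minVal, range = -math.pi / 2, math.pi
mid = sliderToValue(500, minVal, range)   -- centered slider -> angle 0
back = valueToSlider(mid, minVal, range)  -- and back to slider position 500
```

The same mapping is reused for all ten parameters; only minVal and range differ per row of the tables.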


Other models

You can delete a model: select it and press Del. You can also try other models in action; some of them have scripts for autonomous operation.

Mobile robots


Stationary robots (manipulators)


Scene examples

A large number of example scenes also come with the program. To view them, select the menu "File / Open scenes" and go to the folder "V-REP3 / V-REP_PRO_EDU / scenes".

Here are sample scenes (files with the * .ttt extension):
Sample scene files
2IndustrialRobots.ttt
3DoFHolonomicPathPlanning.ttt
6DoFHolonomicPathPlanning.ttt
BarrettHandPickAndPlace.ttt
blobDetectionWithPickAndPlace.ttt
ConstraintSolverExample.ttt
controlTypeExamples.ttt
e-puckDemo.ttt
environmentMapping.ttt
externalIkDemo.ttt
fabricationBlocks.ttt
fastClientServerCommunication.ttt
forwardAndInverseKinematics1.ttt
forwardAndInverseKinematics2.ttt
gearMechanism.ttt
genericDialogDemo.ttt
ghostDemo.ttt
ImageProcessingExample.ttt
inverseKinematicsOf144DofManipulator.ttt
jansenMechanism.ttt
katanaRobotWithCableSimulation.ttt
khepera3.ttt
LineTracer-threaded.ttt
millingMachine.ttt
millingRobot.ttt
motionPlanningAndGraspingDemo.ttt
motionPlanningDemo1.ttt
motionPlanningDemo2.ttt
motionPlanningDemo3.ttt
mouseTestScene.ttt
naturalSelectionAlgo.ttt
NonHolonomicPathPlanning.ttt
objectHandling.ttt
PaintingRobot.ttt
ParallelForwardAndInverseKinematics.ttt
practicalPathPlanningDemo.ttt
proximitySensorDemo.ttt
reflexxesMotionLibraryType4Demo.ttt
robotCollaboration1.ttt
robotCollaboration2.ttt
robotLanguageControl.ttt
rosTopicPublisherAndSubscriber.ttt
SocketAndTubeCommunicationExample.ttt
StripeScanner.ttt
weldingRobot.ttt
wirelessTransmission.ttt
youBotAndHanoiTower.ttt


Links

* V-REP main site
* user manual (in English)
* a large number of videos, examples from V-REP

To help popularize this interesting system among Russian speakers, a Russian-language V-REP group has been created.

Application in the educational process

In my opinion, V-REP has good potential for use in education. If you are interested in applying the system in the educational process - at a school, university, robotics club, etc. - you can fill out the questionnaire. It may be possible to pool efforts and create educational materials in Russian.

Future plans


Of course, this is only a small part of what the V-REP system can do. In the next publication, we will walk through an example: building a first-person racing simulator with a robot car. We will cover the API, creating objects, setting up the scene, and user interaction.

Source: https://habr.com/ru/post/253357/

