Simulation in ROS

In order to make simulations with our robots in ROS, we are going to use Gazebo.

Gazebo (http://gazebosim.org/) is a multi-robot simulator for complex indoor and outdoor environments. It is capable of simulating a population of robots, sensors, and objects in a three-dimensional world. It generates both realistic sensor feedback and physically plausible interactions between objects.

Gazebo is now independent of ROS and is installed as a standalone package in Ubuntu. In this section, we will learn how to interface Gazebo with ROS. You will learn how to use the model created before, how to include a laser sensor and a camera, and how to move the model as if it were a real robot.

Using our URDF 3D model in Gazebo

We are going to use the model that we designed in the last section, but to make it simple, we won't include the arm.

Make sure that you have Gazebo installed by typing the following command in a terminal:

$ gazebo

Before starting to work with Gazebo, we will install the ROS packages that interface Gazebo with ROS:

$ sudo apt-get install ros-kinetic-gazebo-ros-pkgs ros-kinetic-gazebo-ros-control

The Gazebo GUI should have opened when you ran the gazebo command. Assuming that all is working well, we will now prepare our robot model to be used in Gazebo. You can test the integration of Gazebo with ROS using the following command and checking that the GUI opens:

$ roscore & rosrun gazebo_ros gazebo

To introduce the robot model in Gazebo, we must complete the URDF model by declaring a few more elements. We will keep using the .xacro file; although this may be more complex, it is more powerful for developing the code. You will find a file with all the modifications at chapter4_tutorials/robot1_description/urdf/robot1_base_01.xacro.

<link name="base_link">
  <visual>
    <geometry>
      <box size="0.2 0.3 0.1"/>
    </geometry>
    <origin rpy="0 0 1.54" xyz="0 0 0.05"/>
    <material name="white">
      <color rgba="1 1 1 1"/>
    </material>
  </visual>
  <collision>
    <geometry>
      <box size="0.2 0.3 0.1"/>
    </geometry>
  </collision>
  <xacro:default_inertial mass="10"/>
</link>

This is the new code for base_link, the chassis of the robot. Notice that the collision and inertial sections are necessary for running the model in Gazebo, so that it can calculate the physics of the robot.
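The xacro:default_inertial macro used in the link above is not a xacro built-in; it must be defined in (or included by) your .xacro file. A minimal sketch of such a macro, assuming a simple diagonal inertia tensor and that the xacro XML namespace is declared in the root element, could look like this:

```xml
<!-- Hypothetical helper macro: expands to an <inertial> block with
     the given mass and a placeholder diagonal inertia tensor. -->
<xacro:macro name="default_inertial" params="mass">
  <inertial>
    <mass value="${mass}" />
    <inertia ixx="1.0" ixy="0.0" ixz="0.0"
             iyy="1.0" iyz="0.0" izz="1.0" />
  </inertial>
</xacro:macro>
```

Invoking `<xacro:default_inertial mass="10"/>` then expands to an inertial block with a mass of 10; for a more faithful simulation you would compute the inertia values from the actual geometry.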

To launch everything, we are going to create a new .launch file. Create a new file with the name gazebo.launch in the chapter4_tutorials/robot1_gazebo/launch folder and put in the following code:

<?xml version="1.0"?>

<launch>
  <!-- these are the arguments you can pass this launch file, for example paused:=true -->
  <arg name="paused" default="true" />
  <arg name="use_sim_time" default="false" />
  <arg name="gui" default="true" />
  <arg name="headless" default="false" />
  <arg name="debug" default="true" />
  <!-- We resume the logic in empty_world.launch, changing only the name of the world to be launched -->
  <include file="$(find gazebo_ros)/launch/empty_world.launch">
    <arg name="world_name" value="$(find robot1_gazebo)/worlds/robot.world" />
    <arg name="debug" value="$(arg debug)" />
    <arg name="gui" value="$(arg gui)" />
    <arg name="paused" value="$(arg paused)" />
    <arg name="use_sim_time" value="$(arg use_sim_time)" />
    <arg name="headless" value="$(arg headless)" />
  </include>
  <!-- Load the URDF into the ROS Parameter Server -->
  <arg name="model" />
  <param name="robot_description" command="$(find xacro)/xacro.py $(arg model)" />
  <!-- Run a Python script to send a service call to gazebo_ros to spawn a URDF robot -->
  <node name="urdf_spawner" pkg="gazebo_ros" type="spawn_model" respawn="false" output="screen" args="-urdf -model robot1 -param robot_description -z 0.05" />
</launch>

To launch the file, use the following command:

$ roslaunch robot1_gazebo gazebo.launch model:="`rospack find robot1_description`/urdf/robot1_base_01.xacro"

You will now see the robot in Gazebo. The simulation is initially paused; you can click on play to start it. Congratulations! This is your first step in the virtual world:

Using our URDF 3D model in Gazebo

As you can see, the model has no texture. In rviz, you saw the textures that were declared in the URDF file, but in Gazebo you cannot see them.

To add visible textures in Gazebo, put the following code in a separate .gazebo file. In robot1_description/urdf, create a file named robot.gazebo:

<gazebo reference="base_link">
  <material>Gazebo/Orange</material>
</gazebo>

<gazebo reference="wheel_1">
  <material>Gazebo/Black</material>
</gazebo>

<gazebo reference="wheel_2">
  <material>Gazebo/Black</material>
</gazebo>

<gazebo reference="wheel_3">
  <material>Gazebo/Black</material>
</gazebo>

<gazebo reference="wheel_4">
  <material>Gazebo/Black</material>
</gazebo>

Copy the robot1_description/urdf/robot1_base_01.xacro file, save it with the name robot1_base_02.xacro, and add the following line inside:

<xacro:include filename="$(find robot1_description)/urdf/robot.gazebo" />

Launch the new file and you will see the same robot, but with the added textures:

$ roslaunch robot1_gazebo gazebo.launch model:="`rospack find robot1_description`/urdf/robot1_base_02.xacro"

You will see the following output:

Using our URDF 3D model in Gazebo

Adding sensors to Gazebo

In Gazebo, you can simulate the physics of the robot and its movement, and you can also simulate sensors.

Normally, when you want to add a new sensor, you need to implement its behavior. Fortunately, some sensors have already been developed for Gazebo and ROS.

In this section, we are going to add a camera and a laser sensor to our model. Each sensor will be a new part of the robot, so you need to select where to put it. In Gazebo, you will see a new 3D object that looks like a Hokuyo laser, and a red cube that will be the camera. We talked about these sensors in the previous chapters.

We are going to take the laser from the gazebo_ros_demos package. This is the magic of ROS: you can reuse code from other packages for your own development.

We must add the following lines to our .xacro file to add the 3D model of a Hokuyo laser to our robot:

<link name="hokuyo_link">
  <collision>
    <origin xyz="0 0 0" rpy="0 0 0" />
    <geometry>
      <box size="0.1 0.1 0.1" />
    </geometry>
  </collision>
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0" />
    <geometry>
      <mesh filename="package://robot1_description/meshes/hokuyo.dae" />
    </geometry>
  </visual>
  <inertial>
    <mass value="1e-5" />
    <origin xyz="0 0 0" rpy="0 0 0" />
    <inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
  </inertial>
</link>

In our .gazebo file, we are going to add the plugin libgazebo_ros_laser.so that will simulate the behavior of a Hokuyo range laser:

<gazebo reference="hokuyo_link">
  <sensor type="ray" name="head_hokuyo_sensor">
    <pose>0 0 0 0 0 0</pose>
    <visualize>false</visualize>
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.570796</min_angle>
          <max_angle>1.570796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>30.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <!-- Noise parameters based on published spec for Hokuyo laser achieving "+-30mm" accuracy at range <10m. A mean of 0.0m and stddev of 0.01m will put 99.7% of samples within 0.03m of the true reading. -->
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="gazebo_ros_head_hokuyo_controller" filename="libgazebo_ros_laser.so">
      <topicName>/robot/laser/scan</topicName>
      <frameName>hokuyo_link</frameName>
    </plugin>
  </sensor>
</gazebo>
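The noise element above adds zero-mean Gaussian noise with a standard deviation of 0.01 m to every beam. A small, ROS-independent Python experiment illustrates the claim in the comment that about 99.7% of the samples fall within 0.03 m (three standard deviations) of the true reading:

```python
import random

# Reproducible sampling of the noise model declared in the sensor:
# mean 0.0 m, stddev 0.01 m. By the 3-sigma rule, roughly 99.7% of
# readings should land within +/- 0.03 m of the true range.
random.seed(42)
true_range = 5.0  # a hypothetical true distance in metres
readings = [true_range + random.gauss(0.0, 0.01) for _ in range(10000)]
within_3sigma = sum(abs(r - true_range) <= 0.03 for r in readings) / len(readings)
print(round(within_3sigma, 3))
```

The printed fraction should be very close to 0.997, which matches the accuracy figure quoted for the real Hokuyo device.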

Launch the new model with the following command:

$ roslaunch robot1_gazebo gazebo.launch model:="`rospack find robot1_description`/urdf/robot1_base_03.xacro"

You will see the robot with the laser module attached to it.

In a similar way, we have added lines to robot.gazebo and robot1_base_03.xacro to add another sensor: a camera. Check these files!

In the following screenshot, you can see the robot model with the Hokuyo laser and a red cube that simulates the camera model:

Adding sensors to Gazebo

Notice that this simulated laser generates data just as a real one would. You can see the generated data using the rostopic echo command:

$ rostopic echo /robot/laser/scan
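Each message on /robot/laser/scan carries the 720 range samples declared in the sensor, spread between min_angle and max_angle. The following ROS-independent Python sketch shows how a sample's index maps to its bearing, assuming the angular step spreads the beams evenly so that the first beam sits at min_angle and the last at max_angle:

```python
import math

# Scan parameters as declared in the .gazebo file
min_angle = -1.570796
max_angle = 1.570796
samples = 720

# Assumed convention: evenly spaced beams from min_angle to max_angle
increment = (max_angle - min_angle) / (samples - 1)

def beam_angle(index):
    """Bearing (in radians) of the index-th range sample of the scan."""
    return min_angle + index * increment

print(round(beam_angle(0), 4), round(beam_angle(samples - 1), 4))
```

This mapping is what you need when projecting individual ranges into the hokuyo_link frame, for example to plot the scan yourself.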

We can say the same about the camera. To see the images captured by the simulated camera, run the following command in a terminal:

$ rosrun image_view image_view image:=/robot/camera1/image_raw

Gazebo allows us to add objects to the world using the right menu. We have added some elements, such as a traffic cone, a table, and a can, to check how the sensors react to them. The following three screenshots show this: the first is Gazebo with our simulated world, the second is a top-down view in rviz with the laser data, and the third is the image visualization of the camera.

Adding sensors to Gazebo

Loading and using a map in Gazebo

In Gazebo, you can use virtual worlds such as offices, mountains, and so on.

In this section, we are going to use a map of the office of Willow Garage that is installed by default with ROS.

This 3D model is in the gazebo_worlds package. If you do not have the package, install it before you continue.

To check the model, all you have to do is start the .launch file using the following command:

$ roslaunch gazebo_ros willowgarage_world.launch

You will see the 3D office in Gazebo. The office only has walls. You can add tables, chairs, and much more, if you want. By inserting and placing objects, you can create your own worlds in Gazebo to simulate your robots. You have the option of saving your world by selecting Menu | Save As.

Please note that Gazebo requires a good machine, with a relatively recent GPU. You can check whether your graphics are supported at the Gazebo home page. Also, note that sometimes this software crashes, but great effort is being taken by the community to make it more stable. Usually, it is enough to run it again (probably several times) if it crashes. If the problem persists, our advice is to try it with a newer version, which will be installed by default with more recent distributions of ROS.

Loading and using a map in Gazebo

What we are going to do now is create a new .launch file to load the map and the robot together. To do that, create a new file in the robot1_gazebo/launch folder with the name gazebo_wg.launch and add the following code:

<?xml version="1.0"?>
<launch>
  <include file="$(find gazebo_ros)/launch/willowgarage_world.launch" />
  <!-- Load the URDF into the ROS Parameter Server -->
  <param name="robot_description" command="$(find xacro)/xacro.py '$(find robot1_description)/urdf/robot1_base_03.xacro'" />
  <!-- Run a Python script to send a service call to gazebo_ros
  to spawn a URDF robot -->
  <node name="urdf_spawner" pkg="gazebo_ros" type="spawn_model" respawn="false" output="screen"
  args="-urdf -model robot1 -param robot_description -z 0.05"/>

</launch>

Now, launch the file of the model with the laser:

$ roslaunch robot1_gazebo gazebo_wg.launch

You will see the robot and the map on the Gazebo GUI. The next step is to command the robot to move and receive the simulated readings of its sensors as it moves around the virtual world loaded in the simulator.

Loading and using a map in Gazebo

Moving the robot in Gazebo

A skid-steer robot is a mobile robot whose movement is based on separately driven wheels placed on either side of the robot body. It can thus change its direction by varying the relative rate of rotation of its wheels, and it does not require an additional steering motion.
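The idea can be sketched in a few lines of plain Python: a commanded linear velocity is shared by both sides, while a commanded angular velocity adds opposite contributions to each side. (The 0.3 m wheel separation below is a made-up illustrative value, not the one from our model.)

```python
# Differential/skid-steer kinematics sketch: given a commanded linear
# velocity v (m/s) and angular velocity w (rad/s), the left and right
# wheel surface speeds differ by w times the wheel separation.
def wheel_speeds(v, w, separation=0.3):
    """Return (left, right) wheel surface speeds in m/s."""
    left = v - w * separation / 2.0
    right = v + w * separation / 2.0
    return left, right

# Driving straight: both sides move at the same speed
print(wheel_speeds(0.5, 0.0))  # -> (0.5, 0.5)
# Turning in place: the two sides spin in opposite directions
print(wheel_speeds(0.0, 1.0))
```

This is exactly why no steering joint is needed: any combination of forward motion and rotation comes out of the difference between the two sides.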

As we said before, in Gazebo you need to program the behaviors of the robot, joints, sensors, and so on. As with the laser, Gazebo already has a skid-steer drive plugin implemented, and we can use it to move our robot.

To use this controller, you only have to add the following code to the model file:

<gazebo>
  <plugin name="skid_steer_drive_controller" filename="libgazebo_ros_skid_steer_drive.so">
    <updateRate>100.0</updateRate>
    <robotNamespace>/</robotNamespace>
    <leftFrontJoint>base_to_wheel1</leftFrontJoint>
    <rightFrontJoint>base_to_wheel3</rightFrontJoint>
    <leftRearJoint>base_to_wheel2</leftRearJoint>
    <rightRearJoint>base_to_wheel4</rightRearJoint>
    <wheelSeparation>4</wheelSeparation>
    <wheelDiameter>0.1</wheelDiameter>
    <robotBaseFrame>base_link</robotBaseFrame>
    <torque>1</torque>
    <topicName>cmd_vel</topicName>
    <broadcastTF>0</broadcastTF>
  </plugin>
</gazebo>

The parameters that you can see in the code are simply the configuration set up to make the controller work with our four-wheeled robot. For example, we selected the base_to_wheel1, base_to_wheel2, base_to_wheel3, and base_to_wheel4 joints as wheels to move the robot.

Another interesting parameter is topicName. We need to publish commands with this name in order to control the robot. In this case, when you publish a geometry_msgs/Twist message on the /cmd_vel topic, the robot will move. It is important to have a well-configured orientation of the wheel joints. With the current orientation on the .xacro file, the robot will move upside-down, so we need to change the origin rpy for the four wheels, as shown in the following lines for the joint of the base link and the wheel1 joint:

<joint name="base_to_wheel1" type="continuous">
  <parent link="base_link"/>
  <child link="wheel_1"/>
  <origin rpy="-1.5707 0 0" xyz="0.1 0.15 0"/>
  <axis xyz="0 0 1" />
</joint>
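Note that -1.5707 in the rpy attribute is simply an approximation of -π/2 radians, that is, a quarter turn about the x axis, which lays the wheel cylinder on its side so that it rolls about the joint's axis xyz="0 0 1". A quick check in Python:

```python
import math

# The roll value used in the joint's origin element
roll = -1.5707

# It approximates -pi/2 radians, i.e. a -90 degree rotation
assert abs(roll - (-math.pi / 2)) < 1e-4
print(round(math.degrees(roll), 2))
```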

All these changes are in the chapter4_tutorials/robot1_description/urdf/robot1_base_04.xacro file. In gazebo_wg.launch, we have to update the robot model in order to use the new file robot1_base_04.xacro.

Now, to launch the model with the controller and the map, we use the following command:

$ roslaunch robot1_gazebo gazebo_wg.launch

You will see the map with the robot on the Gazebo screen. We are going to move the robot using the keyboard, with a node from the teleop_twist_keyboard package that publishes to the /cmd_vel topic.

Run the following commands in a terminal to install the package:

$ sudo apt-get install ros-kinetic-teleop-twist-keyboard
$ rosstack profile
$ rospack profile

Then, you can run the node as follows:

$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py

You will see a new shell with some instructions, the keys used to move the robot (u, i, o, j, k, l, m, ",", and "."), and the keys used to adjust the maximum speeds.

Moving the robot in Gazebo

If everything has gone well, you can drive the robot across the Willow Garage office. You can see the laser data or visualized images from the camera.
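Internally, a teleoperation node of this kind just maps each key to a direction pair, scales it by the current speed settings, and publishes the result as a Twist. The following ROS-independent sketch illustrates the idea; the bindings here are simplified for illustration and are not the package's exact table:

```python
# Illustrative key-to-motion table in the style of teleop_twist_keyboard:
# each key selects a direction (linear x, angular z), which the node
# scales by the current speed/turn settings before publishing a Twist.
move_bindings = {
    'i': (1, 0),    # forward
    ',': (-1, 0),   # backward
    'j': (0, 1),    # rotate left
    'l': (0, -1),   # rotate right
    'k': (0, 0),    # stop
}

def command_for(key, speed=0.5, turn=1.0):
    """Return (linear_x, angular_z) for a pressed key, or None."""
    if key not in move_bindings:
        return None
    x, th = move_bindings[key]
    return (x * speed, th * turn)

print(command_for('i'))  # -> (0.5, 0.0)
```

In the real node, the returned pair would fill the linear.x and angular.z fields of a geometry_msgs/Twist message published on /cmd_vel, which is exactly what the skid-steer plugin listens to.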
