PR2/Gazebo/Quick Start

MoveIt! is designed for use with real and simulated robots. In this tutorial, you will use MoveIt! with a simulated PR2 robot in Gazebo. You will learn how to configure MoveIt! for the controllers on the PR2. You will also learn how to integrate the sensors on the PR2 with MoveIt!

Note: This tutorial refers to a previous version of Gazebo that is no longer fully supported. It will need to be updated to use the new gazebo_ros_pkgs interface.

Pre-requisites
You should have completed the previous tutorial on working with the MoveIt! Rviz Plugin.

Gazebo

 * You should have Gazebo and the PR2 simulator installed. If not, install them now as debian packages:
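For example, assuming the ROS Groovy distribution (matching the ros-groovy-* package names used later in this tutorial — the exact package and launch file names below are assumptions; adjust them for your ROS distro):

```shell
# Install the PR2 simulator (assumed package name for ROS Groovy):
sudo apt-get install ros-groovy-pr2-simulator

# Bring up the PR2 in an empty Gazebo world (assumed launch file name):
roslaunch pr2_gazebo pr2_empty_world.launch
```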

We will first generate and execute motion plans on the PR2 robot in simulation. In later sections of this tutorial, you will learn how to integrate perception with motion planning. Most of the configuration for using MoveIt! with a simulated robot has already been done in the previous tutorial. There are, however, a few things missing that we will now integrate.



Set ROS_PACKAGE_PATH
Set the ROS_PACKAGE_PATH to find the pr2_moveit_generated directory created by the Setup Assistant:

cd <directory containing the pr2_moveit_generated directory>
export ROS_PACKAGE_PATH=$PWD:$ROS_PACKAGE_PATH

Configuring Controllers
The first step is to configure MoveIt! to talk to the controllers on the PR2. For each planning group, MoveIt! currently talks to controllers that offer the FollowJointTrajectory Action interface. We will configure controllers for each of the planning groups that we are primarily interested in using motion planning for: the right_arm and left_arm. There are two files that we need to populate for this to happen: the controllers.yaml config file and the pr2_moveit_controller_manager.launch file.

controllers.yaml file
This file should be created inside the config directory of the MoveIt! configuration package generated for your robot by the Setup Assistant. It is not auto-generated by the Setup Assistant, so you must write it yourself. It tells MoveIt! which controllers are available for use with each planning group (see the detailed documentation on interfacing controllers with MoveIt! for an in-depth explanation of each field). Configure separate controllers for each arm of the PR2 robot, then save the file as controllers.yaml inside the config directory of your MoveIt! configuration package.
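A minimal sketch of such a file, assuming the standard PR2 arm controller names (r_arm_controller, l_arm_controller) and the usual PR2 arm joint names — verify these against your robot before using it:

```yaml
controller_list:
  - name: r_arm_controller
    action_ns: follow_joint_trajectory
    type: FollowJointTrajectory
    default: true
    joints:
      - r_shoulder_pan_joint
      - r_shoulder_lift_joint
      - r_upper_arm_roll_joint
      - r_elbow_flex_joint
      - r_forearm_roll_joint
      - r_wrist_flex_joint
      - r_wrist_roll_joint
  - name: l_arm_controller
    action_ns: follow_joint_trajectory
    type: FollowJointTrajectory
    default: true
    joints:
      - l_shoulder_pan_joint
      - l_shoulder_lift_joint
      - l_upper_arm_roll_joint
      - l_elbow_flex_joint
      - l_forearm_roll_joint
      - l_wrist_flex_joint
      - l_wrist_roll_joint
```

Each entry names a controller exposing a FollowJointTrajectory action on the given action_ns, and lists the joints that controller owns.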

pr2_moveit_controller_manager.launch
This file has been auto-generated for you by the MoveIt! Setup Assistant (inside the launch directory of your MoveIt! generated configuration) but will currently be empty. You need to fill in its contents yourself (again, see the detailed documentation on interfacing controllers with MoveIt! for an in-depth explanation of each field).
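One plausible version of this file, assuming the moveit_simple_controller_manager plugin and the pr2_moveit_generated package name used earlier in this tutorial:

```xml
<launch>
  <!-- Load the controller definitions from the controllers.yaml file created above -->
  <rosparam file="$(find pr2_moveit_generated)/config/controllers.yaml"/>

  <!-- Use the simple controller manager, which talks to FollowJointTrajectory actions -->
  <param name="use_controller_manager" value="false"/>
  <param name="trajectory_execution/execution_duration_monitoring" value="false"/>
  <param name="moveit_controller_manager"
         value="moveit_simple_controller_manager/MoveItSimpleControllerManager"/>
</launch>
```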

* Tip: Note that the last rosparam in the above file points to the controllers.yaml file you just created. Make sure it is pointing to the right location.

Create a Planning and Execution launch file

 * Now, create a launch file (call it, e.g., moveit_planning_execution.launch) that brings up motion planning and execution. Note that you have to set a parameter to tell move_group to publish its planning scene so that the Rviz plugin can use it too:
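A sketch of such a launch file, assuming the move_group.launch and moveit_rviz.launch files generated by the Setup Assistant in the pr2_moveit_generated package:

```xml
<launch>
  <!-- The planning and execution components of MoveIt!, configured to
       publish the current state of the world as seen by the planner -->
  <include file="$(find pr2_moveit_generated)/launch/move_group.launch">
    <arg name="publish_monitored_planning_scene" value="true"/>
  </include>

  <!-- The visualization component of MoveIt! (Rviz with the Motion Planning plugin) -->
  <include file="$(find pr2_moveit_generated)/launch/moveit_rviz.launch"/>
</launch>
```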

Launch MoveIt!
Use the launch file that you just created above:
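Assuming you saved moveit_planning_execution.launch inside the pr2_moveit_generated package as described above:

```shell
roslaunch pr2_moveit_generated moveit_planning_execution.launch
```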

Plan and Execute
You are almost ready to plan and execute trajectories. There are a couple of things still to do, though, based on the way the MoveIt! Rviz Plugin is structured. MoveIt! has a notion of the world around it which we call the Planning Scene. The Planning Scene contains all the parts of the world that the planners know about and will try to avoid. Currently, the Planning Scene you are using is completely empty since you have not hooked up any perception yet and you have not manually added any objects to the Planning Scene. You will learn how to do this in subsequent sections. First, you will plan and execute motion plans for the PR2 robot and watch the robot avoid any internal collisions.

 * Change the planning scene topic in the Rviz Motion Planning plugin to listen to "/move_group/monitored_planning_scene". The move_group node maintains the planning scene and publishes it. Setting this parameter allows the Rviz plugin to use the planning scene maintained by move_group.
 * Make sure the Planning Group is set to "left_arm" in the Planning Request tab.
 * Press the "Interact" button in the Display panel in Rviz. Move the goal request state (orange colored arm) for the left arm around. Make sure the desired goal state is collision-free.
 * Press the "Set Start To Current" button in the "Planning" tab at the bottom of the Motion Planning plugin. This will set the start configuration that you want to plan from to the current configuration of the robot. You will need to do this every time when planning with a real or simulated robot.
 * Press the "Plan" button in the "Planning" tab at the bottom of the Motion Planning plugin. In Rviz, you should be able to see a visualization of the planned trajectory (and a trail of the planned trajectory if you have the "Show Trail" checkbox checked). The "Execute" button should also become visible once you have a plan.
 * Now, press the "Execute" button and you should be able to see the robot execute the trajectory in Gazebo.

* Tip: Remember that you have to set the Start state to the current state every time before you start planning with a simulated robot in this manner.

Save the Rviz config (in the File menu). You can now stop Gazebo and Rviz (Ctrl-C in the shells they were started from).



Configure MoveIt! For Sensing With The PR2
In your exercises so far, none of the sensors on the robot have been used. MoveIt! currently knows only about the robot's internal collisions - if there were any objects in the environment, the arms would hit them, since MoveIt! does not know that the objects exist. In this part of the tutorial, you will configure the sensors on the PR2 for use with MoveIt! To do this, you will have to add or modify two files (for now).

Adding an RGB-D Sensor on the PR2
The perception components of MoveIt! are currently setup to take as input a Point Cloud from 3D sensors. Any 3D sensor that generates point clouds can be integrated into MoveIt! provided the point clouds are registered in a frame that is available to MoveIt! through the ROS transform infrastructure (TF). We will illustrate the steps for configuring these sensors with MoveIt! with the example of the RGB-D sensor mounted on several PR2s. Note that this tutorial will only cover the basics of integrating perception with MoveIt! (more details can be found in the detailed documentation on integrating perception with the PR2).

Adding a sensor configuration file
To integrate a sensor, you will need to create a new configuration file that describes some properties of the sensor and how you would like to handle the incoming point clouds from the sensor.

Create a configuration file for the RGB-D sensor on the PR2, and save it as sensors_rgbd.yaml in the config directory of your MoveIt! configuration package:
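A sketch of such a file, assuming the PointCloudOctomapUpdater plugin; the point cloud topic name is an assumption and depends on where your Kinect driver actually publishes:

```yaml
sensors:
  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
    # Topic the RGB-D driver publishes registered point clouds on (assumed name)
    point_cloud_topic: /head_mount_kinect/depth_registered/points
    # Readings beyond this range (in meters) are ignored
    max_range: 5.0
    # Subsampling: use every Nth frame / point
    frame_subsample: 1
    point_subsample: 1
    # Padding used when filtering the robot's own body out of the cloud
    padding_offset: 0.1
    padding_scale: 1.0
    # Topic on which the self-filtered cloud is republished
    filtered_cloud_topic: filtered_cloud
```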

Note the important properties you will need to fill out for each sensor:
 * sensor_plugin: the name of the plugin that should be loaded to perform the Octomap updates
 * point_cloud_topic: the name of the topic on which point cloud data is broadcast
 * max_range: the maximum range (in meters) of the sensor

Update the pr2_moveit_sensor_manager.launch file
You will now need to update the pr2_moveit_sensor_manager.launch file in the "launch" directory of your MoveIt! configuration directory with this sensor information (this file was auto-generated by the Setup Assistant but was empty). You will need to add lines to that file to configure (a) the Octomap representation and (b) the set of sensor sources for MoveIt! to use:
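A sketch of the additions, assuming the pr2_moveit_generated package name used earlier; the odom_combined frame is an assumption based on the PR2's usual fixed frame:

```xml
<launch>
  <!-- (b) the list of sensor sources, from the sensors_rgbd.yaml file created above -->
  <rosparam command="load" file="$(find pr2_moveit_generated)/config/sensors_rgbd.yaml"/>

  <!-- (a) parameters for the Octomap representation -->
  <param name="octomap_frame" type="string" value="odom_combined"/>
  <param name="octomap_resolution" type="double" value="0.05"/>
  <param name="max_range" type="double" value="5.0"/>
</launch>
```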


 * MoveIt! uses an octree-based framework to represent the world around it. The "Octomap" parameters above are configuration parameters for this representation:
   * octomap_frame specifies the coordinate frame in which this representation will be stored. If you are working with a mobile robot, this frame should be a fixed frame in the world.
   * octomap_resolution specifies the resolution at which this representation is maintained (in meters).
   * max_range specifies the maximum range value to be applied for any sensor input to this node.


 * The specification of the sensors_rgbd.yaml file tells MoveIt! which sensor sources to use for constructing the 3D representation it will use.

Sense, Plan and Execute
You are ready to add sensing into the mix: i.e. to sense, plan and execute.

Launch MoveIt!
Again, use the same launch file that you created earlier - don't worry, this includes the changes that you just made for sensing:
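As before (assuming the file and package names used earlier in this tutorial):

```shell
roslaunch pr2_moveit_generated moveit_planning_execution.launch
```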

Visualize the Collision World
You can visualize the geometry that planning uses for collision avoidance (in the case described here, this is an Octomap only) by adding a PlanningScene plugin to Rviz and pointing the Planning Scene Topic to "/move_group/monitored_planning_scene".



This will give you a view of the world as seen by the robot. (If your Gazebo world is empty, you will see only a few grid lines on the floor that are visible to the robot.)



Updating the Collision World
The collision world seen by the robot can be updated in multiple ways: using the Motion Planning plugin in Rviz, or using Gazebo. To simulate changes in the environment, use the object toolbar in Gazebo:
 * select (click) an object type (for example, a sphere)
 * move the mouse pointer to a location in the environment where you would like to drop the object, and click again
 * use the Translation Mode provided by Gazebo to move the object around



The steps above should produce visible changes in the planning scene displayed by Rviz.



Move the PR2 head around
You can move the head around under keyboard control by (in another terminal window) running:

sudo apt-get install ros-groovy-pr2-apps
roslaunch pr2_teleop_general pr2_teleop_general_keyboard_bodyhead_only.launch

(see the pr2_teleop_general package documentation for more information)

Plan and Execute
Now, just like you did earlier, you can set goal states and plan and execute trajectories. The PR2 will avoid anything it can see (which in this empty world is not much). In subsequent tutorials, you will learn how to add things into the world. Remember to always set the start state of the robot to the current state of the robot before planning and executing anything.


 * Note that the arms of the robot are not considered external obstacles. In fact, they are filtered out from the sensor data. You can learn more about how to configure this particular component in the advanced MoveIt! perception tutorials.
