The baxter_examples programs described in the subsections within the Launching Baxter Simulator in Gazebo section also work on a real Baxter robot. Some additional arm control programs that work on a real Baxter but not on Baxter Simulator are described in the following sections.
This program is another example of joint position control for Baxter's arms. Baxter's arm is moved in Zero-G mode so that its joints can be freely configured into the desired position. When the desired position is attained, the corresponding navigator button on the arm is pressed to record the waypoint position. This baxter_examples program is executed with the following command, specifying either right or left for the arm that is to be moved:
$ rosrun baxter_examples joint_position_waypoints.py -l <right or left>
The output should be as follows:
... Press Navigator 'OK/Wheel' button to record a new joint position waypoint. Press Navigator 'Rethink' button when finished recording waypoints to begin playback. ...
On the navigator, the center button (scroll wheel) is the control used to record all seven joint angles of the specified arm's current position. Waypoints can be recorded repeatedly until the lower button (the button with the Rethink icon) is pressed. This Rethink button activates playback mode, in which the arm moves to the waypoint positions in the order that they were recorded. This playback mode will continue to loop through the waypoints until the Ctrl + C or Ctrl + Z key combination is pressed. Parameters for speed and accuracy can be passed with the joint_position_waypoints.py
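The record-and-playback logic described above can be sketched as a small Python class. This is a simplified illustration, not the actual example's code: waypoints are stored as joint-name-to-angle dictionaries, which is the format baxter_interface.Limb.joint_angles() returns and Limb.move_to_joint_positions() accepts; the Navigator button callbacks and Limb calls are omitted so the control flow itself is visible.

```python
class WaypointRecorder:
    """Minimal sketch of the waypoint record/playback logic (illustrative only)."""

    def __init__(self):
        self._waypoints = []  # recorded joint-angle dictionaries, in order

    def record(self, joint_angles):
        """Store a snapshot of the current joint angles.
        On the robot this would be triggered by the Navigator wheel press."""
        self._waypoints.append(dict(joint_angles))

    def playback_order(self, loops=1):
        """Yield waypoints in the order recorded, looping as the example does.
        The real program loops until Ctrl + C; here a loop count keeps it finite."""
        for _ in range(loops):
            for waypoint in self._waypoints:
                yield waypoint

# Hypothetical usage: on a real Baxter, the angle dictionaries would come
# from Limb.joint_angles() and be played back via move_to_joint_positions().
recorder = WaypointRecorder()
recorder.record({'left_s0': 0.0, 'left_s1': -0.55})
recorder.record({'left_s0': 0.5, 'left_s1': -0.30})
sequence = list(recorder.playback_order(loops=2))
```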
command. Refer to Rethink's wiki site at http://sdk.rethinkrobotics.com/wiki/Joint_Position_Waypoints_Example.
This baxter_examples program provides an example of Baxter's joint torque control. This program moves the arms into a neutral position, then applies joint torques at 1000 Hz to create an illusion of virtual springs. The program calculates and applies linear torques to any offset from the arm's starting position. When the arm is moved, these joint torques will return the arm to the starting position. Depending on the stiffness and damping applied to the joints, oscillation of the joints will occur.
This joint torque springs program is executed with the following command, specifying right or left for the arm that is to be manipulated:
$ rosrun baxter_examples joint_torque_springs.py -l <right or left>
The joint torques are configurable using the rqt reconfigure tool. To adjust the torque settings, type the following command in a new terminal:
$ rosrun rqt_reconfigure rqt_reconfigure
The following screenshot shows the rqt_reconfigure screen for joint_torque_springs.py for the left arm:
Select rsdk_joint_torque_springs from the left panel to view the control menu. The spring stiffness and damping coefficient can be varied for each joint of the arm specified.
Rethink provides a simple baxter_examples program to demonstrate the joint velocity control mode for Baxter's arms. This program begins by moving the arms into a neutral position. The joint velocity puppet program simply mirrors the movement of Baxter's arm when the other arm is moved in Zero-G mode. This baxter_examples program is executed with the following command, specifying either right or left for the arm that is to be moved:
$ rosrun baxter_examples joint_velocity_puppet.py -l <right or left>
A parameter for amplitude can be passed with this command to change the velocity applied to the puppet response. For more information on this command, refer to Rethink's wiki site at http://sdk.rethinkrobotics.com/wiki/Puppet_Example.
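The core of the puppet behavior is a velocity mapping: the measured joint velocities of the arm you move are scaled by the amplitude and sent as velocity commands to the other arm. The sketch below illustrates that mapping only; it is a simplification of the actual example, which also flips the sign of specific joints so the motion is properly mirrored left-to-right rather than negating all of them as the mirror flag does here.

```python
def puppet_velocities(master_velocities, amplitude=1.0, mirror=True):
    """Scale the moved (master) arm's joint velocities into commands for the
    puppet arm. The result could be sent via Limb.set_joint_velocities().
    NOTE: negating every joint is a simplification; the real example mirrors
    only the joints whose axes require it.
    """
    sign = -1.0 if mirror else 1.0
    return {joint: sign * amplitude * v for joint, v in master_velocities.items()}

# Illustrative use: velocities measured on the master arm, amplitude of 2.0
cmd = puppet_velocities({'s0': 0.4, 'e1': -0.1}, amplitude=2.0)
```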
The baxter_examples programs also include programs for gripper control, camera control, and analog and digital input/output control. Refer to the Rethink wiki Baxter examples program site to get details on these programs: http://sdk.rethinkrobotics.com/wiki/Examples.
In addition, Rethink offers a series of video tutorials that provide information on everything from setting up Baxter to running the example programs. Referring to these videos may provide some help if you have problems with executing the example programs (http://sdk.rethinkrobotics.com/wiki/Video_Tutorials).
One of the greatest features of a real Baxter is the capability to detect and grasp an object. This capability is called visual servoing control. Baxter's cuff camera and gripper combination makes this an achievable objective.
Baxter's cuff camera provides 2D camera images that can be processed by computer vision software such as OpenCV. OpenCV provides a vast library of functions for processing real-time image data. Computer vision algorithms for thresholding, shape recognition, feature detection, edge detection, and many more are useful for 2D (and 3D) perception.
An example of visual servoing from the Rethink website is available at http://sdk.rethinkrobotics.com/wiki/Worked_Example_Visual_Servoing.
This is a basic implementation linking object detection with the autonomous movement of the arm to grasp the object. This project is a good example of the technique described previously. Unfortunately, this example was written for ROS Hydro and uses OpenCV functions that have since been deprecated.
Using only Baxter's 2D cameras limits the accuracy of grasping objects, because the depth of objects in the scene is hard to determine from a single image. Typically, programs such as the one previously mentioned require a setup phase, in which an infrared sensor measurement to the table surface is required. An alternative is to use an external 3D camera such as the Kinect, ASUS, PrimeSense, or RealSense to detect the depth of objects and match that information with the RGB camera data. This requires calibrating the two image data streams. The Open Source Robotics Foundation has demo software for both 2D perception and manipulation and 3D perception at https://github.com/osrf/baxter_demos.
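Once a calibrated depth value is available for a pixel of interest, the standard pinhole camera model back-projects that pixel into a 3D point in the camera frame. The following sketch shows that calculation; the intrinsic parameters (fx, fy, cx, cy) are illustrative values of the kind published on a ROS CameraInfo topic, not Baxter's actual calibration.

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project image pixel (u, v) with a measured depth (metres) into a
    3D point in the camera frame using the pinhole model:
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        z = depth
    fx, fy are the focal lengths in pixels; (cx, cy) is the principal point.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point projects straight down the optical axis:
center = pixel_to_point(320, 200, 1.0, fx=500.0, fy=500.0, cx=320.0, cy=200.0)
# An off-center pixel at 2 m depth:
offset = pixel_to_point(420, 200, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=200.0)
```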
The calculation of inverse kinematics to move the gripper to the desired location is also crucial to this process.
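To make the inverse kinematics idea concrete, here is the closed-form solution for a planar two-link arm. Baxter's 7-DOF arms require a full solver (the SDK provides an IK service for this), so this is only a didactic sketch of the underlying problem: recovering joint angles from a desired end-effector position.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm with link
    lengths l1 and l2. Returns one of the two (theta1, theta2) solutions in
    radians, or None if (x, y) is outside the reachable workspace.
    """
    d2 = x * x + y * y                                  # squared distance to target
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)     # cos(theta2), law of cosines
    if abs(c2) > 1.0:
        return None                                     # target out of reach
    theta2 = math.acos(c2)
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

# Fully extended along x: (l1 + l2, 0) gives both angles equal to zero
extended = two_link_ik(2.0, 0.0, 1.0, 1.0)
# A bent configuration reaching (1, 1) with unit links
bent = two_link_ik(1.0, 1.0, 1.0, 1.0)
```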