Haptic controller

From Healthcare Robotics Wiki

This page describes the revised haptic MPC control system architecture: what each module does, where the associated parameters reside, and how to run the system.

Quick-start - Simulation Demo


To run the haptic manipulation demo, you will need the following:

  1. ROS Fuerte. Should also work fine on ROS Electric (limited testing).
  2. System Dependencies: python (2.7), OpenOpt, shapely, ROS Fuerte PR2 Simulator, rosdep, rosinstall
    • sudo apt-get install python python-openopt python-shapely ros-fuerte-pr2-desktop ros-fuerte-pr2-simulator python-rosdep python-rosinstall
  3. A directory for the installation, added to your ROS_PACKAGE_PATH, e.g.:
    • mkdir ~/ros_haptic_demo/ && cd ~/ros_haptic_demo && export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:~/ros_haptic_demo
  4. Haptic Manipulation source code - rosinstall the haptic manipulation files (this will also fetch other source dependencies):
    • rosinstall ~/ros_haptic_demo/ gt-ros-pkg.hrl-haptic-manip/hrl_haptic_mpc/hrl_haptic_mpc.rosinstall
  5. Build all dependencies:
    • rosmake hrl_haptic_mpc hrl_software_simulation_darpa_m3

Testing in Simulation

PR2 Simulation

  1. Run the PR2 simulator
    • roslaunch pr2_gazebo pr2_empty_world.launch
  2. Run a script to change the PID gains on the PR2 arm joint controllers. The standard ones are stiffer than ideal for the MPC.
    • roscd hrl_haptic_mpc && ./change_gains_pr2.sh
  3. Run the controller, specifying which arm you want to use (the command below will use the left arm).
    • roslaunch hrl_haptic_mpc start_pr2_mpc.launch arm:=l
  4. Run Rviz
    • rosrun rviz rviz

Rviz Controls

  1. In Rviz, add an interactive marker with the update topic '/teleop_rviz_server/update'.
  2. Select 'Interact' from the top of the window to show interactive marker.
  3. Move the marker to some meaningful point in space.
    • Right click the marker and select "Go" from the popup menu. The controller should move the end effector to the specified position, much like a Cartesian position controller (orientation is not controlled).
  4. Move the marker to a different location.
    • Right click the marker and select "Orient". The controller should move the gripper to match the marker (in both position and orientation).

ODE Planar Simulation

To see the controller's behaviour when in contact with the environment, try the ODE planar simulation:

  1. Start the simulation (includes skin simulation):
    • roslaunch hrl_software_simulation_darpa_m3 gen_simulator.launch random:=1
  2. Start the controller for the sim:
    • roslaunch hrl_haptic_mpc start_sim_mpc.launch arm:=r
  3. Run rviz as before - the teleoperation interface is the same.
    • Note: 'Orientation', 'Stop', and 'No Skin' controls are not defined in the ODE simulation.
    • A provided config file will automatically load all necessary markers:
    • rosrun rviz rviz -d ~/ros_haptic_demo/gt-ros-pkg.hrl-haptic-manip/hrl_software_simulation_darpa_m3/viz_config/ode_simulation.vcg

Software Details


Debian packages

  • Python (2.7)
  • OpenOpt python package:
    • python-openopt
  • ROS Fuerte. Should also work fine on ROS Electric (limited testing).
  • ROS PR2 desktop package:
    • ros-fuerte-pr2-desktop
  • ROS PR2 simulation package (for testing everything before you try the robot):
    • ros-fuerte-pr2-simulator

Source checkouts

The source for this work can be found at https://code.google.com/p/gt-ros-pkg.hrl-haptic-manip/.

Additional required source checkouts can be installed using the rosinstall file in the haptic mpc directory. See 'hrl_haptic_mpc/hrl_haptic_mpc.rosinstall' for details of the git repositories.

Running the MPC

Using a launch file

The simplest way to run the controller is with roslaunch and one of the provided launch files. These launch files handle most of the parameter initialisation and simplify things.


PR2:

  • Start the skin:
    • roslaunch hrl_fabric_based_tactile_sensor pr2_tactile_sleeve_combined.launch arm_to_use:=<l or r>
  • Start the controller for the PR2:
    • roslaunch hrl_haptic_mpc start_pr2_mpc.launch arm:=<l or r>


Cody:

  • Start the skin and robot.
    • Meka: roslaunch cody cody.launch use_forearm_skin_sensor:=1 use_netft:=1 arm_to_use:=[r or b]
    • Fabric: roslaunch hrl_fabric_based_tactile_sensor cody_tactile_sleeve.launch
  • Start the controller for Cody. NB: There are two launch files depending on sensor - the Meka skin is ONLY on the right arm so this is enforced by the launch file.
    • roslaunch hrl_haptic_mpc start_cody_mpc_meka.launch
    • roslaunch hrl_haptic_mpc start_cody_mpc_fabric.launch

ODE Simulation:

  • Start the simulation (includes skin simulation):
    • roslaunch hrl_software_simulation_darpa_m3 gen_simulator.launch random:=1
  • Start the controller for the sim:
    • roslaunch hrl_haptic_mpc start_sim_mpc.launch

Running scripts directly

Sometimes it may be preferable to run the controller scripts directly. The launch files swallow much of the debug output (only WARN level or above is printed by default). Each of the controller nodes must be started manually; the controller will not act until it receives valid messages, so the start order is not critical.

The scripts take the following arguments:

  • ROBOT: Which robot parameters to use. Options: pr2, cody, sim3
  • ARM: Which arm to use on the multi-armed robots (pr2, cody). Options: l, r
  • SENSOR: Which sensor type to listen to. Options: fabric, hil

Sequence to execute:

  • roscd hrl_haptic_mpc/src/hrl_haptic_mpc
  • ./robot_haptic_state_node.py -r <ROBOT> -s <SENSOR> -a <ARM>
  • ./mpc_teleop_rviz.py -r <ROBOT>
  • ./waypoint_generator.py -r <ROBOT> -a <ARM>
  • ./haptic_mpc.py -r <ROBOT> -a <ARM>

Parameter files

The controller requires two parameter files to be loaded onto the ROS parameter server: one for the robot specific interfaces (eg, which skin topics to listen to and joint limits), and a second for controller parameters (weights associated with position/orientation, etc). For a different robot, you may want to copy one of the base files and modify it to tune the controller's performance.


PR2:

  • pr2_config_params.yaml - robot parameters
  • mpc_params_pr2.yaml - controller parameters


Cody:

  • cody_config_params.yaml
  • mpc_params_cody.yaml

ODE Sim:

  • sim3_config_params.yaml
  • mpc_params_sim.yaml

To load these onto the parameter server, do the following. Make sure your ROS master is set correctly!

  • rosparam load <filename>
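
For illustration only, a robot configuration file might be structured roughly like the sketch below. The key names and values here are hypothetical - consult the actual YAML files (eg, pr2_config_params.yaml) for the real schema.

```yaml
# Hypothetical sketch only - key names and values are illustrative,
# not the actual schema used by the controller.
haptic_mpc:
  skin_topics:
    - /fabric_forearm_sensor/taxels/forces
  joint_limits:
    min: [-2.1, -0.35, -2.1, -2.3]
    max: [ 2.1,  1.30,  2.1,  0.0]
```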

Control System Architecture


Tactile Skin Client

  • Filename: skin_client.py, plus robot specific versions (pr2_skin_client.py, sim_skin_client.py)
  • Arguments: None (don't run this directly - it's instantiated by the haptic state node).

The skin client class monitors a specified list of topics for TaxelArray messages. A TaxelArray message contains arbitrary-length lists of taxel data: the taxel centre (x,y,z), the force vector (x,y,z), and the scalar force magnitude.
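
As a rough illustration of the data a TaxelArray carries, here is a plain-Python stand-in. The field and method names are assumptions for illustration, not the actual ROS message definition.

```python
import math
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaxelArray:
    # Parallel lists, one entry per taxel (names are illustrative only).
    centers_x: List[float] = field(default_factory=list)
    centers_y: List[float] = field(default_factory=list)
    centers_z: List[float] = field(default_factory=list)
    values_x: List[float] = field(default_factory=list)  # force vector components
    values_y: List[float] = field(default_factory=list)
    values_z: List[float] = field(default_factory=list)

    def magnitudes(self) -> List[float]:
        # Scalar force magnitude per taxel, derived from the force vector.
        return [math.sqrt(x * x + y * y + z * z)
                for x, y, z in zip(self.values_x, self.values_y, self.values_z)]
```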

The skin client subscribes to all provided taxel array topics and buffers the data. It also provides an interface to add or remove subscriptions to taxel array topics on the fly. This allows the skin client to selectively ignore data based on some external variable. For example, if a joint is in a configuration which will fire a taxel due to skin bunching, it would be sensible to ignore this taxel data. This configuration dependent taxel logic will (soon) be added by Survy.
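
The add/remove-on-the-fly buffering can be sketched without ROS as follows. Class and method names are illustrative; the real client manages rospy subscribers rather than plain method calls.

```python
class SkinClientBuffer:
    """Illustrative sketch: keeps the latest message per subscribed topic."""

    def __init__(self, topics):
        self._buffers = {t: None for t in topics}

    def add_topic(self, topic):
        # Start listening to a new taxel array topic.
        self._buffers.setdefault(topic, None)

    def remove_topic(self, topic):
        # Stop listening, eg to ignore taxels that fire spuriously
        # in a known joint configuration (skin bunching).
        self._buffers.pop(topic, None)

    def on_message(self, topic, msg):
        # Stand-in for a subscriber callback; unsubscribed topics are ignored.
        if topic in self._buffers:
            self._buffers[topic] = msg

    def latest(self):
        # Most recent data from every active topic.
        return {t: m for t, m in self._buffers.items() if m is not None}
```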

The functionality in skin_client.py is agnostic of the robot it is running on. However, due to the constraints of the original controller, one function call needs to know about robot specific joints. Currently this is implemented as a class which inherits from skin_client.py and overloads the function (which is defined in skin_client.py but throws an exception if not overloaded). This functionality will eventually be moved into robot_haptic_state to separate skin_client more cleanly.

Haptic State Publisher

  • Filename: robot_haptic_state_node.py
  • Arguments: -r <robot eg pr2, sim3, cody> -a <arm eg l, r> -s <sensor eg fabric, meka>

The haptic state publisher exists to centralise all the relevant haptic state data used by the controller, and publish it in one single message which the controller can use. This abstracts the robot specific parameters from the controller, simplifying changing robots, sensors, etc.

The robot haptic state node reads the current arm state and current skin state, then performs a number of calculations to derive the various Jacobian matrices. These are packaged into a single haptic state message which is then published.
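
To make the Jacobian computation concrete, here is the closed-form position Jacobian of a hypothetical 2-link planar arm. The link lengths and 2-DOF structure are purely illustrative; the real node derives Jacobians for the full arm from its kinematics class.

```python
import math

def planar_jacobian(q1, q2, l1=0.4, l2=0.3):
    # Position Jacobian of a 2-link planar arm: maps joint velocities
    # (q1_dot, q2_dot) to end-effector velocity (x_dot, y_dot).
    j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
    j12 = -l2 * math.sin(q1 + q2)
    j21 =  l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    j22 =  l2 * math.cos(q1 + q2)
    return [[j11, j12], [j21, j22]]
```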



Inputs:

  • Robot specific. Accessed through the robot client class and the skin client class.


Outputs:

  • Haptic state: /haptic_mpc/robot_state

Waypoint Generator

  • Filename: waypoint_generator.py
  • Arguments: -r <robot eg pr2, sim3, cody> -a <arm eg l, r>

The waypoint generator exists as an interface between the teleoperation/motion planning modules and the controller. As inputs, it listens to both end goal poses and trajectories. If the provided trajectory is in joint space, the node will perform forward kinematics on the trajectory to convert it to cartesian space. The waypoint generator will attempt to interpolate any given goal and provide the controller with a local goal 50mm from the current end effector pose. These parameters are configurable, and the waypoint generator can be bypassed altogether if required (the controller will internally interpolate single goals). On hearing an updated goal of any sort, the internal buffers are flushed and the latest goal is used.
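
The 50mm interpolation step can be sketched in a few lines of plain Python. The function name and the Euclidean clamp are illustrative only; the actual node works on full poses and its step size is configurable.

```python
import math

def local_waypoint(current, goal, step=0.05):
    # Return a local goal at most `step` metres (50mm by default) from the
    # current end-effector position, along the straight line to the goal.
    d = [g - c for g, c in zip(goal, current)]
    dist = math.sqrt(sum(x * x for x in d))
    if dist <= step:
        return list(goal)  # close enough: hand the controller the goal itself
    scale = step / dist
    return [c + x * scale for c, x in zip(current, d)]
```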



Inputs:

  • Single goal location: /haptic_mpc/goal_pose
  • Cartesian trajectory:
  • Joint trajectory:


Outputs:

  • Controller waypoint: /haptic_mpc/traj_pose

Haptic MPC

  • Filename: haptic_mpc.py
  • Arguments: -r <robot eg pr2, sim3> -a <arm eg l,r> [NB: Soon to be deprecated - the robot parameter was only required to get joint limits from a kinematics class]

The core haptic MPC node. It takes the packaged robot haptic state data and computes deltas for the desired joint angles, which are sent back to the robot haptic state node and forwarded to the robot as appropriate. The Python class mostly listens for messages, unpacks the data, and formats it for the maths functions found in epc_skin_math.py, which formulate the problem for the QP solver.
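
As a greatly simplified stand-in for the QP step (this is not the actual formulation in epc_skin_math.py), a clamped Jacobian-transpose update conveys the flavour of computing small joint-angle deltas toward a waypoint:

```python
def delta_q_step(jacobian, x_err, alpha=0.1, max_delta=0.02):
    # Simplified illustration only: step each joint along J^T * x_err,
    # clamping the per-joint delta. The real controller solves a QP that
    # also accounts for contact forces, joint limits and posture.
    n_joints = len(jacobian[0])
    dq = [alpha * sum(jacobian[r][c] * x_err[r] for r in range(len(x_err)))
          for c in range(n_joints)]
    return [max(-max_delta, min(max_delta, d)) for d in dq]
```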



Inputs:

  • Waypoint pose: /haptic_mpc/traj_pose
  • Robot haptic state: /haptic_mpc/robot_state


Outputs:

  • Delta Q_des: /haptic_mpc/delta_q_des
  • Q_des: /haptic_mpc/q_des (used only to reset the equilibrium angles when the controller starts up)

Teleop interface

  • Filename: mpc_teleop_rviz.py
  • Arguments: -r <robot eg pr2, sim3> -a <arm eg l,r>

The RViz teleop interface uses interactive markers to make it easy to send goals to the controller. It will display an appropriate marker for the gripper based on the robot input argument, allowing the user to drag the marker in space to the desired goal. To send the goal to the controller, right click the marker and select either "Go" or "Orient". "Go" will send the goal pose to the controller and enable it in position-only mode. Using "Orient" will do much the same, but will make the controller try to achieve the specified orientation as well as position.

The teleop interface menu also allows the gripper to be opened/closed and the skin re-zeroed.



Inputs:

  • N/A (some interactive markers, but these are not important to the user).


Outputs:

  • Goal pose: /haptic_mpc/goal_pose

Adding a new robot to the controller

Adding a new robot type to the controller is simple. Most of the changes need to be made to the robot_haptic_state_node module.

  1. Create a robot client class. This class must provide basic kinematics/jacobian functions and also provide handlers to interface with the joint controllers on the arm. See pr2_arm_darpa_m3.py as an example.
  2. Create a skin client class which inherits from skin_client. See pr2_skin_client.py as an example.
  3. Modify the option parsing script in haptic_mpc_util.py to add your new robot type as a valid input argument. If necessary, add constraints to enforce certain sensor types or a left/right arm.
  4. Modify robot_haptic_state_node.py.
    1. Add an initialisation routine for your robot. This should set up all relevant parameters for your robot and instantiate the robot and skin client classes.
    2. Ensure you are correctly setting up the joint limits used by the controller. These are stored on the parameter server at /haptic_mpc/joint_limits/[min, max]. The haptic state node should copy the appropriate joint limits to this location before the controller starts.
    3. Add a selective taxel handling routine for your robot. This functionality is not fully implemented yet for the PR2 (but is close). If taxels will always fire in certain configurations (eg, the wrist taxel on a PR2 when the wrist is bent), ignore it based on the joint positions.
  5. Create a YAML file with your robot configuration parameters (see pr2_config_params.yaml for an example).
  6. Create a YAML file with the appropriate controller parameters, if the standard ones don't suit (see mpc_params_pr2.yaml for an example).
  7. Test the controller. The nodes should be able to be started out of order, but it's a good idea to start the MPC last.
  8. If all goes well, create a launch file for your modified controller. See launch/start_pr2_mpc.launch for an example.
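
A hypothetical skeleton of the robot client interface from step 1 might look as follows. The method names are assumptions for illustration; pr2_arm_darpa_m3.py defines the actual API to implement.

```python
class MyRobotClient:
    """Illustrative robot client skeleton - method names are hypothetical.
    A real client fills these in for its robot's kinematics and controllers."""

    def get_joint_angles(self):
        # Return the current joint angles of the arm.
        raise NotImplementedError

    def get_end_effector_pose(self):
        # Return the current end-effector position/orientation.
        raise NotImplementedError

    def jacobian(self, joint_angles):
        # Return the arm Jacobian at the given configuration.
        raise NotImplementedError

    def set_desired_joint_angles(self, q_des):
        # Forward desired joint angles to the arm's joint controllers.
        raise NotImplementedError
```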

PR2 notes

NB: If the arm is running slowly on the PR2, it probably means you need to restart the arm controllers with lower gains. The arm should be relatively compliant if you push it away from its holding position.

To change these, perform the following steps:

  • roscd hrl_pr2_gains
  • ./change_gains_grasp.sh

The controllers will shut down (the arms will fall), and will restart when you press a key.