
Note: This tutorial has been written and tested on Ubuntu 14.04 with ROS Indigo.
Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.

Extrinsic calibration of the David SLS-2 mounted on a robot

Description: This tutorial shows how to perform the extrinsic calibration of a David SLS-2 3D sensor on an industrial Fanuc robot.

Tutorial Level: INTERMEDIATE

Description

This tutorial is a step-by-step guide on how to calibrate a David SLS-2 sensor mounted on a Fanuc robot. It is aimed at beginners who want to perform the extrinsic calibration of 3D sensors; everything you will learn applies to other sensors/robots.

Motivation

Performing the extrinsic calibration allows you to know precisely where the sensor is relative to the robot; in other words, with a calibrated sensor the fetched point clouds are aligned in the robot frame. In practice this means that if you scan a cube, extract the corner coordinates of one edge of the box and feed these poses to the robot, the robot will move along that edge.

This procedure is called "extrinsic calibration" or "hand-eye calibration"; you should not call it simply "calibration" as that is easily confused with the intrinsic calibration.

Hardware requirements

The hardware requirements vary depending on which sensor you use; with the David SLS-2 you will need an extra computer running Windows, which you may not need with other sensors.

Software requirements

Preparing the extrinsic calibration

Before going to the robot and actually performing the extrinsic calibration you need to prepare a catkin workspace with the necessary packages.

This tutorial has been written for Fanuc robots and a David SLS-2 sensor, but most of it will apply to other robot brands supported by ROS-Industrial and to other sensors, as long as you have a driver.

Packages you need

In this tutorial we use a Fanuc robot and a David SLS-2 sensor:

Creating the sensor package

You need to create the sensor definition and also a node that will publish the sensor images on a ROS topic.

In the sensor definition you have to make sure that the axes have the right orientation, as shown in the picture below (the camera frame is very important; the projector frame is not):

SLS-2 definition

You can test this definition with

roslaunch sls_2_description test_sls_2.launch

Note that the distance/angle between the projector and the camera can be modified by tweaking the red ruler. You must use values close to reality when including the sls_2.xacro file in your robot definition (see next chapter).
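As an illustration, including the sensor in a robot xacro could look like the sketch below; the macro parameter names are assumptions, check sls_2.xacro for the real interface:

<!-- Sketch only: parameter names are illustrative, see sls_2.xacro. -->
<xacro:include filename="$(find sls_2_description)/urdf/sls_2.xacro" />
<!-- Use distance/angle values measured on your real sensor mount. -->
<xacro:sls_2 prefix="sls_2_" camera_projector_distance="0.30" camera_projector_angle="0.40" />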

Creating the robot package

You need to take special care with the robot package. Here is a description of the packages inside the fanuc_m900ib_vision sub-directory:

To test the robot package use:

roslaunch fanuc_m900ib_sls_2_vision_support test_m900ib_vision.launch

Fanuc M-900iB/700 with vision end effector

The robot description includes the SLS-2 sensor with the right camera-to-projector distance/angle: m900ib_vision_macro.xacro.

You will notice that the robot has more than 6 axes; the extra joints come from the mutable joint state publisher. It adds a series of transformations between the tool0 of the robot and the sls_2_frame, and another series of transformations between the robot base and the target_frame, which represents the target the camera will observe to perform the extrinsic calibration.

This is done in the URDF of the robot:
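The real xacro is in the repository; a minimal sketch of the pattern, with illustrative names, could look like this (each mutable joint is a regular single-axis URDF joint whose value comes from the mutable joint state publisher instead of being fixed):

<!-- Sketch only: names and limits are illustrative, see m900ib_vision_macro.xacro. -->
<link name="sls_2_x_link" />
<joint name="sls_2_frame_x_joint" type="prismatic">
  <parent link="tool0" />
  <child link="sls_2_x_link" />
  <axis xyz="1 0 0" />
  <limit lower="-1.0" upper="1.0" effort="0" velocity="0" />
</joint>
<!-- ... y and z prismatic joints, then roll/pitch/yaw revolute joints,
     ending at sls_2_frame. The same pattern chains the robot base to target_frame. -->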

The fanuc_m900ib_sls_2_vision_moveit_config package is generated with the MoveIt Setup Assistant and has nothing particular; just make sure to create at least two kinematic groups.

Adjusting the mutable joint states

Before starting the extrinsic calibration you must give an approximate value to each joint of the mutable joint state publisher:

This maximizes the chances that the optimization algorithm of industrial_extrinsic_cal finds correct values. If you do not set these values correctly the optimization algorithm might fail or, worse, give a wrong answer.

To adjust the joint values launch the test file (test_m900ib_vision.launch) of the robot support package and use the following commands to tweak the joint states:

To get a list of the joints of your robot:

rostopic echo /joint_states
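The mutable joints appear alongside the robot axes; the output looks roughly like this (joint names depend on your URDF):

name: ['joint_1', 'joint_2', 'joint_3', 'joint_4', 'joint_5', 'joint_6',
  'sls_2_frame_x_joint', 'sls_2_frame_y_joint', 'sls_2_frame_z_joint',
  'sls_2_frame_roll_joint', 'sls_2_frame_pitch_joint', 'sls_2_frame_yaw_joint',
  'target_frame_x_joint', ...]
position: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.05, ...]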

To get a mutable joint state value:

rosservice call /get_mutable_joint_states "joint_names:
- 'sls_2_frame_yaw_joint'"

To set a mutable joint state value:

rosservice call /set_mutable_joint_states "joint_names:
- 'sls_2_frame_yaw_joint'
joint_values:
- 1.5707"

Adjust the values of each of the 12 joints (6 for the camera, 6 for the target).

When you have found all 12 values that match reality, copy them into the mutable_joint_states.yaml file.
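The file is essentially a set of joint name/value pairs; the sketch below uses illustrative values, keep the exact layout of the file shipped with the package:

# Illustrative values only: replace with the ones you found above.
sls_2_frame_x_joint: 0.05
sls_2_frame_y_joint: 0.0
sls_2_frame_z_joint: 0.12
sls_2_frame_roll_joint: 0.0
sls_2_frame_pitch_joint: 0.0
sls_2_frame_yaw_joint: 1.5707
# ... plus the six target_frame_* joints.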

Creating the resource package

The resource package contains:

The sls_2_camera_def.yaml file defines the camera: focal length, etc. Some of these values must be copied after performing the calibration of the sensor via the David SDK.

Perform the calibration of the sensor with the David SDK software: Calibration of the David SLS-2

Then export camera.cal and projector.cal from the David software and copy the values accordingly into sls_2_calibration.yaml. This is necessary because the David SLS-2 uses a specific frame when capturing point clouds; copying these values makes sure we can reverse the transformation to get the point cloud in the camera frame (which we are going to calibrate).
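Assuming a standard pinhole model, the values to copy are the focal lengths, the optical center and the distortion coefficients; a hypothetical sketch (key names and numbers are illustrative, follow the layout of the existing file):

# Hypothetical keys and values: mirror the layout of sls_2_calibration.yaml.
focal_length_x: 2328.1
focal_length_y: 2328.4
center_x: 1218.7
center_y: 1043.2
distortion_k1: -0.082
distortion_k2: 0.094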

You can use caljob_create.launch to create an extrinsic calibration job:
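For example (the package name is assumed to match the resource package used for the calibration launch below):

roslaunch fanuc_r1000ia_sls_2_grinding_resources caljob_create.launch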

At the end, the job should be written to caljob.yaml. Of course, when creating the job you must put a calibration target in front of your robot and make sure that the camera is able to locate the pattern for each pose (more on this in the next section).
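The generated file is a list of scenes, each pairing a robot pose with one or more observations; a heavily simplified, hypothetical excerpt (field names are illustrative, inspect a generated caljob.yaml for the real schema):

# Hypothetical excerpt: one scene = one robot pose + one observation.
- scene_id: 0
  joint_values: [0.0, -0.5, 0.3, 0.0, 1.2, 0.0]
  observations:
  - camera: sls_2_camera
    target: calibration_target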

Creating the caljob

You can use target_locator.launch to test, for each pose of the trajectory, whether the target can be located. If it cannot, make sure that the image exposure is correct and that you are not too far from or too close to the target.
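For example (package name assumed, as above):

roslaunch fanuc_r1000ia_sls_2_grinding_resources target_locator.launch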

In the yaml directory you will find the calibration target definition and a corresponding SVG file ready for printing.
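A target definition typically describes the grid geometry; a hypothetical sketch (field names and values are illustrative, use the file shipped in the yaml directory):

# Hypothetical sketch of a circle-grid target definition.
target_name: calibration_target
target_rows: 5
target_cols: 7
circle_diameter: 0.015   # meters
spacing: 0.030           # meters between circle centers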

Performing the extrinsic calibration

Before launching the calibration, start the robot and the driver, put the robot in auto mode, and make sure ROS is able to communicate with both the robot and the sensor.

Launch the procedure:

roslaunch fanuc_r1000ia_sls_2_grinding_resources calibrate_sls_2_r1000ia_grinding.launch davidSDK_ip:=192.168.100.43 robot_ip:=192.168.100.200 sim:=false

In a second terminal:

rosservice call /calibration_service "allowable_cost_per_observation: 2.0"

The robot will start moving around the target and the terminal will display information about each pose capture.

Extrinsic calibration on the Linux computer

At the end of the job, the mutable joint state publisher values are updated and you will see the sensor move in RViz. If the orientation or translation of the sensor changed dramatically, then something is wrong either with your initial mutable joint state values or with the extrinsic calibration results.

The new mutable joint state values are printed in the terminal and a new file is created in the yaml directory. However, it is your job to update the values with the new ones (if they are satisfactory).
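You can sanity-check the calibrated transform with tf; for example (frame names depend on your URDF):

rosrun tf tf_echo base_link sls_2_frame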

Troubleshooting:

