
Best approach to calibrate kinematic parameters #96

Open

@AndyZe (Contributor) opened this issue Aug 6, 2021 · 5 comments

I'm interested in calibrating the kinematic parameters of a manipulator.

The best way I can think of to do this is:

  • Checkerboard mounted on the arm
  • Camera mounted & stationary in the world. Carefully measured so its transform is known

Can you help me confirm that this setup would be correct?

  • Chains: the MoveIt planning group with 6 robot joints
  • Features: checkerboard finder
  • Models: kinematic chain of the manipulator. The frame is the planning frame of this planning group, I suppose?
  • Free params: the 6 joints of the robot
  • Free frames: none, since we're assuming the camera transform is perfectly known
  • Error blocks: camera3d_to_arm
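The setup above might look roughly like the following calibration YAML. This is only a sketch: the joint names, frame names, and action topic are placeholders, and the exact schema should be checked against the robot_calibration version you are using.

```yaml
# Hypothetical sketch of a calibrate.yaml for the setup described above.
# All joint/frame names below are placeholders; check them against your URDF.
chains:
  - name: arm
    topic: /arm_controller/follow_joint_trajectory   # placeholder
    joints: [joint_1, joint_2, joint_3, joint_4, joint_5, joint_6]

models:
  - name: arm
    type: chain
    frame: tool0          # tip of the kinematic chain (see discussion below)

free_params:              # the 6 joint angle offsets to calibrate
  - joint_1
  - joint_2
  - joint_3
  - joint_4
  - joint_5
  - joint_6

error_blocks:
  - name: hand_eye
    type: camera3d_to_arm
    camera: camera
    arm: arm
```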

@mikeferguson (Owner) replied:
Yes, this setup should work. Basically you'd add a fixed joint in your URDF between the camera and the robot arm's root link.

  • Chains - yes, this is pretty much just configuring how to move your arm to the desired poses. Most people use MoveIt for planning by specifying the planning_group parameter.
  • Features - yep, the checkerboard
  • Models - the frame is actually the TIP of the chain - since the ROOT is already known (it's the base_frame that was specified).
  • Free params - yes, the 6 joints of the arm
  • Free frames - I would suggest also letting the camera pose be refined by the calibration. Even if you're super accurate in locating the origin of the camera, small angular offsets are hard to measure. You may need a few more poses to get things to converge.
  • Error blocks: yep, camera3d_to_arm will work for you
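The suggestion to let the camera pose be refined could be sketched like this ("camera_link" is a placeholder for whatever frame your camera has in the URDF):

```yaml
# Hypothetical: also let the optimizer refine the camera mount pose.
free_frames:
  - name: camera_link    # placeholder for your camera frame
    x: true
    y: true
    z: true
    roll: true           # all three rotations free (see the angle-axis
    pitch: true          # warning later in this thread)
    yaw: true
```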

Your setup is pretty standard for this package. You might want to check out the Fetch calibration: https://github.com/fetchrobotics/fetch_ros/tree/melodic-devel/fetch_calibration/config - your files will be pretty similar to capture_checkerboards.yaml and calibrate.yaml. The main things added there are:

  • A head chain (you don't have this)
  • Lots of free frames and parameters (we did camera intrinsics and extrinsics at the same time as joint angle offsets)

I should also mention that I wrote a blog post last year which includes links to a ROSCon talk on this, as well as an (unaccepted for ICRA) paper on how the system works: https://www.robotandchisel.com/2020/04/09/robot-calibration-for-ros/

@AndyZe commented Aug 9, 2021

Thanks! One more question--

The checkerboard will be mounted on the end effector. How does robot_calibration know that the checkerboard is attached to the end effector? And how does it know the transform?

I thought part of the answer is free_frames_initial_values, but isn't that in the base_link frame?

@AndyZe commented Aug 9, 2021

Given this setup:

free_frames:
 - name: checkerboard
   x: true
   y: true
   z: true
   roll: false
   pitch: true
   yaw: true
free_frames_initial_values: # what frame is this in? Is it the parent frame of checkerboard?
 - name: checkerboard
   x: 0.0
   y: 1.0
   z: 2.0
   roll: 0.0
   pitch: 0.0
   yaw: 0.0

I'm guessing it automatically finds a frame named checkerboard and takes those initial coordinates to be in whatever the parent frame of checkerboard is?

@mikeferguson (Owner) replied:
Let's talk about the checkerboard frame first: "checkerboard" is a made-up frame, since it does not exist in the URDF. The name of this frame can actually be set using a ROS parameter of the checkerboard finder. The "checkerboard" frame is then connected to the end of the arm (the "frame" of the "chain", which in MoveIt/KDL terms is actually called the "tip" of the chain). Which arm to connect to is specified by the "chain_sensor_name" parameter (similarly, "camera_sensor_name" can be used to specify which camera to grab images from).
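As a rough illustration, the finder configuration might look like the following. This is a sketch only: the topic is a placeholder, and the exact parameter names (e.g. points_x/points_y) should be verified against the robot_calibration docs for your version.

```yaml
# Hypothetical checkerboard finder configuration.
features:
  checkerboard_finder:
    type: robot_calibration/CheckerboardFinder
    topic: /camera/depth_registered/points   # placeholder point cloud topic
    camera_sensor_name: camera               # which camera grabs images
    chain_sensor_name: arm                   # which chain the board is attached to
    points_x: 5                              # interior corners across
    points_y: 4                              # interior corners down
```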

Next, "free_frames_initial_values" should really be thought of as "free_frames_initial_offsets", as these are basically offsets added to a particular frame transformation. In the case of "checkerboard", since it is virtual, there is no other initial offset. For other frames in the robot, there is usually an offset already specified for the joint/link in the URDF.

In either case, these offsets are in the frame of the parent link - effectively, this specifies the transformation from "wrist_roll_link" (or whichever frame is the "tip" of your manipulator) to "checkerboard" (the origin of the checkerboard).

And finally, one word of warning: you have pitch and yaw as free parameters, and roll as not a free parameter. This does not work as noted in the README: "Roll, pitch and yaw can also be set free, however it is important to note that because calibration internally uses an angle-axis representation, either all 3 should be set free, or only one should be free. You should never set two out of three to be free parameters."
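Per that warning, a valid version of the free_frames block earlier in this thread would free either all three rotations or exactly one of them, never two. For example:

```yaml
# Either free all three rotations of the checkerboard frame...
free_frames:
  - name: checkerboard
    x: true
    y: true
    z: true
    roll: true
    pitch: true
    yaw: true
# ...or free exactly one of roll/pitch/yaw and fix the other two.
```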

Gritty Details Of What Is Actually Happening:

When the checkerboard finder detects a checkerboard, it creates two observations: the first is the 3D poses of the checkerboard points in the camera frame; the second is the ideal poses of those points in the "checkerboard" frame (based on the X/Y corner counts and the known size of the checkerboard squares). By making the transform that connects the checkerboard frame to the wrist a free parameter, we can optimize that transform at the same time as the rest of the robot's free parameters, until the projection of the points through the camera plus kinematics most closely matches the projection of the ideal checkerboard through the arm kinematics.

@AndyZe closed this as completed Jul 14, 2022
@mikeferguson (Owner) replied:
I'm going to reopen this until I transcribe some of this information into the README.
