
cob_calibration

Calibrates cameras, arm and torso of Care-O-bot

Prepare the robot for calibration

The necessary preparation steps are described here.

Automatic camera and robot calibration


Overview

This tutorial guides you through the automatic calibration of the cameras and kinematic components of the Care-O-bot 3. The calibration procedure is divided into two automated steps:

  1. Camera calibration: The stereo camera system of Care-O-bot is calibrated intrinsically.

  2. Kinematic robot calibration: This step performs the extrinsic camera calibration (hand-eye calibration of the stereo camera and the Kinect) and estimates various kinematic parameters of the robot (position and orientation of the arm and torso on the base of Care-O-bot).

Prerequisites

  • roscore is running
  • Care-O-bot bringup software is running.
  • Cameras, head-axis, arm and torso are initialized and working.
  • Calibration pattern is attached to arm.
  • You have created a temporary overlay of "cob_calibration_data"; the calibration results are stored in this unary stack. Delete the overlay again after the calibration process is finished. A minimal sketch of creating such an overlay follows this list.
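
A minimal sketch of creating such an overlay as a catkin workspace. The paths are assumptions, and older rosbuild-based setups would use rosws instead; adapt to your environment:

    mkdir -p ~/calibration_overlay/src
    cd ~/calibration_overlay/src
    git clone https://github.com/ipa320/cob_calibration_data.git
    cd ..
    catkin_make                    # or catkin build, depending on your tooling
    source devel/setup.bash        # this workspace now shadows the installed stack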

Running calibration

  1. Collect data

Start the data collection by calling roslaunch cob_calibration_executive collect_robot_calibration_data.launch.

This will start all needed nodes and services. The robot now moves to the sample positions calculated in step 6 of the configuration.

The progress can be monitored with rostopic echo /calibration/data_collection/progress.

Wait until the capture is finished, then stop it with CTRL-C. The bagfile with the measurements and the images for the camera calibration are stored in "/tmp/cal/". The commands for this step are collected below.
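
For convenience, the commands from this step as a copy-paste block (the ls check at the end is an optional addition):

    # Terminal 1: start data collection (starts all needed nodes and services)
    roslaunch cob_calibration_executive collect_robot_calibration_data.launch

    # Terminal 2: monitor progress; stop with CTRL-C once the capture is finished
    rostopic echo /calibration/data_collection/progress

    # Afterwards, the bagfile and calibration images should appear here
    ls /tmp/cal/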

  2. Calibrate cameras

Execute roslaunch cob_camera_calibration calibrate_stereo.launch to start the stereo camera calibration.

The results will be stored in the calibration file specified in the "camera.yaml" configuration file.

This step should take approximately 3 minutes.

For robots with only one calibrated camera, this step is not required.
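
The command again as a copy-paste block:

    # Intrinsic stereo calibration; takes roughly 3 minutes.
    # Results go to the calibration file specified in camera.yaml.
    roslaunch cob_camera_calibration calibrate_stereo.launch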

  3. Calibrate robot

To calibrate the robot, run roslaunch cob_robot_calibration run_robot_calibration.launch.

The calculation takes about 5 to 10 minutes.
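
As a copy-paste block:

    # Kinematic robot calibration; the optimization takes about 5 to 10 minutes.
    roslaunch cob_robot_calibration run_robot_calibration.launch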

  4. Update URDF

Running roslaunch cob_robot_calibration update_calibration_urdf.launch copies the result of the optimization into the robot's URDF file.
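
As a copy-paste block:

    # Copies the optimization result into the robot URDF file.
    roslaunch cob_robot_calibration update_calibration_urdf.launch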

  5. Restart bringup

Restart the Care-O-bot bringup software so that the updated URDF is loaded.

Final steps

  1. Verify the calibration result in rviz by checking the Kinect point cloud while the arm with the checkerboard is in front of the cameras. The point cloud should align with the robot model of the arm.
  2. Commit the new calibration to git (cob_calibration_data) and push it to GitHub. Create a pull request for ipa320/cob_calibration_data and ask your robot administrator to pull the new calibration to the robot for everybody. A sketch of this workflow follows this list.
  3. Once the pull request has been accepted and the calibration has been updated on the robot, you can remove your local overlay of cob_calibration_data.
  4. Remove the checkerboard from the arm and reattach the hand. (Activate the emergency stop before attaching the Schunk hand.)
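
A minimal sketch of the commit-and-push workflow from step 2, assuming the overlay from above; the branch name and the fork remote are placeholders for your own setup:

    cd ~/calibration_overlay/src/cob_calibration_data  # path from the overlay sketch
    git checkout -b calibration-update                 # hypothetical branch name
    git add -A
    git commit -m "Update calibration for <robot>"     # <robot> is a placeholder
    git push <your-fork> calibration-update
    # Then open a pull request against ipa320/cob_calibration_data on GitHub.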

Configure calibration


  1. Create a new configuration folder in the package "cob_calibration_config", named after the robot.

  2. Create the file "user_defined/cameras.yaml". It defines how many cameras are involved; for each camera, define "topic", "frame_id" (from the robot URDF), "property" (position in the URDF) and "file_prefix" (for the camera calibration). A sketch of such a file follows this list.

  3. Create "calibration_seed.yaml": teach in the seed configuration on the hardware (or in simulation).

  4. Create the "calibration_pattern" configuration: in most cases, copy the existing cb9x6 pattern for robots at the IPA.

  5. Generate the template for the optimization and the autogenerated files: bring up the robot (in simulation or on the real robot) and run roslaunch cob_robot_calibration generate_config.launch.

  6. Generate the calibration positions (the IK services must be running): roslaunch cob_calibration_executive

  7. Create "free_0.yaml" through "free_2.yaml" for Care-O-bot: the first step with only the free cb_arm transformation, the second step with the camera mount positions added, and the third step with all unknown transformations.
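
A minimal sketch of what "user_defined/cameras.yaml" might look like, based only on the fields listed in step 2. The key names, topic names, and frame ids are assumptions; compare against an existing robot configuration in "cob_calibration_config" before using it:

    # Hypothetical example -- values must match your robot's URDF and camera setup.
    cameras:
      - topic: /stereo/left/image_raw      # assumed camera topic
        frame_id: head_cam_l_link          # frame from the robot URDF (assumed name)
        property: cal_offset_left          # position in the URDF (assumed name)
        file_prefix: left                  # prefix for the camera calibration files
      - topic: /stereo/right/image_raw
        frame_id: head_cam_r_link
        property: cal_offset_right
        file_prefix: right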
