OpenRAVE Documentation

calibrationviews Module

Calibrates a camera attached to a robot by moving it around a pattern.


Running the Example: --example calibrationviews


The pattern is attached to the robot gripper, and the robot moves it around to gather data. The example uses visibilitymodel to determine which robot configurations make the pattern fully visible inside the camera view.

It is also possible to calibrate an environment camera with this example using: --example calibrationviews --scene=data/pa10calib_envcamera.env.xml --sensorrobot=ceilingcamera


Although this example does not contain calibration code, the frames of reference are the following:


T_pattern^world and T_camera^link are unknown, while T_pattern^camera and T_link^world are known.
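The relationship between these frames can be sketched with 4x4 homogeneous transforms. The values below are purely hypothetical placeholders; only the composition order follows from the frames named above (in the real example, T_link^world comes from forward kinematics and T_pattern^camera from the pattern detector):

```python
import numpy as np

# Hypothetical values for illustration only.
T_link_world = np.eye(4)                    # known: sensor link pose in the world
T_pattern_camera = np.eye(4)                # known: pattern pose reported by the detector
T_pattern_camera[:3, 3] = [0.0, 0.0, 0.5]   # e.g. pattern 0.5 m in front of the camera
T_camera_link = np.eye(4)                   # unknown: what the calibration estimates

# Composing the chain world <- link <- camera <- pattern yields the other
# unknown, the pattern's pose in the world frame:
T_pattern_world = T_link_world @ T_camera_link @ T_pattern_camera
```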


Usage: [options]

Views a calibration pattern from multiple locations.

  -h, --help            show this help message and exit
  --scene=SCENE         Scene file to load (default=data/pa10calib.env.xml)
  --sensorname=SENSORNAME
                        Name of the sensor whose views to generate (default is first sensor on robot)
  --sensorrobot=SENSORROBOT
                        Name of the robot the sensor is attached to (default=none)
  --norandomize         If set, will not randomize the bodies and robot position in the scene.
  --novisibility        If set, will not perform any visibility searching.
  --noshowsensor        If set, will not show the sensor.
  --posedist=POSEDIST   An average distance between gathered poses. The smaller the value, the more poses the robot will gather close to each other.

  OpenRAVE Environment Options:
    --listplugins       List all plugins and the interfaces they provide.
    --collision=_COLLISION
                        Default collision checker to use
    --physics=_PHYSICS  physics engine to use (default=none)
    --viewer=_VIEWER    viewer to use (default=qtcoin)
    --server=_SERVER    server to use (default=None).
    --serverport=_SERVERPORT
                        port to load server on (default=4765).
    --module=_MODULES   module to load, can specify multiple modules. Two arguments are required: "name" "args".
    -l _LEVEL, --level=_LEVEL, --log_level=_LEVEL
                        Debug level, one of (fatal,error,warn,info,debug,verbose,verifyplans)
    --testmode          if set, will run the program in a finite amount of time and spend computation time validating results. Used for testing.

Main Python Code

import time
from openravepy import Sensor

def main(env,options):
    "Main example code."
    robot = env.GetRobots()[0]
    sensorrobot = None if options.sensorrobot is None else env.GetRobot(options.sensorrobot)
    time.sleep(0.1) # give time for environment to update
    self = CalibrationViews(robot,sensorname=options.sensorname,sensorrobot=sensorrobot,randomize=options.randomize)

    attachedsensor = self.vmodel.attachedsensor
    if options.showsensor and attachedsensor.GetSensor() is not None and attachedsensor.GetSensor().Supports(Sensor.Type.Camera):
        # power the camera on and render its data so the gathered views can be inspected
        attachedsensor.GetSensor().Configure(Sensor.ConfigureCommand.PowerOn)
        attachedsensor.GetSensor().Configure(Sensor.ConfigureCommand.RenderDataOn)

    while True:
        print('computing all locations, might take more than a minute...')
        # body truncated in the original excerpt; sketch of the gathering step,
        # option attribute names inferred from the parser above
        self.computeAndMoveToObservations(usevisibility=options.usevisibility, posedist=options.posedist)
        if options.testmode:
            break

Class Definitions

class openravepy.examples.calibrationviews.CalibrationViews(robot, sensorname=None, sensorrobot=None, target=None, maxvelmult=None, randomize=False)[source]

Starts a calibration sequencer using a robot and a sensor.

The minimum that needs to be specified is the robot and a sensorname. Supports camera sensors that do not belong to the current robot; in this case the IK is computed assuming the target is grabbed by the active manipulator of the robot. Can use the visibility information of the target.

Parameters: sensorrobot – If specified, used to determine what robot the sensor lies on.
computeAndMoveToObservations(waitcond=None, maxobservations=inf, posedist=0.05, usevisibility=True, **kwargs)[source]

Computes several configurations for the robot to move through. If usevisibility is True, will use the visibility model of the pattern to gather data. Otherwise, given that the pattern is currently detected in the camera, moves the robot around the local neighborhood. This does not rely on the visibility information of the pattern and does not create a pattern.

computelocalposes(maxconeangle=0.5, maxconedist=0.15, averagedist=0.03, angledelta=0.2, **kwargs)[source]

Computes robot poses using a cone pointing to the negative z-axis of the camera
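The cone-sampling idea can be sketched as follows. This is a simplified illustration covering only the direction sampling; the function name and spacing scheme are assumptions, not the module's implementation (the real method also varies distance along the cone and in-plane rotation):

```python
import numpy as np

def sample_cone_directions(maxconeangle=0.5, angledelta=0.2):
    """Sample unit view directions inside a cone of half-angle maxconeangle
    (radians) around the camera's negative z-axis, spaced roughly angledelta
    apart. Illustrative sketch only."""
    dirs = []
    for theta in np.arange(0.0, maxconeangle + 1e-9, angledelta):
        # more samples around the circle as the ring radius sin(theta) grows
        n = max(1, int(np.ceil(2 * np.pi * np.sin(theta) / angledelta)))
        for phi in np.linspace(0.0, 2 * np.pi, n, endpoint=False):
            dirs.append([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         -np.cos(theta)])
    return np.array(dirs)
```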

computevisibilityposes(dists=array([ 0.05, 0.25, 0.45, 0.65, 0.85, 1.05, 1.25, 1.45]), orientationdensity=1, num=inf)[source]

Computes robot poses using visibility information from the target.

Samples the transformations of the camera. The camera x and y axes should always be aligned with the x and y axes of the calibration pattern.
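The alignment constraint above can be sketched as a camera pose in the pattern frame. The helper below is hypothetical (not part of the module) and assumes the camera looks along its +z axis:

```python
import numpy as np

def camera_pose_facing_pattern(dist):
    """Camera transform expressed in the pattern frame: placed `dist` along
    the pattern's +z axis, looking back at the pattern, with the camera x/y
    axes aligned to the pattern x/y axes. Illustrative sketch only."""
    T = np.eye(4)
    T[:3, 0] = [1.0, 0.0, 0.0]    # camera x parallel to pattern x
    T[:3, 1] = [0.0, -1.0, 0.0]   # camera y anti-parallel so that...
    T[:3, 2] = [0.0, 0.0, -1.0]   # ...camera z = x cross y points at the pattern
    T[:3, 3] = [0.0, 0.0, dist]
    return T
```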

static gatherCalibrationData(robot, sensorname, waitcond, target=None, **kwargs)[source]

Function to gather calibration data. Relies on an outside waitcond function to return information about the calibration pattern.

moveToConfiguration(config, waitcond=None)[source]

Moves the robot to a configuration.

moveToObservations(poses, configs, waitcond=None, maxobservations=inf, posedist=0.05)[source]

Orders the poses with respect to distance.
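The distance-based ordering can be sketched as a greedy nearest-neighbor pass over the pose translations, so that consecutive observations require only small robot motions. This is an illustrative reimplementation with a made-up function name, not the module's actual code:

```python
import numpy as np

def order_poses_by_distance(translations):
    """Return indices of `translations` (an Nx3 array) in a greedy
    nearest-neighbor order, starting from the first pose."""
    remaining = list(range(len(translations)))
    order = [remaining.pop(0)]
    while remaining:
        last = translations[order[-1]]
        dists = [np.linalg.norm(translations[i] - last) for i in remaining]
        order.append(remaining.pop(int(np.argmin(dists))))
    return order
```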

openravepy.examples.calibrationviews.main(env, options)[source]

Main example code.

openravepy.examples.calibrationviews.run(*args, **kwargs)[source]

Command-line execution of the example.

Parameters: args – arguments for script to parse, if not specified will use sys.argv

