Navigation sensors are found not only on robots, but also on passenger vehicles, on portable devices such as cell phones, and on wheelchairs and white canes that aid people with disabilities. For several applications, sensor-fusion algorithms are employed to combine measurements from multiple sensors that are either rigidly attached to the same vehicle or spatially dispersed and mobile. In either case, the sensors' relative spatial configuration must be known. This introduces the need for sensor-to-sensor extrinsic calibration. Moreover, in order for sensor measurements to be useful for high-precision navigation, the sensors' placement on the host robot must be determined, which brings about the problem of sensor-to-body calibration.

To this day, very little is known about how to rigorously perform sensor-to-sensor and sensor-to-body extrinsic calibration. Typically, the transformations between the sensor and/or robot frames of interest are estimated using expensive calibration equipment, or found through approximate manual measurements and the use of CAD plots. The objectives of this research effort are to develop rigorous methods for motion-induced sensor-to-sensor and sensor-to-body calibration, and to conduct a detailed theoretical study of their performance. The results of this work will significantly improve the quality of, and reduce the effort needed for, multi-sensor calibration, thus providing valuable tools for designing and implementing navigation systems.

Project Report

For accurate autonomous navigation, mobile robots fuse information from multiple onboard navigation sensors, such as cameras, laser scanners, wheel encoders, gyroscopes, and accelerometers, that are rigidly attached at different locations on the robot's body. For example, the camera is typically mounted on an elevated platform to maximize its field of view, while the wheel encoders are attached to the wheels. Since each sensor obtains measurements in its own local frame of reference, the spatial configuration between the sensors must be determined in order to correctly fuse their measurements. This is known as sensor-to-sensor extrinsic calibration. Moreover, in order for sensor measurements to be useful for high-precision navigation, the sensors' placement on the host robot, or relative to the environment, must also be determined, which brings about the problems of sensor-to-body and sensor-to-world calibration. A more challenging version of these problems involves fusing information from the sensors of multiple mobile robots in a team, which requires determining the spatial configuration between the robots themselves.

Until recently, very little was known about how to rigorously perform sensor-to-sensor and sensor-to-body extrinsic calibration. Typically, the unknown transformations between the sensor and/or robot frames of interest were estimated using expensive calibration equipment, which in most cases is available only inside a laboratory. However, since the calibration parameters change over time (e.g., due to vibrations), this approach is unsuitable for recalibrating robots deployed in the field, leading to performance degradation over time. Another approach determines the extrinsic calibration through approximate manual measurements and the use of CAD plots. The limited accuracy of this approach, however, prohibits its use for high-precision navigation tasks such as obstacle avoidance.

To address these drawbacks, the objective of this research effort was to design extrinsic calibration algorithms that use the sensors' own measurements, collected while in motion, to accurately determine the calibration parameters. For this motion-induced extrinsic calibration approach, our research focused on designing algorithms for a variety of extrinsic calibration problems and on addressing fundamental issues, such as determining the minimum amount of information and the motion profile necessary for correctly computing the calibration parameters. In particular, we considered extrinsic calibration problems in both 2D and 3D using distance and/or bearing measurements. These include sensor-to-sensor (wheel encoder-camera, inertial measurement unit-camera, 3D laser scanner-spherical camera), sensor-to-body (camera-body), and sensor-to-world (camera-world) extrinsic calibration for individual robots, as well as robot-to-robot extrinsic calibration for mobile robot teams. Distance measurements are often provided by telecommunication equipment (e.g., based on the time of flight of the signal used for communication), while cameras provide images from which the bearing (direction) towards an object of interest can be extracted. Using these types of measurements, we have carried out a detailed theoretical analysis and provided solutions to both deterministic and probabilistic problem formulations.
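As a concrete illustration of why the extrinsic parameters are needed, the sketch below (a hypothetical Python example, not code from the project) expresses a point observed in a camera's frame in the robot's body frame using an assumed camera-to-body rotation and translation; only after this change of frame can the measurement be fused with data from other body-mounted sensors.

import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Hypothetical camera-to-body extrinsics: the camera sits 0.3 m forward and
# 0.8 m above the body origin, yawed 90 degrees relative to the body frame.
R_body_cam = rot_z(np.pi / 2)           # rotation from camera frame to body frame
p_body_cam = np.array([0.3, 0.0, 0.8])  # camera origin expressed in the body frame

# A landmark observed by the camera, expressed in the camera frame (meters).
p_cam = np.array([2.0, -0.5, 1.0])

# Expressing the same point in the body frame requires the extrinsic parameters:
#   p_body = R_body_cam @ p_cam + p_body_cam
p_body = R_body_cam @ p_cam + p_body_cam
print(p_body)  # -> [0.8, 2.0, 1.8]

An error in either the assumed rotation or the assumed translation corrupts every transformed measurement, which is why the calibration parameters must be estimated accurately rather than read off CAD plots.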
For deterministic problems, which assume that the measurement noise is negligible, we have provided solutions to both over-determined problems (typically, when the number of scalar measurements is greater than the number of unknowns) and minimal problems (typically, when the number of scalar measurements equals the number of unknowns). The advantage of over-determined problems is that they are often easier to solve and their solutions can be computed very fast. In the presence of measurement outliers (incorrect measurements), however, minimal solvers are preferable, since they can be used within hypothesize-and-test outlier-rejection schemes. On the other hand, when the measurement noise cannot be ignored, we have provided solutions to probabilistic formulations of the calibration problems that seek to determine the maximum likelihood (i.e., the most likely) estimate, or at least to minimize a chosen cost function (e.g., a quadratic function of the noise or of the error in the data). For all these scenarios, our main objective has been to provide closed-form or analytical solutions that can be computed in real time.

In summary, the results of this work will significantly improve the quality of, and reduce the effort needed for, multi-sensor calibration, thus providing valuable tools for the accurate design and prompt implementation and deployment of navigation systems. Moreover, this research effort finds applicability in a variety of domains. The algorithms developed are general enough to be used for solving kinematics and inverse kinematics problems in real time and with high accuracy. Similarly, sensor-to-sensor calibration across multiple vehicles is necessary for tasks such as satellite formation flying. Lastly, navigation sensors are found on passenger vehicles, portable devices such as cellphones, and even on navigation aids such as wheelchairs and white canes for the elderly and people with disabilities. For these platforms, this work will enable the development of applications that require high-accuracy sensor fusion.
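To make the probabilistic formulation concrete, the following toy sketch (our own illustration, not one of the project's solvers) estimates the 2D sensor-to-body extrinsics (mounting offset p_x, p_y and yaw offset psi) from bearing measurements to a known landmark taken at known body poses. Under Gaussian noise, the maximum-likelihood estimate minimizes a sum of squared, angle-wrapped residuals, solved here iteratively with scipy.optimize.least_squares; the measurement model, data values, and solver choice are all assumptions made for illustration.

import numpy as np
from scipy.optimize import least_squares

# Synthetic known body poses (x, y, theta) in the world frame (e.g., from
# odometry) and a known landmark position; all values are illustrative.
rng = np.random.default_rng(0)
body_poses = np.column_stack([rng.uniform(-5, 5, 30),
                              rng.uniform(-5, 5, 30),
                              rng.uniform(-np.pi, np.pi, 30)])
landmark = np.array([10.0, 3.0])

# Ground-truth sensor-to-body extrinsics to be recovered: sensor mounted at
# (0.4, 0.1) m in the body frame with a 0.2 rad yaw offset (hypothetical).
true_extr = np.array([0.4, 0.1, 0.2])

def predicted_bearings(extr, poses, lm):
    """Bearing to the landmark in the sensor frame, for each body pose."""
    px, py, psi = extr
    x, y, th = poses.T
    # Sensor position in the world frame: body position plus rotated offset.
    sx = x + np.cos(th) * px - np.sin(th) * py
    sy = y + np.sin(th) * px + np.cos(th) * py
    # Bearing in the sensor frame = world-frame bearing minus sensor heading.
    return np.arctan2(lm[1] - sy, lm[0] - sx) - (th + psi)

# Simulate noisy bearing measurements (Gaussian noise, sigma = 0.5 degrees).
z = predicted_bearings(true_extr, body_poses, landmark)
z += rng.normal(0.0, np.deg2rad(0.5), z.shape)

def residuals(extr):
    # Wrap angular residuals to (-pi, pi] so the cost is well behaved.
    r = z - predicted_bearings(extr, body_poses, landmark)
    return np.arctan2(np.sin(r), np.cos(r))

sol = least_squares(residuals, x0=np.zeros(3))
print(sol.x)  # close to the ground truth [0.4, 0.1, 0.2]

With 30 scalar measurements and 3 unknowns this is an over-determined instance; a minimal instance of the same problem would use only 3 bearings, which is the form suited to hypothesize-and-test outlier rejection.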

Agency
National Science Foundation (NSF)
Institute
Division of Information and Intelligent Systems (IIS)
Type
Standard Grant (Standard)
Application #
0811946
Program Officer
Richard Voyles
Project Start
Project End
Budget Start
2008-09-01
Budget End
2012-08-31
Support Year
Fiscal Year
2008
Total Cost
$397,999
Indirect Cost
Name
University of Minnesota Twin Cities
Department
Type
DUNS #
City
Minneapolis
State
MN
Country
United States
Zip Code
55455