Published May 2012 | public
Book Section - Chapter

Combined shape, appearance and silhouette for simultaneous manipulator and object tracking

Abstract

This paper develops an estimation framework for sensor-guided manipulation of a rigid object via a robot arm. Using an unscented Kalman filter (UKF), the method combines dense range information (from stereo cameras and 3D ranging sensors) with visual appearance features and silhouettes of the object and manipulator to track both an object-fixed frame location and a manipulator tool or palm frame location. If available, tactile data is also incorporated. By using these different imaging sensors and feature types, we can leverage the advantages of each sensor and each feature type to realize more accurate and robust object and reference frame tracking. The method is demonstrated using the DARPA ARM-S system, consisting of a Barrett™ WAM manipulator.
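For illustration only, the sketch below shows the general shape of a UKF measurement update that fuses heterogeneous observations by stacking them into one measurement vector, which is the broad fusion pattern the abstract describes. It is not the paper's implementation: the state, the two hypothetical sensor models (a direct position measurement standing in for range data and a crude projection standing in for an appearance feature), and all noise values are assumptions made up for this example.

# Minimal UKF update sketch (illustrative assumptions, not the paper's method).
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """Generate 2n+1 sigma points and weights for state x with covariance P."""
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)          # matrix square root
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_update(x, P, z, h, R):
    """One UKF measurement update with nonlinear measurement function h."""
    pts, w = sigma_points(x, P)
    Z = np.array([h(p) for p in pts])                # propagate sigma points
    z_pred = w @ Z                                   # predicted measurement
    dZ = Z - z_pred
    dX = pts - x
    S = dZ.T @ (w[:, None] * dZ) + R                 # innovation covariance
    C = dX.T @ (w[:, None] * dZ)                     # state/measurement cross-covariance
    K = C @ np.linalg.inv(S)                         # Kalman gain
    return x + K @ (z - z_pred), P - K @ S @ K.T

# Hypothetical fused measurement: stack a direct 3D position ("range" sensor)
# with a 2D pinhole-style projection ("appearance" feature) into one vector.
def h_fused(x):
    pos = x[:3]
    bearing = pos[:2] / max(pos[2], 1e-6)
    return np.concatenate([pos, bearing])

x = np.array([0.5, 0.1, 1.0])                        # toy object position estimate
P = np.eye(3) * 0.05
R = np.diag([0.01, 0.01, 0.01, 0.002, 0.002])        # per-sensor noise, stacked
z = np.array([0.52, 0.08, 1.03, 0.50, 0.09])         # stacked observation
x, P = ukf_update(x, P, z, h_fused, R)
print(x)

Stacking measurements this way lets each sensor contribute according to its own noise model, which is one common way such multi-sensor UKF fusion is organized; the paper's actual state and measurement models are more involved.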

Additional Information

© 2012 IEEE. Date of Current Version: 28 June 2012. The author gratefully acknowledges the support from the Natural Sciences and Engineering Research Council of Canada (NSERC). The research described in this publication was carried out at the Jet Propulsion Laboratory, California Institute of Technology, with funding from the DARPA Autonomous Robotic Manipulation Software Track (ARM-S) program and the U.S. Army under the Robotics Collaborative Technology Alliance through an agreement with NASA.

Additional details

Created: August 19, 2023
Modified: October 17, 2023