Dual arm estimation for coordinated bimanual manipulation
Abstract
This paper develops an estimation framework for sensor-guided dual-arm manipulation of a rigid object. Using an unscented Kalman filter (UKF), the approach fuses visual and kinesthetic information to track both the manipulators and the object. From visual updates of the object and manipulators, together with tactile updates, the method estimates both the robot's internal state and the object's pose. Nonlinear constraints are incorporated into the framework to handle the additional arm and ensure the state remains consistent. Two frameworks are compared: the first runs two single-arm filters in parallel, while the second consists of an augmented dual-arm filter with nonlinear constraints. Experiments on a wheel-changing task are demonstrated using the DARPA ARM-S system, consisting of dual Barrett WAM manipulators.
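To make the constrained-filtering idea concrete, below is a minimal sketch, not the paper's implementation: it generates unscented sigma points and projects each onto a nonlinear constraint manifold so the resulting estimate is constraint-consistent. The scalar fixed-grasp-distance constraint, the function names (`sigma_points`, `project`, `grad_g`), and all parameter values are illustrative assumptions standing in for the paper's closed-chain dual-arm constraints.

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    """Standard 2n+1 unscented sigma points for a Gaussian (mean: (n,), cov: (n, n))."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)   # columns are scaled sigma directions
    return np.vstack([mean, mean + S.T, mean - S.T])

def ut_weights(n, kappa=0.0):
    """Matching unscented-transform mean weights."""
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return w

def project(x, g, grad_g, iters=20, tol=1e-10):
    """Project x onto the manifold g(x) = 0 via minimum-norm Newton steps (scalar g)."""
    for _ in range(iters):
        r = g(x)
        if abs(r) < tol:
            break
        J = grad_g(x)
        x = x - J * (r / (J @ J))
    return x

# Toy grasp constraint in the plane: the two end-effector points p1, p2 stored in
# the state x = [p1, p2] must keep a fixed separation d while holding the object.
d = 1.0
def g(x):
    return np.linalg.norm(x[:2] - x[2:]) - d

def grad_g(x):
    u = (x[:2] - x[2:]) / np.linalg.norm(x[:2] - x[2:])
    return np.concatenate([u, -u])

mean = np.array([0.0, 0.0, 1.3, 0.1])   # post-update estimate, slightly off the manifold
cov = 0.01 * np.eye(4)

pts = np.array([project(p, g, grad_g) for p in sigma_points(mean, cov)])
print(ut_weights(mean.size) @ pts)       # constraint-consistent mean estimate
```

In this hypothetical setup the projection step runs after the usual UKF measurement update, pulling each sigma point back onto the constraint surface before recombining them, which is one common way to enforce state consistency in a constrained UKF.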
Additional Information
© 2013 IEEE. The author gratefully acknowledges the support of the Natural Sciences and Engineering Research Council of Canada (NSERC). The research described in this publication was carried out at the Jet Propulsion Laboratory, California Institute of Technology, with funding from the DARPA Autonomous Robotic Manipulation Software Track (ARM-S) program and the U.S. Army under the Robotics Collaborative Technology Alliance through an agreement with NASA.
Additional details
- Eprint ID
- 73835
- Resolver ID
- CaltechAUTHORS:20170130-165743938
- Funders
- Natural Sciences and Engineering Research Council of Canada (NSERC)
- Defense Advanced Research Projects Agency (DARPA)
- U.S. Army Robotics Collaborative Technology Alliance
- NASA/JPL/Caltech
- Created
- 2017-01-31
- Updated
- 2021-11-11