Inertial-Aided Vision-Based Localization and Mapping in a Riverine Environment with Reflection Measurements
Abstract
This paper presents an inertial-aided vision-based localization and mapping algorithm for an unmanned aerial vehicle (UAV) that can operate in a GPS-denied riverine environment. We take vision measurements from features surrounding the river and from their corresponding points reflected in the water. We apply a robot-centric mapping framework so that the uncertainty of the features is referenced to the UAV body frame, and we estimate the 3D positions of point features while estimating the location of the UAV. We demonstrate the localization and mapping results with sensors on our quadcopter UAV platform at the Boneyard Creek on the University of Illinois at Urbana-Champaign campus. The UAV is equipped with a lightweight monocular camera, an inertial measurement unit (IMU) containing a magnetometer, an ultrasound altimeter, and an on-board computer. To our knowledge, this is the first reported result of performing localization and mapping by exploiting multiple views with reflections of features in a river-like environment.
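To illustrate why a reflection measurement is informative, the following is a minimal sketch (not the authors' implementation) of how one view of a feature and of its reflection constrains the feature's 3D position: assuming a flat water surface at z = 0 and a camera height known from the altimeter, the reflected observation is equivalent to a view from a virtual camera mirrored across the water plane, so the two rays can be triangulated as a stereo pair. Function and variable names here are hypothetical.

```python
import numpy as np

def triangulate_with_reflection(cam_pos, ray_feat, ray_refl):
    """Estimate a feature's 3D position from one camera view of the feature
    and of its reflection in a flat water surface (the plane z = 0).

    cam_pos  : (3,) real camera position; its z component is the height above the water
    ray_feat : (3,) unit ray from the camera toward the feature
    ray_refl : (3,) unit ray from the camera toward the feature's reflection
    """
    mirror = np.diag([1.0, 1.0, -1.0])    # reflection across the plane z = 0
    cam_virtual = mirror @ cam_pos        # virtual camera mirrored below the water
    ray_virtual = mirror @ ray_refl       # mirrored reflection ray points at the real feature

    # Least-squares (midpoint) triangulation of the two rays.
    def projector(d):
        return np.eye(3) - np.outer(d, d)  # projects onto the plane orthogonal to d

    A = projector(ray_feat) + projector(ray_virtual)
    b = projector(ray_feat) @ cam_pos + projector(ray_virtual) @ cam_virtual
    return np.linalg.solve(A, b)

# Example: camera 2 m above the water observing a point 5 m ahead and 3 m above the surface.
cam = np.array([0.0, 0.0, 2.0])
p_true = np.array([5.0, 0.0, 3.0])
p_mirror = np.array([5.0, 0.0, -3.0])
r_feat = (p_true - cam) / np.linalg.norm(p_true - cam)
r_refl = (p_mirror - cam) / np.linalg.norm(p_mirror - cam)
print(triangulate_with_reflection(cam, r_feat, r_refl))  # ~ [5, 0, 3]
```

This single-view construction is only illustrative; the paper fuses such measurements over multiple views together with IMU, magnetometer, and altimeter data in the robot-centric mapping framework.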
Additional Information
© 2013 American Institute of Aeronautics and Astronautics. This material is based in part upon work supported by the Office of Naval Research (N00014-11-1-0088) and John Deere. We would like to thank Xichen Shi, Sunil Patel, Martin Miller, Simon Peter, Ghazaleh Panahandehnigjeh, and Shubham Gupta for their help.
Additional details
- Eprint ID: 72430
- DOI: 10.2514/6.2013-5246
- Resolver ID: CaltechAUTHORS:20161130-084925184
- Office of Naval Research (ONR): N00014-11-1-0088
- Created: 2016-11-30
- Updated: 2021-11-11
- Caltech groups: GALCIT
- Other Numbering System Name: AIAA Paper
- Other Numbering System Identifier: 2013-5246