Monocular Vision SLAM for Indoor Aerial Vehicles
Abstract
This paper presents a novel indoor navigation and ranging strategy using a monocular camera. The proposed algorithms are integrated with simultaneous localization and mapping (SLAM), with a focus on indoor aerial vehicle applications. We experimentally validate the proposed algorithms using a fully self-contained micro aerial vehicle (MAV) with on-board image processing and SLAM capabilities. The range-measurement strategy is inspired by the key adaptive mechanisms for depth perception and pattern recognition found in humans and intelligent animals. The navigation strategy assumes an unknown, GPS-denied environment that is representable via corner-like feature points and straight architectural lines. Experimental results show that the system is limited only by the capabilities of the camera and the availability of good corners.
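The paper itself does not publish source code, but the two feature classes the abstract relies on, corner-like points and straight architectural lines, can be illustrated with standard computer-vision tooling. The sketch below is an assumption-laden example built on OpenCV's Shi-Tomasi corner detector and probabilistic Hough transform; it is not the authors' implementation, and the function name `extract_indoor_features`, the test image path, and all detector parameters are hypothetical placeholders.

```python
# Illustrative sketch only (not the paper's code): extract corner-like feature
# points and straight line segments from a single monocular frame, the two
# cues the abstract's navigation strategy assumes are present indoors.
import cv2
import numpy as np

def extract_indoor_features(frame_bgr):
    """Return (corner points, line segments) detected in one camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Corner-like feature points (candidate SLAM landmarks), Shi-Tomasi detector.
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
    corners = np.empty((0, 2)) if corners is None else corners.reshape(-1, 2)

    # Straight architectural lines (walls, door frames, ceiling edges):
    # Canny edges followed by a probabilistic Hough transform.
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(
        edges, rho=1, theta=np.pi / 180, threshold=80,
        minLineLength=60, maxLineGap=10)
    lines = np.empty((0, 4)) if lines is None else lines.reshape(-1, 4)

    return corners, lines

if __name__ == "__main__":
    frame = cv2.imread("indoor_frame.png")  # hypothetical test image
    if frame is not None:
        pts, segs = extract_indoor_features(frame)
        print(f"{len(pts)} corners, {len(segs)} line segments")
```

In a monocular SLAM pipeline of the kind the abstract describes, detections like these would feed landmark initialization and ranging; the thresholds above are placeholders and would need tuning for a specific camera and environment.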
Additional Information
© 2009 IEEE. Date of Conference: 10-15 Oct. 2009. Date Added to IEEE Xplore: 15 December 2009. The research reported in this paper was supported in part by the National Science Foundation (Grant ECCS-0428040), the Information Infrastructure Institute (I3), the Department of Aerospace Engineering and the Virtual Reality Application Center at Iowa State University, and the Air Force Office of Scientific Research.
Attached Files
Published - 05354050.pdf
Files
Name | md5 | Size
---|---|---
05354050.pdf | 0448ea5c130666ca0421d94d72088932 | 1.8 MB
Additional details
- Eprint ID: 72258
- Resolver ID: CaltechAUTHORS:20161122-144014712
- Funders: NSF (ECCS-0428040), Iowa State University, Air Force Office of Scientific Research (AFOSR)
- Created: 2016-11-22 (from EPrint's datestamp field)
- Updated: 2021-11-11 (from EPrint's last_modified field)
- Caltech groups: GALCIT