Published May 25, 2016
Book Section - Chapter | Open Access

Stereo vision-based obstacle avoidance for micro air vehicles using an egocylindrical image space representation


Abstract

Micro air vehicles that operate autonomously at low altitude in cluttered environments require onboard obstacle avoidance for safe operation. Previous methods are either purely reactive, mapping low-level visual features directly to actuator inputs to maneuver the vehicle around obstacles, or deliberative, using on-board 3-D sensors to build a voxel-based 3-D world model from which collision-free 3-D trajectories are generated. In this paper, we use forward-looking stereo vision with a large horizontal and vertical field of view and project stereo range data into a novel robot-centered, cylindrical, inverse-range map we call an egocylinder. This reduces the complexity of the world representation from a 3-D map to a 2.5-D image-space representation, which supports very efficient motion planning and collision checking and allows configuration-space expansion to be implemented as an image-processing function directly on the egocylinder. Deploying a fast reactive motion planner directly on the configuration-space-expanded egocylinder image, we demonstrate the effectiveness of this new approach experimentally in an indoor environment.
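To illustrate the two image-space operations the abstract describes, the following Python/NumPy sketch projects a metric depth image into a cylindrical inverse-range image and then performs configuration-space expansion as a grayscale dilation. This is not the authors' implementation: the cylinder parameterization, the parameter names (v_scale, window), and the fixed-size dilation window are illustrative assumptions. In particular, a faithful version would scale the expansion window with range, since a fixed metric vehicle radius subtends more pixels at close range.

    import numpy as np
    from scipy.ndimage import grey_dilation

    def project_to_egocylinder(depth, fx, fy, cx, cy,
                               cols=720, rows=240, v_scale=0.5):
        """Project a metric depth image into a cylindrical inverse-range map.

        Columns span 360 degrees of azimuth; rows span elevation on the
        cylinder, parameterized here as height over radial distance (an
        assumed layout, not necessarily the paper's).
        """
        h, w = depth.shape
        us, vs = np.meshgrid(np.arange(w), np.arange(h))
        valid = depth > 0
        z = depth[valid]                      # camera frame: z forward
        x = (us[valid] - cx) / fx * z         # x right
        y = (vs[valid] - cy) / fy * z         # y down
        radial = np.sqrt(x**2 + z**2)         # distance to the cylinder axis
        azimuth = np.arctan2(x, z)            # [-pi, pi), 0 = straight ahead
        col = ((azimuth + np.pi) / (2 * np.pi) * cols).astype(int) % cols
        row = np.clip(((y / radial) / v_scale * 0.5 + 0.5) * rows,
                      0, rows - 1).astype(int)
        ego = np.zeros((rows, cols), dtype=np.float32)  # 0 = no return
        # Keep the nearest return (largest inverse range) per cell.
        np.maximum.at(ego, (row, col), 1.0 / np.sqrt(x**2 + y**2 + z**2))
        return ego

    def expand_cspace(ego, window=(9, 9)):
        """Configuration-space expansion directly on the egocylinder image.

        Grayscale dilation (a max filter on inverse range) grows each
        obstacle by roughly the vehicle radius; the fixed window here is a
        simplification of a range-dependent window size.
        """
        return grey_dilation(ego, size=window)

A reactive planner can then operate directly on the expanded image, for example by steering toward the cell with the smallest inverse range (the most open direction) closest to the goal heading.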

Additional Information

© 2016 Society of Photo-Optical Instrumentation Engineers (SPIE). This work was funded by the Army Research Laboratory under the Micro Autonomous Systems & Technology Collaborative Technology Alliance program (MAST-CTA). JPL contributions were carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

Files

Published - 98361R.pdf (972.9 kB)
md5:58ec43e444cec6bcd6114e555aff8f31

Additional details

Created: August 20, 2023
Modified: January 14, 2024