Published June 2014 | Published + Supplemental Material
Journal Article | Open Access

Monocular distance estimation from optic flow during active landing maneuvers

Abstract

Vision is arguably the most widely used sensor for position and velocity estimation in animals, and it is increasingly used in robotic systems as well. Many animals use stereopsis and object recognition in order to make a true estimate of distance. For a tiny insect such as a fruit fly or honeybee, however, these methods fall short. Instead, an insect must rely on calculations of optic flow, which can provide a measure of the ratio of velocity to distance, but not either parameter independently. Nevertheless, flies and other insects are adept at landing on a variety of substrates, a behavior that inherently requires some form of distance estimation in order to trigger distance-appropriate motor actions such as deceleration or leg extension. Previous studies have shown that these behaviors are indeed under visual control, raising the question: how does an insect estimate distance solely using optic flow? In this paper we use a nonlinear control theoretic approach to propose a solution for this problem. Our algorithm takes advantage of visually controlled landing trajectories that have been observed in flies and honeybees. Finally, we implement our algorithm, which we term dynamic peering, using a camera mounted to a linear stage to demonstrate its real-world feasibility.
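The key obstacle described in the abstract is that optic flow alone only constrains the ratio of velocity to distance, not either quantity separately; a known, actively commanded acceleration makes both observable from the time history of that ratio, because acceleration carries metric scale. The sketch below illustrates that observability principle. It is not the paper's dynamic peering implementation (which uses a square-root unscented Kalman filter and a camera mounted to a linear stage); it is a minimal Python simulation with a plain extended Kalman filter, and all constants (initial distance and speed, noise levels, the sinusoidal "peering" acceleration) are illustrative assumptions rather than values from the paper.

```python
# Hedged sketch: estimating distance and speed from optic-flow (v/d)
# measurements during an active maneuver. This is NOT the paper's
# implementation (which uses a square-root unscented Kalman filter);
# it is a minimal extended Kalman filter on the same kind of state model.
# All numbers below are illustrative assumptions, not values from the paper.
import numpy as np

dt, T = 0.01, 2.0
steps = int(T / dt)

def accel(t):
    # Known, actively commanded acceleration: a steady deceleration plus a
    # small oscillation standing in for the paper's "peering" motion.
    return -0.1 + 0.5 * np.sin(8.0 * t)

# --- simulate the true approach (d = distance to surface, v = closing speed) ---
d, v = 2.0, 0.8
true_states, flow_meas = [], []
rng = np.random.default_rng(0)
for k in range(steps):
    t = k * dt
    true_states.append((d, v))
    flow_meas.append(v / d + rng.normal(0.0, 0.05))  # noisy optic flow = v/d
    d -= v * dt
    v += accel(t) * dt

# --- extended Kalman filter on state x = [d, v] ---
x = np.array([1.0, 0.5])          # deliberately wrong initial guess
P = np.diag([1.0, 1.0])
Q = np.diag([1e-6, 1e-4])         # process noise covariance (assumed)
R = 0.05 ** 2                     # matches the simulated measurement noise

F = np.array([[1.0, -dt],
              [0.0,  1.0]])       # Euler-discretized dynamics: d' = -v, v' = a
for k in range(steps):
    t = k * dt
    # predict using the known commanded acceleration
    x = np.array([x[0] - x[1] * dt, x[1] + accel(t) * dt])
    P = F @ P @ F.T + Q
    # update with the optic-flow measurement h(x) = v/d
    h = x[1] / x[0]
    H = np.array([[-x[1] / x[0] ** 2, 1.0 / x[0]]])
    S = H @ P @ H.T + R
    K = (P @ H.T) / S
    x = x + (K * (flow_meas[k] - h)).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"true d = {true_states[-1][0]:.3f} m, estimated d = {x[0]:.3f} m")
```

Without the injected acceleration, the pair (d, v) is observable from v/d only up to a common scale factor, which is why some actively generated maneuver is essential for true distance estimation.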

Additional Information

© 2014 Institute of Physics. Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

Received 21 July 2013; revised 1 October 2013; accepted for publication 4 October 2013; published 22 May 2014.

The authors wish to thank Dr Michael Elzinga for help with constructing the linear rail mechanism used in our robotic implementation, and Nathan Powell for providing MATLAB code to run the square-root implementation of the unscented Kalman filter.

Funding was provided by the Hertz Foundation Graduate Research Fellowship (awarded to FvB), an NSF Graduate Research Fellowship (awarded to FvB), the Air Force Office of Scientific Research (FA9550-10-1-0368), and the Paul G. Allen Family Foundation Distinguished Investigator Award (awarded to MHD). The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication. The authors declare that no competing interests exist.

Attached Files

Published - 1748-3190_9_2_025002.pdf (934.7 kB; md5:080e11af26282c55953b84e763f41b08)

Supplemental Material - bb481365movie.mov (24.9 MB; md5:c960fae291e86d895a350d1549ae837b)

Total size: 25.8 MB

Additional details

Created: August 20, 2023
Modified: October 26, 2023