Published January 2019 | Accepted Version
Conference Paper | Open Access

Robust Features Extraction for On-board Monocular-based Spacecraft Pose Acquisition

Abstract

This paper presents the design, implementation, and validation of a robust feature extraction architecture for real-time, on-board, monocular vision-based pose initialization of a target spacecraft, in application to on-orbit servicing and formation flying. The proposed computer vision algorithm detects the most significant features of an uncooperative target spacecraft in a sequence of two-dimensional input images collected on board the chaser spacecraft. A novel approach based on the fusion of multiple parallel processing streams is proposed to filter out spurious detections and retain a minimum number of true point features, even under unfavourable illumination conditions and in the presence of Earth in the background. These features are then combined into relevant polyline structures that characterize the true geometrical shape of the target spacecraft.
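The fusion idea described above can be sketched in a minimal, hypothetical form: several parallel detection streams each propose candidate 2-D point features, and only points confirmed by a minimum number of streams (within a pixel tolerance) are kept as true features. All function names, parameters, and data here are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of multi-stream feature fusion (not the paper's code):
# points proposed by one stream only (e.g. glare or Earth-background clutter)
# are rejected; points confirmed across streams are retained.

from math import hypot

def fuse_streams(streams, tol=2.0, min_votes=2):
    """Keep points confirmed by at least `min_votes` detection streams.

    streams: list of lists of (x, y) pixel coordinates, one list per
             parallel processing stream.
    Returns confirmed points, deduplicated within `tol` pixels.
    """
    confirmed = []
    for stream in streams:
        for p in stream:
            # Count how many streams have a point within tol pixels of p.
            votes = sum(
                1 for other in streams
                if any(hypot(p[0] - q[0], p[1] - q[1]) <= tol for q in other)
            )
            already = any(
                hypot(p[0] - c[0], p[1] - c[1]) <= tol for c in confirmed
            )
            if votes >= min_votes and not already:
                confirmed.append(p)
    return confirmed

# Two streams agree on (10, 10) and (50, 20); the remaining points appear
# in only one stream each and are therefore dropped.
stream_a = [(10.0, 10.0), (50.0, 20.0), (90.0, 90.0)]
stream_b = [(10.5, 9.8), (49.7, 20.3), (5.0, 70.0)]
print(fuse_streams([stream_a, stream_b]))  # → [(10.0, 10.0), (50.0, 20.0)]
```

In a real pipeline the streams would come from distinct detectors (e.g. corner and edge responses) run in parallel on the same image; the voting step is what makes the combined output robust to per-detector false positives.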

Additional Information

© 2019 AIAA. AIAA Paper 2019-2005. The first author would like to thank the Swiss National Science Foundation (SNSF), which supported him during this research. The authors also gratefully acknowledge partial funding from the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration (NASA) in support of this work. The authors thank F. Y. Hadaegh, A. Stoica, and M. Wolf. In addition, the authors would like to thank Techno System Development s.r.l., which kindly made the images of Tango available for this research study.

Attached Files

Accepted Version - SciTech_2019_paper_Capuano.pdf (13.8 MB, md5:65543534e5c53280865b090879255821)

Additional details

Created: August 19, 2023
Modified: October 20, 2023