Published January 2013 | public
Journal Article

Detecting Motion through Dynamic Refraction

Abstract

Refraction causes random dynamic distortions in atmospheric turbulence and in views across a water interface. The latter scenario is experienced by submerged animals seeking to detect prey or avoid predators, which may be airborne or on land. Humans encounter it when surveying a scene from a submarine, or as divers, while wishing to avoid an attention-drawing periscope. Inverting random refracted dynamic distortions is difficult, particularly when some of the objects in the field of view (FOV) are moving. In many cases, however, precisely those moving objects are of interest, as they reveal animal, human, or machine activity. Furthermore, detecting and tracking these objects does not require the difficult task of fully recovering the scene. We show that moving objects can be detected very simply, with low false-positive rates, even when the distortions are very strong and dominate the object motion. Moreover, a moving object can be detected even if its mean motion is zero. While the object and distortion motions are random and unknown, they are mutually independent. We express this independence in a simple motion feature that discriminates moving-object points from the background.
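
For illustration only, the sketch below is not the paper's motion feature; it is a minimal Python proxy for the idea that background points only jitter with the random refraction, while a moving object adds an independent motion component on top of it. It computes dense optical flow over a short clip and flags pixels whose temporal flow variance is high relative to a robust background estimate. The function names motion_feature and detect_moving, the variance cue, and the MAD threshold are assumptions; OpenCV and NumPy are assumed to be available.

```python
import cv2
import numpy as np

def motion_feature(frames, win=15):
    """Per-pixel temporal variance of dense optical flow (toy cue, not the paper's feature).

    frames: list of BGR uint8 images of identical size.
    Under this simplified model, background pixels only carry the random
    refraction jitter, while moving-object pixels carry an additional,
    independent motion component, raising their flow variance.
    """
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    flows = []
    for f in frames[1:]:
        nxt = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)
        # Farneback dense optical flow between consecutive frames.
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, win, 3, 5, 1.2, 0)
        flows.append(flow)
        prev = nxt
    flows = np.stack(flows)            # shape (T-1, H, W, 2)
    return flows.var(axis=0).sum(-1)   # per-pixel variance over time, summed over (u, v)

def detect_moving(frames, k=3.0):
    """Flag pixels whose feature exceeds a robust (median + k*MAD) background level."""
    feat = motion_feature(frames)
    med = np.median(feat)
    mad = np.median(np.abs(feat - med)) + 1e-9
    return feat > med + k * mad
```

In practice this variance cue is weak when the distortion dominates the object motion, which is exactly the regime the paper addresses; it is shown only to make the independence argument concrete, with all parameter choices (window size, threshold factor k) illustrative.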

Additional Information

© 2013 IEEE. Published by the IEEE Computer Society. Manuscript received 7 Aug. 2011; revised 25 July 2012; accepted 22 Aug. 2012; published online 4 Sept. 2012. Recommended for acceptance by T. Pajdla. The authors would like to thank the anonymous reviewers for their useful comments. They thank Tomer Schechner, Yotam Michael, Amit Aides, and Amit Oved for their participation and help in the experiments. Experiments were conducted in the swimming pool facilities of Caltech, and the authors are grateful for this. They appreciate the occasional visitors in the Caltech pool and spa for useful discussions. They also thank Ohad Ben-Shahar for useful discussions about fish orientation. They thank Yuandong Tian and Srinivasa G. Narasimhan for sharing their code for distortion estimation. Yoav Schechner is a Landau Fellow—supported by the Taub Foundation. Marina Alterman's research is supported by the R.L. Kohns Eye Research Fund. This work relates to US Department of the Navy Grant N62909-10-1-4056 issued by the Office of Naval Research Global. The United States Government has a royalty-free license throughout the world in all copyrightable material contained herein.

Additional details

Created: August 22, 2023
Modified: October 20, 2023