Journal Article | Open Access
Published September 2008

Class-Based Feature Matching Across Unrestricted Transformations

Abstract

We develop a novel method for class-based feature matching across large changes in viewing conditions. The method is based on the property that when objects share a similar part, the similarity is preserved across viewing conditions. Given a feature and a training set of object images, we first identify the subset of objects that share this feature. The transformation of the feature's appearance across viewing conditions is determined mainly by properties of the feature, rather than of the object in which it is embedded. Therefore, the transformed feature will be shared by approximately the same set of objects. Based on this consistency requirement, corresponding features can be reliably identified from a set of candidate matches. Unlike previous approaches, the proposed scheme compares feature appearances only in similar viewing conditions, rather than across different viewing conditions. As a result, the scheme is not restricted to locally planar objects or affine transformations. The approach also does not require examples of correct matches. We show that by using the proposed method, a dense set of accurate correspondences can be obtained. Experimental comparisons demonstrate that matching accuracy is significantly improved over previous schemes. Finally, we show that the scheme can be successfully used for invariant object recognition.
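To make the consistency idea in the abstract concrete, the sketch below illustrates the general principle: a feature in one view and a candidate feature in another view are matched if the subsets of training objects that contain them largely coincide, with appearance compared only within a single viewing condition. This is a minimal illustration, not the authors' implementation; the helper names (`sharing_set`, `match_fn`), the presence threshold, and the Jaccard overlap score are all assumptions introduced here for clarity.

```python
def sharing_set(feature_patch, training_images, match_fn, threshold):
    """Return the ids of training objects whose images contain the feature.

    training_images maps object id -> image taken in the SAME viewing
    condition as feature_patch, so appearance is never compared across
    viewing conditions. match_fn is a hypothetical patch-to-image
    similarity (e.g., normalized cross-correlation over a sliding window).
    """
    return {obj for obj, img in training_images.items()
            if match_fn(feature_patch, img) >= threshold}


def best_consistent_match(feature_a, candidates_b, train_view_a, train_view_b,
                          match_fn, threshold):
    """Pick the candidate in view B whose sharing set best agrees
    (Jaccard overlap) with the sharing set of feature_a in view A."""
    set_a = sharing_set(feature_a, train_view_a, match_fn, threshold)
    best, best_score = None, -1.0
    for cand in candidates_b:
        set_b = sharing_set(cand, train_view_b, match_fn, threshold)
        union = set_a | set_b
        score = len(set_a & set_b) / len(union) if union else 0.0
        if score > best_score:
            best, best_score = cand, score
    return best, best_score
```

Under these assumptions, the candidate whose sharing set is most consistent with that of the query feature is taken as its correspondence, which mirrors the consistency requirement described above without ever comparing appearance across different viewing conditions.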

Additional Information

© Copyright 2008 IEEE. Reprinted with permission. Manuscript received 29 June 2006; revised 13 Feb. 2007; accepted 8 Oct. 2007; published online 16 Oct. 2007. [Date Published in Issue: 2008-07-16] Recommended for acceptance by L. Van Gool. This work was supported by ISF Grant 7-0369 and IMOS Grant 3-992. Dr. Bart was a postdoctoral associate at the Institute for Mathematics and Its Applications at the University of Minnesota while performing parts of this work. Portions of the research in this paper use the FERET database of facial images collected under the FERET program [30]. An early version of this paper appeared in [35].

Attached Files

Published - BARieeetpami08.pdf (3.6 MB, md5:8154106d32ecfd1cd338cabe56bb9ccc)

Additional details

Created: August 22, 2023
Modified: October 16, 2023