Published September 2010
Book Section - Chapter (Open Access)

The Fastest Pedestrian Detector in the West

Abstract

We demonstrate a multiscale pedestrian detector operating in near real time (~6 fps on 640x480 images) with state-of-the-art detection performance. The computational bottleneck of many modern detectors is the construction of an image pyramid, typically sampled at 8-16 scales per octave, and associated feature computations at each scale. We propose a technique to avoid constructing such a finely sampled image pyramid without sacrificing performance: our key insight is that for a broad family of features, including gradient histograms, the feature responses computed at a single scale can be used to approximate feature responses at nearby scales. The approximation is accurate within an entire scale octave. This allows us to decouple the sampling of the image pyramid from the sampling of detection scales. Overall, our approximation yields a speedup of 10-100 times over competing methods with only a minor loss in detection accuracy of about 1-2% on the Caltech Pedestrian dataset across a wide range of evaluation settings. The results are confirmed on three additional datasets (INRIA, ETH, and TUD-Brussels) where our method always scores within a few percent of the state-of-the-art while being 1-2 orders of magnitude faster. The approach is general and should be widely applicable.
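
To make the key insight concrete, the sketch below (Python, not the authors' code) shows how a gradient-magnitude channel computed once can be rescaled to approximate the same channel at a nearby scale via a power law of the form f(I_s) ≈ s^(-λ) · f(I), following the scaling relation the abstract describes. The exponent λ is feature-dependent and estimated empirically in the paper; the value 0.11 used here is a placeholder, and the helper names are our own.

    import numpy as np
    from scipy.ndimage import zoom

    def gradient_magnitude(img):
        # One of the channel features covered by the paper's analysis:
        # per-pixel gradient magnitude of a grayscale image.
        gy, gx = np.gradient(img.astype(np.float64))
        return np.hypot(gx, gy)

    def approx_channel_at_scale(channel, s, lam=0.11):
        # Approximate the channel at relative scale s (s < 1 downsamples)
        # from the channel computed once at the original scale, using a
        # power-law correction f(I_s) ~ s**(-lam) * f(I). The exponent lam
        # is feature-dependent and estimated empirically in the paper;
        # 0.11 is only an illustrative placeholder.
        return zoom(channel, s, order=1) * (s ** -lam)

    # Usage: compare the approximation against computing the feature
    # exactly at the target scale. The relation is meant to hold for
    # aggregate responses (e.g., histogram sums over regions), not
    # pixel for pixel.
    img = np.random.rand(480, 640)
    exact = gradient_magnitude(zoom(img, 0.5, order=1))
    approx = approx_channel_at_scale(gradient_magnitude(img), 0.5)
    print(abs(exact.sum() - approx.sum()) / exact.sum())

Because one explicitly computed scale per octave suffices under this approximation, the detector can scan a finely sampled set of scales while constructing only a coarsely sampled pyramid, which is where the reported speedup comes from.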

Additional Information

© 2010. The copyright of this document resides with its authors. It may be distributed unchanged freely in print or electronic forms. P. P. was supported by ONR MURI Grant #N00014-06-1-0734. S. B. was supported in part by NSF CAREER Grant #0448615 and ONR MURI Grant #N00014-08-1-0638.

Files

Published - paper68.pdf (1.5 MB)
md5:f091950b92b17e6c698fed338325a205

Additional details

Created: August 19, 2023
Modified: October 20, 2023