Published 1994
Book Section - Chapter

LMS and backpropagation are minimax filters

Abstract

An important problem that arises in many applications is the following adaptive problem: given a sequence of n × 1 input column vectors {h_i} and a corresponding sequence of desired scalar responses {d_i}, find an estimate of an n × 1 column vector of weights w such that the sum of squared errors, ∑_{i=0}^{N} |d_i − h_i^T w|^2, is minimized. The {h_i, d_i} are most often presented sequentially, and one is therefore required to find an adaptive scheme that recursively updates the estimate of w. The least-mean-squares (LMS) algorithm was originally conceived as an approximate solution to the above adaptive problem. It recursively updates the estimate of the weight vector along the direction of the instantaneous gradient of the sum of squared errors [1]. The introduction of the LMS adaptive filter in 1960 was a significant development for a broad range of engineering applications, since the LMS adaptive linear-estimation procedure requires essentially no advance knowledge of the signal statistics. The LMS, however, has long been thought to be only an approximate minimizing solution to the above squared-error criterion, and a rigorous minimization criterion has been missing.
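For concreteness, the LMS recursion sketched in the abstract updates the weight estimate as w ← w + μ h_i (d_i − h_i^T w), i.e., a step along the instantaneous gradient of |d_i − h_i^T w|^2. Below is a minimal Python sketch of this recursion; the step size μ (mu), the function name lms, and the synthetic data in the usage lines are illustrative assumptions, not part of the chapter.

import numpy as np

def lms(h, d, mu=0.05):
    # h: (N+1, n) array whose rows are the input column vectors h_i
    # d: (N+1,) array of desired scalar responses d_i
    # mu: step size, assumed small enough for stable convergence
    w = np.zeros(h.shape[1])
    for h_i, d_i in zip(h, d):
        e_i = d_i - h_i @ w      # instantaneous error d_i - h_i^T w
        w = w + mu * e_i * h_i   # instantaneous-gradient update
    return w

# Usage on synthetic data: the estimate should approach w_true.
rng = np.random.default_rng(0)
h = rng.standard_normal((1000, 4))
w_true = np.array([1.0, -0.5, 0.25, 2.0])
d = h @ w_true + 0.01 * rng.standard_normal(1000)
print(lms(h, d))

Note that the recursion uses only the current pair (h_i, d_i), which is why, as the abstract states, it requires essentially no advance knowledge of the signal statistics.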

Additional Information

© 1994 Kluwer Academic.

Additional details

Created: August 20, 2023
Modified: March 5, 2024