Published 1994
Book Section - Chapter (Open Access)

H∞ Optimality Criteria for LMS and Backpropagation

Abstract

We have recently shown that the widely known LMS algorithm is an H∞ optimal estimator. The H∞ criterion has been introduced, initially in the control theory literature, as a means to ensure robust performance in the face of model uncertainties and lack of statistical information on the exogenous signals. We extend here our analysis to the nonlinear setting often encountered in neural networks, and show that the backpropagation algorithm is locally H∞ optimal. This fact provides a theoretical justification of the widely observed excellent robustness properties of the LMS and backpropagation algorithms. We further discuss some implications of these results.
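For reference, the following is a minimal sketch of the standard LMS recursion, w_{k+1} = w_k + mu * x_k * (d_k - x_k^T w_k), which is the algorithm the abstract refers to. The variable names, step size, and toy data below are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def lms(X, d, mu=0.01):
    """Run the standard LMS recursion over a sequence of input/target pairs.

    X  : (T, n) array of input vectors x_k
    d  : (T,)   array of desired outputs d_k
    mu : step size (learning rate); illustrative value, not from the paper
    Returns the final weight estimate and the sequence of a priori errors.
    """
    T, n = X.shape
    w = np.zeros(n)               # initial weight estimate
    errors = np.empty(T)
    for k in range(T):
        e = d[k] - X[k] @ w       # a priori prediction error
        w = w + mu * X[k] * e     # LMS weight update
        errors[k] = e
    return w, errors

# Toy usage: identify a fixed linear relation d_k = x_k^T w_true + noise.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.standard_normal((500, 3))
d = X @ w_true + 0.01 * rng.standard_normal(500)
w_hat, errs = lms(X, d, mu=0.05)
print(w_hat)                      # should approach w_true
```

In the paper's H∞ reading, this recursion (for a suitably small step size) minimizes the worst-case ratio of prediction-error energy to disturbance energy; the sketch above only reproduces the update rule itself, not that analysis.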

Additional Information

© 1994 Morgan Kaufmann. This work was supported in part by the Air Force Office of Scientific Research, Air Force Systems Command, under Contract AFOSR91-0060, and in part by a grant from Rockwell International Inc.

Files

Published - Hoo_Optimality_Criteria_for_LMS_and_Backpropagation.pdf (1.7 MB)

Additional details

Created: August 20, 2023
Modified: March 5, 2024