Published 1995 | Submitted + Published
Book Section - Chapter | Open Access

H^∞ Optimal Training Algorithms and their Relation to Backpropagation

Abstract

We derive global H^∞ optimal training algorithms for neural networks. These algorithms guarantee the smallest possible prediction error energy over all disturbances of fixed energy, and are therefore robust with respect to model uncertainties and lack of statistical information on the exogenous signals. The ensuing estimators are infinite-dimensional, in the sense that updating the weight vector estimate requires knowledge of all previous weight estimates. A certain finite-dimensional approximation to these estimators is the backpropagation algorithm. This explains the local H^∞ optimality of backpropagation that has been previously demonstrated.
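To make the finite-dimensional recursion concrete: for a single linear neuron, backpropagation reduces to the instantaneous-gradient (LMS-type) weight update w_{i+1} = w_i + μ x_i (d_i − w_iᵀ x_i), which is the kind of recursive estimator the abstract relates to the H^∞ framework. The sketch below is illustrative only, not the paper's derivation; the step size μ and the toy data are assumptions.

```python
# Minimal illustrative sketch: the instantaneous-gradient (backprop/LMS)
# recursion for a single linear neuron. Not the paper's derivation;
# mu and the synthetic data are purely illustrative.
import numpy as np

def lms_train(X, d, mu=0.1):
    """Run the instantaneous-gradient weight-update recursion over the data."""
    w = np.zeros(X.shape[1])
    errors = []
    for x_i, d_i in zip(X, d):
        e_i = d_i - w @ x_i      # prediction error before the update
        w = w + mu * e_i * x_i   # gradient step on the instantaneous squared error
        errors.append(e_i)
    return w, errors

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
w_true = np.array([1.0, -2.0, 0.5])
d = X @ w_true                    # noiseless linear "teacher" signal
w_hat, errs = lms_train(X, d, mu=0.1)
```

Note that each update uses only the current weight estimate, which is exactly the finite-dimensional approximation the abstract contrasts with the globally H^∞ optimal estimators, whose updates depend on all previous weight estimates.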

Additional Information

© 1995 Massachusetts Institute of Technology. This work was supported in part by the Air Force Office of Scientific Research, Air Force Systems Command under Contract AFOSR91-0060 and by the Army Research Office under contract DAAL03-89-K-0109.

Attached Files

Published - 905-h-optimal-training-algorithms-and-their-relation-to-backpropagation.pdf

Submitted - Optimal_Training_Algorithms_and_their_Relation_to_Backpropagation.pdf

Additional details

Created: August 20, 2023
Modified: March 5, 2024