Objective Functions For Neural Network Classifier Design
- Creators
- Goodman, Rod
- Miller, John W.
- Smyth, Padhraic
Abstract
Backpropagation was originally derived in the context of minimizing a mean-squared error (MSE) objective function. More recently, there has been interest in objective functions that provide accurate class probability estimates. In this talk we derive necessary and sufficient conditions on the form an objective function must take in order to provide probability estimates. This leads to the definition of a general class of functions which includes MSE and cross entropy (CE) as two of the simplest cases. We establish the equivalence of these functions to Maximum Likelihood estimation and to the more general Minimum Description Length (MDL) principle. Empirical results are used to demonstrate the tradeoffs associated with the choice among objective functions which minimize to a probability.
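The property the abstract refers to can be illustrated with a small numerical sketch (not taken from the paper): for binary 0/1 labels, minimizing either MSE or cross entropy over a constant prediction `q` recovers the empirical class frequency, which is the sense in which these objectives "minimize to a probability". The data and grid below are illustrative assumptions.

```python
import numpy as np

def mse(q, y):
    # mean-squared error between a constant prediction q and 0/1 labels y
    return np.mean((y - q) ** 2)

def cross_entropy(q, y):
    # binary cross entropy; q must lie strictly in (0, 1)
    return -np.mean(y * np.log(q) + (1 - y) * np.log(1 - q))

# toy labels with empirical class frequency p = 0.3 (illustrative)
y = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], dtype=float)

# minimize each objective over a grid of constant predictions
grid = np.linspace(0.01, 0.99, 99)
q_mse = grid[np.argmin([mse(q, y) for q in grid])]
q_ce = grid[np.argmin([cross_entropy(q, y) for q in grid])]

# both minimizers coincide with the empirical probability 0.3
print(q_mse, q_ce)
```

Both objectives place their minimum at the class probability; they differ in the shape of the loss surface away from the minimum, which is one source of the tradeoffs the talk examines empirically.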
Additional Information
© 1991 IEEE. The research described in this talk was carried out in part by the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. In addition, this work was supported in part by the Air Force Office of Scientific Research under grant number AFOSR-90-0199.
Attached Files
Published - 00695143.pdf (142.6 kB, md5:7143adf44827bdca95c4fea8f5368ef2)
Additional details
- Eprint ID
- 78392
- Resolver ID
- CaltechAUTHORS:20170620-163501947
- Funders
- NASA/JPL/Caltech
- Air Force Office of Scientific Research (AFOSR), grant 90-0199
- Created
- 2017-06-21 (from EPrint's datestamp field)
- Updated
- 2021-11-15 (from EPrint's last_modified field)