Published 1992
Book Section - Chapter
Unsupervised Classifiers, Mutual Information and 'Phantom Targets'
Abstract
We derive criteria for training adaptive classifier networks to perform unsupervised data analysis. The first criterion turns a simple Gaussian classifier into a simple Gaussian mixture analyser. The second criterion, which is much more generally applicable, is based on mutual information. It simplifies to an intuitively reasonable difference between two entropy functions, one encouraging 'decisiveness,' the other 'fairness' to the alternative interpretations of the input. This 'firm but fair' criterion can be applied to any network that produces probability-type outputs, but it does not necessarily lead to useful behavior.
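The 'firm but fair' criterion described above — a difference of two entropy functions — can be sketched numerically. The following is a minimal illustration, not code from the chapter itself: it assumes the criterion takes the form H(mean output) − mean H(output), where the first term rewards 'fairness' (all classes used on average) and the second rewards 'decisiveness' (each individual output near one-hot).

```python
import numpy as np

def firm_but_fair(p, eps=1e-12):
    """Mutual-information-style objective for probability-type outputs.

    p: array of shape (N, K); each row is a network's output
    distribution over K classes for one input.
    Returns H(mean output) - mean H(output):
    the 'fairness' entropy minus the 'indecisiveness' entropy.
    (Function name and form are this sketch's assumption.)
    """
    p = np.asarray(p, dtype=float)
    p_bar = p.mean(axis=0)  # average class-usage distribution
    fairness = -np.sum(p_bar * np.log(p_bar + eps))               # H(p_bar)
    indecisiveness = -np.mean(np.sum(p * np.log(p + eps), axis=1))  # <H(p)>
    return fairness - indecisiveness

# Firm and fair: each input assigned decisively, both classes used equally.
good = np.array([[1.0, 0.0], [0.0, 1.0]])
# Indecisive: every output uniform, so the criterion collapses to zero.
bad = np.array([[0.5, 0.5], [0.5, 0.5]])
print(firm_but_fair(good))  # ~ln 2 (maximal for K=2)
print(firm_but_fair(bad))   # ~0.0
```

Maximising this quantity over training inputs is what turns a classifier into an unsupervised clusterer; the abstract's caveat applies — a network can score well on this criterion without the resulting partition being useful.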
Additional Information
Copyright © Controller HMSO London 1992.
Attached Files
Published - 440-unsupervised-classifiers-mutual-information-and-phantom-targets.pdf
Files
440-unsupervised-classifiers-mutual-information-and-phantom-targets.pdf (1.1 MB)
md5:6bee0cade883855756fcb9d99c8b3a03
Additional details
- Eprint ID
- 63784
- Resolver ID
- CaltechAUTHORS:20160119-164045651
- Created
- 2016-01-20 (from EPrint's datestamp field)
- Updated
- 2019-10-03 (from EPrint's last_modified field)
- Series Name
- Advances in Neural Information Processing
- Series Volume or Issue Number
- 4