Published 1992
Book Section - Chapter

Unsupervised Classifiers, Mutual Information and 'Phantom Targets'

Abstract

We derive criteria for training adaptive classifier networks to perform unsupervised data analysis. The first criterion turns a simple Gaussian classifier into a simple Gaussian mixture analyser. The second criterion, which is much more generally applicable, is based on mutual information. It simplifies to an intuitively reasonable difference between two entropy functions, one encouraging 'decisiveness,' the other 'fairness' to the alternative interpretations of the input. This 'firm but fair' criterion can be applied to any network that produces probability-type outputs, but it does not necessarily lead to useful behavior.
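The 'firm but fair' criterion amounts to the mutual information between the input and the class assignment, which over a batch of inputs reduces to the entropy of the mean output minus the mean entropy of the individual outputs: the first term rewards 'fairness' (all classes used across the data), the second rewards 'decisiveness' (a confident output for each input). The following is a minimal illustrative NumPy sketch, not taken from the paper; the function name and the batch-based estimate are assumptions.

import numpy as np

def firm_but_fair_objective(p):
    # p: (batch, classes) array of probability-type outputs; each row sums to 1.
    eps = 1e-12
    mean_p = p.mean(axis=0)                                        # average output over the batch
    entropy_of_mean = -np.sum(mean_p * np.log(mean_p + eps))       # 'fairness': classes used evenly overall
    mean_entropy = -np.mean(np.sum(p * np.log(p + eps), axis=1))   # low when each individual output is 'decisive'
    return entropy_of_mean - mean_entropy                          # quantity to maximise during training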

Additional Information

Copyright © Controller HMSO London 1992.

Attached Files

Published - 440-unsupervised-classifiers-mutual-information-and-phantom-targets.pdf


Additional details

Created: August 20, 2023
Modified: January 13, 2024