Published 2010
Book Section - Chapter

Discriminative Clustering by Regularized Information Maximization

Abstract

Is there a principled way to learn a probabilistic discriminative classifier from an unlabeled data set? We present a framework that simultaneously clusters the data and trains a discriminative classifier, which we call Regularized Information Maximization (RIM). RIM optimizes an intuitive information-theoretic objective function that balances class separation, class balance, and classifier complexity. The approach can flexibly incorporate different likelihood functions, express prior assumptions about the relative sizes of the classes, and incorporate partial labels for semi-supervised learning. In particular, we instantiate the framework as unsupervised multi-class kernelized logistic regression. Our empirical evaluation indicates that RIM outperforms existing methods on several real data sets, and demonstrates that RIM is an effective model selection method.
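To make the objective concrete, here is a minimal sketch, not the paper's kernelized implementation: a linear softmax classifier whose parameters are chosen to maximize an empirical mutual-information estimate, H(p̄) − mean H(p(y|x)), minus an ℓ2 complexity penalty λ‖W‖². The marginal-entropy term encourages class balance, the conditional-entropy term encourages confident (well-separated) assignments, and the penalty controls complexity. The toy data, the number of classes, and the naive numerical gradient ascent loop are all illustrative assumptions.

```python
import numpy as np

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # numerically stable softmax
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def rim_objective(params, X, lam, k=2):
    """RIM-style objective (to maximize) for a linear softmax classifier.

    Estimated mutual information H(p_bar) - mean H(p(y|x)), minus lam*||W||^2.
    `params` packs W (d x k) followed by b (k); all names are illustrative.
    """
    d = X.shape[1]
    W = params[: d * k].reshape(d, k)
    b = params[d * k :]
    P = softmax(X @ W + b)                                     # n x k posteriors p(y|x)
    p_bar = P.mean(axis=0)                                     # empirical class marginal
    h_marg = -np.sum(p_bar * np.log(p_bar + 1e-12))            # class balance term
    h_cond = -np.mean(np.sum(P * np.log(P + 1e-12), axis=1))   # class separation term
    return h_marg - h_cond - lam * np.sum(W ** 2)              # I_hat - complexity penalty

def numerical_grad(f, x, eps=1e-5):
    # central-difference gradient; fine for this tiny parameter vector
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

# toy unlabeled data: two well-separated Gaussian blobs in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
               rng.normal(+2.0, 0.5, (20, 2))])

lam = 0.05
params = 0.01 * rng.standard_normal(2 * 2 + 2)  # W (2x2) and b (2), near-uniform start
f = lambda p: rim_objective(p, X, lam)
before = f(params)
for _ in range(300):
    params = params + 1.0 * numerical_grad(f, params)  # plain gradient ascent
after = f(params)
labels = np.argmax(softmax(X @ params[:4].reshape(2, 2) + params[4:]), axis=1)
```

After ascent the objective has increased and `labels` assigns the two blobs to two distinct, internally consistent classes: the marginal-entropy term rules out the degenerate single-cluster solution, while the conditional-entropy term pushes the decision boundary away from the data.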

Additional Information

©2010 Neural Information Processing Systems. We thank Alex Smola for helpful comments and discussion, and Thanos Siapas for providing the neural tetrode data. This research was partially supported by NSF grant IIS-0953413, a gift from Microsoft Corporation, and ONR MURI Grant N00014-06-1-0734.

Attached Files

Published - 4154-discriminative-clustering-by-regularized-information-maximization.pdf (982.2 kB)

Additional details

Created: August 19, 2023
Modified: January 13, 2024