Journal Article (Open Access) | Published June 2019
Available files: Published version and Supplemental Material

Minimal Achievable Sufficient Statistic Learning

Abstract

We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective whose minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
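For readers unfamiliar with the terminology, the standard information-theoretic definitions underlying the abstract (these are textbook definitions, not the paper's specific formulation) can be stated as follows. For random variables X (input) and Y (label), a statistic T = f(X) is sufficient for Y if it retains all the label-relevant information in X, and minimal if it retains no more than that:

```latex
% Sufficiency: f(X) preserves all information X carries about Y
I(f(X); Y) = I(X; Y)

% Minimality: among all sufficient statistics, f(X) is the one
% retaining the least information about X itself
f^* \in \arg\min_{f \,:\, I(f(X);Y) = I(X;Y)} I(f(X); X)
```

Note that for a deterministic, continuous map f, the quantity I(f(X); X) is typically infinite, which is the degeneracy the paper's Conserved Differential Information (CDI) is introduced to address.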

Additional Information

© 2019 by the author(s). We would like to thank Georg Pichler, Thomas Vidick, Alex Alemi, Alessandro Achille, and Joseph Marino for useful discussions.

Attached Files

Published - cvitkovic19a.pdf

Supplemental Material - cvitkovic19a-supp.pdf


Additional details

Record created: August 19, 2023. Modified: October 19, 2023.