Minimal Achievable Sufficient Statistic Learning
- Creators
- Cvitkovic, Milan
- Koliander, Günther
Abstract
We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically-dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
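As background for the abstract's claim about mutual information, the following is a standard argument (not taken from this record, and not the authors' CDI definition): for a deterministic map $f$ and a continuous input $X$, mutual information degenerates to infinity, so it cannot distinguish between different deterministic networks.

```latex
% Mutual information in terms of differential entropy, with Y = f(X):
\[
  I(X;Y) = h(Y) - h(Y \mid X).
\]
% Given X, the output Y = f(X) is a point mass, so its conditional
% differential entropy diverges: h(Y \mid X) = -\infty. Hence
\[
  I\bigl(X; f(X)\bigr) = \infty
\]
% whenever f(X) admits a density. Every such network "contains" infinite
% information about its input, which is the degeneracy that motivates
% replacing mutual information with a quantity like CDI.
```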
Additional Information
© 2019 by the author(s). We would like to thank Georg Pichler, Thomas Vidick, Alex Alemi, Alessandro Achille, and Joseph Marino for useful discussions.
Attached Files
Published - cvitkovic19a.pdf
Supplemental Material - cvitkovic19a-supp.pdf
Files
MD5 | Size
---|---
md5:7bacb781fdcdb75994c9dcfd66f71067 | 602.9 kB
md5:e92ce2d40adb09ac2a69d6d065d1fa77 | 361.9 kB
Additional details
- Eprint ID
- 101455
- Resolver ID
- CaltechAUTHORS:20200221-102413824
- Created
- 2020-02-21 (from EPrint's datestamp field)
- Updated
- 2020-02-21 (from EPrint's last_modified field)