Published November 1991 | Journal Article | Open Access

How much information can one bit of memory retain about a Bernoulli sequence?

Abstract

The maximin problem of maximizing the minimum amount of information that a single bit of memory retains about the entire past is investigated. The problem is to estimate the supremum, over all possible sequences of update rules, of the minimum information that the bit of memory at epoch n+1 retains about the previous n inputs. Using only elementary techniques, it is shown that the maximin covariance between the memory at epoch n+1 and the past inputs is Θ(1/n), that the maximum average covariance is Θ(1/n), and that the maximin mutual information is Ω(1/n^2). In connection with related issues, the authors also provide an exact count of the number of Boolean functions of n variables that can be obtained recursively from Boolean functions of two variables, discuss extensions and applications of the original problem, and indicate links with issues in neural computation.
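The setup in the abstract is easy to simulate: the inputs X_1, ..., X_n are i.i.d. Bernoulli(p) bits, and a one-bit memory evolves as M_{k+1} = f_k(M_k, X_k), where each f_k is a Boolean function of two variables. The following minimal sketch (the function names, the initial memory value of 0, and the two illustrative update rules are our own assumptions, not the paper's construction) estimates the covariance between the final bit and each past input by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

def final_bit_covariances(update, n=32, trials=200_000, p=0.5):
    """Monte Carlo estimate of Cov(M_{n+1}, X_k) for k = 1, ..., n.

    X_1, ..., X_n are i.i.d. Bernoulli(p); the one-bit memory starts
    at 0 (an assumption) and evolves as M_{k+1} = update(M_k, X_k, k),
    where update is a Boolean function of the current bit and the
    current input (the epoch k is passed so the rule may vary over time).
    """
    X = rng.random((trials, n)) < p      # Bernoulli(p) inputs, one row per trial
    M = np.zeros(trials, dtype=bool)     # the single bit of memory, per trial
    for k in range(n):
        M = update(M, X[:, k], k)
    M = M.astype(float)
    return np.array([(M * X[:, k]).mean() - M.mean() * X[:, k].mean()
                     for k in range(n)])

# Two illustrative (non-optimal) time-invariant rules: parity forgets each
# individual input completely (covariance 0 with every X_k), while OR has an
# analytically uniform but exponentially small covariance, 2^{-(n+1)}, which
# sits below the Monte Carlo noise floor at these settings.
cov_parity = final_bit_covariances(lambda m, x, k: m ^ x)
cov_or = final_bit_covariances(lambda m, x, k: m | x)
print("parity:", cov_parity.min(), cov_parity.max())
print("OR:    ", cov_or.min(), cov_or.max())
```

Neither toy rule approaches the Θ(1/n) maximin covariance established in the paper. A second brute-force sketch (again illustrative, under one natural reading of the recursion) enumerates the Boolean functions of n variables that a one-bit memory can compute recursively, the family whose exact count the paper provides; already for n = 3 not all functions are reachable.

```python
from itertools import product

def recursive_functions(n):
    """Truth tables of all Boolean functions of n variables reachable as
    f(x) = f_n(... f_2(f_1(m0, x_1), x_2) ..., x_n), where each f_k is any
    of the 16 Boolean functions of two variables and m0 = 0 (assumed)."""
    assignments = list(product([0, 1], repeat=n))
    two_var = [lambda m, x, t=t: t[2 * m + x]       # all 16 rules, as lookup tables
               for t in product([0, 1], repeat=4)]
    reachable = {tuple(0 for _ in assignments)}     # memory before any input arrives
    for k in range(n):
        reachable = {tuple(f(s[i], xs[k]) for i, xs in enumerate(assignments))
                     for s in reachable for f in two_var}
    return reachable

print(len(recursive_functions(2)))  # 16: every function of two variables is reachable
print(len(recursive_functions(3)))  # strictly fewer than the 256 functions of three variables
```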

Additional Information

© 1991 IEEE. Manuscript received June 8, 1990; revised March 19, 1991. This work was supported by the Air Force Office of Scientific Research under Grant AFOSR-89-0523, and was presented in part at the IEEE International Symposium on Information Theory, San Diego, CA, January 14-19, 1990. The authors are indebted to J. Komlós for bringing the problem analyzed in this paper to their attention and for working through an earlier version of the paper with helpful comments. The authors are also much obliged to A. Barron, who showed them the simple convex combination argument by which Theorem 3 follows directly from the proof of Theorem 1.

Attached Files

Published - 00104320.pdf (753.5 kB)


Additional details

Created: August 20, 2023
Modified: October 20, 2023