Published September 1995
Book Section - Chapter

An Inequality On Entropy

Abstract

The entropy H(X) of a discrete random variable X of alphabet size m is always non-negative and upper-bounded by log m. In this paper, we present a theorem which gives a non-trivial lower bound for H(X). We show that for any discrete random variable X with range R = {x_0, …, x_(m-1)}, if p_i = Pr{X = x_i} and p_0 ⩾ p_1 ⩾ … ⩾ p_(m-1), then H(X) ⩾ (2 log m)/(m-1) Σ_(i=0)^(m-1) i p_i, with equality iff (i) X is uniformly distributed, i.e., p_i = 1/m for all i, or trivially (ii) p_0 = 1 and p_i = 0 for 1 ⩽ i ⩽ m-1.
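
As a quick illustration (not part of the original paper), the following Python sketch checks the stated bound numerically for one hypothetical distribution and for the two equality cases from the abstract. The natural logarithm is assumed throughout; the function names and the example distribution are illustrative choices, not taken from the paper.

```python
import math

def entropy(p):
    """Shannon entropy in nats (natural log, matching the log used in the bound)."""
    return -sum(q * math.log(q) for q in p if q > 0)

def lower_bound(p):
    """Right-hand side (2 log m)/(m-1) * sum_i i*p_i, for p sorted in decreasing order."""
    m = len(p)
    return (2 * math.log(m) / (m - 1)) * sum(i * q for i, q in enumerate(p))

# Hypothetical distribution with p_0 >= p_1 >= ... >= p_(m-1)
p = [0.5, 0.3, 0.15, 0.05]
print(entropy(p), ">=", lower_bound(p))          # ~1.142 >= ~0.693

# Equality cases stated in the abstract:
print(entropy([0.25] * 4), lower_bound([0.25] * 4))          # uniform: both equal log 4
print(entropy([1.0, 0, 0, 0]), lower_bound([1.0, 0, 0, 0]))  # degenerate: both 0
```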

Additional Information

© 1995 IEEE. Date of Current Version: 06 August 2002. This work was supported by a grant from Pacific Bell.

Additional details

Created: August 20, 2023
Modified: October 24, 2023