Published September 1995
Book Section - Chapter
An Inequality On Entropy
- Creators
- McEliece, Robert J.
- Yu, Zhong
Abstract
The entropy $H(X)$ of a discrete random variable $X$ with alphabet size $m$ is always non-negative and upper-bounded by $\log m$. In this paper, we present a theorem that gives a non-trivial lower bound for $H(X)$. We show that for any discrete random variable $X$ with range $R = \{x_0, \ldots, x_{m-1}\}$, if $p_i = \Pr\{X = x_i\}$ and $p_0 \geq p_1 \geq \cdots \geq p_{m-1}$, then
$$H(X) \;\geq\; \frac{2 \log m}{m-1} \sum_{i=0}^{m-1} i\, p_i,$$
with equality iff (i) $X$ is uniformly distributed, i.e., $p_i = 1/m$ for all $i$, or trivially (ii) $p_0 = 1$ and $p_i = 0$ for $1 \leq i \leq m-1$.
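As a quick numerical illustration of the stated bound (not part of the original record), the following sketch computes $H(X)$ and the lower bound $(2\log m)/(m-1)\sum_i i\,p_i$ for a few distributions sorted in non-increasing order; the function names `entropy` and `mceliece_yu_bound` are ours.

```python
import math

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum_i p_i log p_i, with 0 log 0 := 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def mceliece_yu_bound(p, base=2):
    """Lower bound (2 log m)/(m-1) * sum_i i*p_i, assuming p is sorted
    so that p_0 >= p_1 >= ... >= p_{m-1}."""
    m = len(p)
    return (2 * math.log(m, base) / (m - 1)) * sum(i * pi for i, pi in enumerate(p))

# Equality case (i): uniform distribution, H(X) = log m = bound.
p_uniform = [1 / 4] * 4
print(entropy(p_uniform), mceliece_yu_bound(p_uniform))   # 2.0  2.0

# Equality case (ii): p_0 = 1, H(X) = 0 = bound.
p_point = [1.0, 0.0, 0.0, 0.0]
print(entropy(p_point), mceliece_yu_bound(p_point))       # 0.0  0.0

# Strict inequality for a generic sorted distribution.
p = sorted([0.5, 0.3, 0.2], reverse=True)
print(entropy(p), mceliece_yu_bound(p))                    # ~1.485 >= ~1.109
```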
Additional Information
© 1995 IEEE. Date of Current Version: 06 August 2002. This work was supported by a grant from Pacific Bell.
Additional details
- Eprint ID
- 29436
- DOI
- 10.1109/ISIT.1995.550316
- Resolver ID
- CaltechAUTHORS:20120223-100956421
- Funders
- Pacific Bell
- Created
- 2012-02-23 (from EPrint's datestamp field)
- Updated
- 2021-11-09 (from EPrint's last_modified field)
- Other Numbering System Name
- INSPEC Accession Number
- Other Numbering System Identifier
- 5280076