A lower bound on the differential entropy for log-concave random variables with applications to rate-distortion theory
- Creators
- Marsiglietti, Arnaud
- Kostina, Victoria
Abstract
We derive a lower bound on the differential entropy of a symmetric log-concave random variable X in terms of the p-th absolute moment of X, which shows that the entropy and the p-th absolute moment of a symmetric log-concave random variable are comparable. We apply our bound to study the rate-distortion function under the distortion measure |x − x̂|^r for sources that follow a log-concave probability distribution. In particular, we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√2 · e) ≈ 1.9 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe)) ≈ 1.55 bits, regardless of d. Our results generalize to the case of a vector X. Our proof technique leverages tools from convex geometry.
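As a rough numerical illustration of the comparability claim in the abstract (this is not the paper's bound; the Laplace example and the use of the classical Gaussian maximum-entropy inequality as the upper direction are assumptions made here for illustration), one can compare the differential entropy of a symmetric log-concave density with the variance-based upper bound h(X) ≤ ½ log(2πe·Var(X)), which holds for any density:

```python
import math

def laplace_entropy_bits(b):
    # Differential entropy (in bits) of a Laplace(0, b) density: log2(2*b*e).
    return math.log2(2 * b * math.e)

def max_entropy_bits(var):
    # Gaussian maximum-entropy upper bound: h(X) <= 0.5*log2(2*pi*e*Var(X))
    # for any density with variance `var`.
    return 0.5 * math.log2(2 * math.pi * math.e * var)

b = 1.0
var = 2 * b * b  # Variance of Laplace(0, b) is 2*b^2.
h = laplace_entropy_bits(b)
ub = max_entropy_bits(var)
print(f"h = {h:.3f} bits, upper bound = {ub:.3f} bits, gap = {ub - h:.3f} bits")
```

For the Laplace density the gap is about 0.1 bits, so the entropy is sandwiched within a small additive constant of a function of the second moment; the paper's contribution is a matching moment-based *lower* bound on the entropy that holds for the whole symmetric log-concave class.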
Additional Information
© 2017 IEEE. [AM] Supported by the Walter S. Baer and Jeri Weiss CMI Postdoctoral Fellowship. [VK] Supported in part by the National Science Foundation (NSF) under Grant CCF-1566567.
Additional details
- Eprint ID
- 80529
- Resolver ID
- CaltechAUTHORS:20170816-162727008
- Funding
- Walter S. Baer and Jeri Weiss CMI Postdoctoral Fellowship
- NSF Grant CCF-1566567
- Created
- 2017-08-16 (from EPrint's datestamp field)
- Updated
- 2021-11-15 (from EPrint's last_modified field)