Published June 2017 | public
Book Section - Chapter

A lower bound on the differential entropy for log-concave random variables with applications to rate-distortion theory

Abstract

We derive a lower bound on the differential entropy of a symmetric log-concave random variable X in terms of the p-th absolute moment of X, which shows that the entropy and the p-th absolute moment of a symmetric log-concave random variable are comparable. We apply our bound to study the rate-distortion function under the distortion measure |x − x̂|^r for sources that follow a log-concave probability distribution. In particular, we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√2 e) ≈ 1.9 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe)) ≈ 1.55 bits, regardless of d. Our results generalize to the case of a vector X. Our proof technique leverages tools from convex geometry.
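The abstract's constants can be checked numerically. The sketch below (not from the paper itself) evaluates the two quoted gaps in bits and, as an illustration, the Shannon lower bound R(d) ≥ h(X) − ½ log(2πe d) for a unit-variance Laplace source, a standard example of a symmetric log-concave distribution; the choice of Laplace source and target distortion d = 0.1 are assumptions for the example.

```python
import math

def log2(x: float) -> float:
    return math.log(x, 2)

# Constants quoted in the abstract, in bits.
gap_general = log2(math.sqrt(2) * math.e)        # log(√2 e) ≈ 1.94, any r and d
gap_mse = log2(math.sqrt(math.pi * math.e))      # log(√(πe)) ≈ 1.55, MSE

# Unit-variance Laplace(b): variance = 2*b**2 = 1, so b = 1/√2.
b = 1 / math.sqrt(2)
h_laplace = log2(2 * b * math.e)                 # differential entropy, bits

def shannon_lower_bound(d: float) -> float:
    """Shannon lower bound for MSE distortion: h(X) - 0.5*log2(2*pi*e*d)."""
    return h_laplace - 0.5 * log2(2 * math.pi * math.e * d)

# Per the abstract, for a log-concave source under MSE distortion the true
# rate-distortion function R(d) satisfies
#   shannon_lower_bound(d) <= R(d) <= shannon_lower_bound(d) + gap_mse.
d = 0.1
print(f"gap (general r): {gap_general:.2f} bits")   # 1.94
print(f"gap (MSE):       {gap_mse:.2f} bits")       # 1.55
print(f"SLB at d={d}:    {shannon_lower_bound(d):.2f} bits")
```

The point of the example: whatever the source, as long as it is symmetric and log-concave, the easily computed Shannon lower bound pins down R(d) to within about 1.55 bits for MSE distortion.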

Additional Information

© 2017 IEEE. [AM] Supported by the Walter S. Baer and Jeri Weiss CMI Postdoctoral Fellowship. [VK] Supported in part by the National Science Foundation (NSF) under Grant CCF-1566567.

Additional details

Created: August 19, 2023
Modified: October 17, 2023