Published June 12, 2009
Book Section - Chapter | Open Access

On the Entropy Region of Discrete and Continuous Random Variables and Network Information Theory

Abstract

We show that a large class of network information theory problems can be cast as convex optimization over the convex space of entropy vectors. A vector in (2^n - 1)-dimensional space is called entropic if each of its entries can be regarded as the joint entropy of a particular subset of n random variables (note that any set of size n has 2^n - 1 nonempty subsets). While an explicit characterization of the space of entropy vectors is well known for n = 2, 3 random variables, it is unknown for n > 3 (which is why most network information theory problems remain open). We construct inner bounds to the space of entropic vectors using tools such as quasi-uniform distributions, lattices, and Cayley's hyperdeterminant.
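To make the definition of an entropy vector concrete, here is a minimal illustrative sketch (not code from the paper): given the joint distribution of n discrete random variables, it computes the joint entropy H(X_S) for every one of the 2^n - 1 nonempty subsets S, yielding the entropic vector that the abstract describes. The function name `entropy_vector` and the dict-based representation of the joint pmf are assumptions made for illustration.

```python
# Illustrative sketch: the entropy vector of n discrete random variables.
# Each of the 2^n - 1 nonempty subsets S of {X_1, ..., X_n} contributes
# one entry, the joint entropy H(X_S) in bits.
from itertools import combinations
from math import log2

def entropy_vector(joint, n):
    """joint: dict mapping n-tuples of outcomes to probabilities.
    Returns a dict {subset: H(subset)} over all 2^n - 1 nonempty subsets."""
    vec = {}
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            # Marginalize the joint pmf onto the coordinates in S.
            marginal = {}
            for outcome, p in joint.items():
                key = tuple(outcome[i] for i in S)
                marginal[key] = marginal.get(key, 0.0) + p
            # Shannon entropy of the marginal, in bits.
            vec[S] = -sum(p * log2(p) for p in marginal.values() if p > 0)
    return vec

# Example: two fair bits with X_2 = X_1 (fully correlated), so
# H(X_1) = H(X_2) = H(X_1, X_2) = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(entropy_vector(joint, 2))  # {(0,): 1.0, (1,): 1.0, (0, 1): 1.0}
```

A vector in this (2^n - 1)-dimensional space is entropic precisely when some joint distribution realizes all of its entries simultaneously in this way; characterizing which vectors are realizable is the open problem for n > 3 that the abstract refers to.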

Additional Information

© 2008 IEEE. Issue Date: 26-29 Oct. 2008; Date of Current Version: 12 June 2009. This work was supported in part by the National Science Foundation through grant CCF-0729203, by the David and Lucile Packard Foundation, by the Office of Naval Research through a MURI under contract no. N00014-08-1-0747, and by Caltech's Lee Center for Advanced Networking.

Attached Files

Published - Shadbakht2008p80662008_42Nd_Asilomar_Conference_On_Signals_Systems_And_Computers_Vols_1-4.pdf

Additional details

Created: August 20, 2023
Modified: March 5, 2024