Published September 24, 2007
Book Section - Chapter (Open Access)

Normalized Entropy Vectors, Network Information Theory and Convex Optimization

Abstract

We introduce the notion of normalized entropic vectors -- slightly different from the standard definition in the literature in that we normalize entropy by the logarithm of the alphabet size. We argue that this definition is more natural for determining the capacity region of networks and, in particular, that it smooths out the irregularities of the space of non-normalized entropy vectors and renders the closure of the resulting space convex (and compact). Furthermore, the closure of the space remains convex even under constraints imposed by memoryless channels internal to the network. It therefore follows that, for a large class of acyclic memoryless networks, the capacity region for an arbitrary set of sources and destinations can be found by maximization of a linear function over the convex set of channel-constrained normalized entropic vectors and some linear constraints. While this may not necessarily make the problem simpler, it certainly circumvents the "infinite-letter characterization" issue, as well as the nonconvexity of earlier formulations, and exposes the core of the problem. We show that the approach allows one to obtain the classical cutset bounds via a duality argument. Furthermore, the approach readily shows that, for acyclic memoryless wired networks, one need only consider the space of unconstrained normalized entropic vectors, thus separating channel and network coding -- a result very recently recognized in the literature.
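The key object in the abstract is the normalized entropic vector: for each nonempty subset S of the random variables, the joint entropy H(X_S) divided by the logarithm of the alphabet size. The sketch below, a hypothetical illustration not taken from the paper, computes this vector from a joint probability mass function, assuming all variables share a common alphabet size N (the function name and interface are my own):

```python
import itertools
import numpy as np

def normalized_entropic_vector(joint_pmf, alphabet_size):
    """Normalized entropic vector of n discrete random variables.

    joint_pmf: n-dimensional array whose axes index the variables.
    For each nonempty subset S, the joint entropy H(X_S) is divided
    by log(alphabet_size), the normalization described in the abstract.
    """
    n = joint_pmf.ndim
    vec = {}
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            # Marginalize out the variables not in S.
            other_axes = tuple(i for i in range(n) if i not in S)
            p = joint_pmf.sum(axis=other_axes).ravel()
            p = p[p > 0]                      # 0 log 0 = 0 convention
            H = -(p * np.log(p)).sum()
            vec[S] = H / np.log(alphabet_size)
    return vec

# Two independent fair bits: each singleton entry is 1, the pair is 2.
pmf = np.full((2, 2), 0.25)
v = normalized_entropic_vector(pmf, alphabet_size=2)
```

With this normalization every entry is dimensionless, which is what makes the closure of the set of achievable vectors a natural (convex, compact) domain over which to maximize the linear objective mentioned in the abstract.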

Additional Information

© Copyright 2007 IEEE. Reprinted with permission. Posted online: 2007-09-24.

Files

HASitwitwn07.pdf (4.8 MB)
md5:2c1ce4b4a394a435ce12b32f5a4d88ac

Additional details

Created: August 22, 2023
Modified: March 5, 2024