This research studies network information theory from the viewpoint of entropic vectors and convex optimization. There is currently great interest in the problem of information transmission over wired and wireless networks. Information theory is well poised to have an impact on the manner in which future networks are designed and maintained, both because wired networks are ripe for the application of network coding and because wireless networks cannot be dealt with satisfactorily using conventional networking tools. The challenge is that most network information theory problems are notoriously difficult, so the mathematical barriers that must be overcome are often quite high.
The approach adopted in this research is to define the space of normalized entropic vectors, which differs slightly from the definition in the literature in that entropy is normalized by the logarithm of the alphabet size. This definition is more natural for determining the capacity region of networks and renders the closure of the resulting space convex (and compact), even under constraints imposed by channels internal to the network. For acyclic memoryless networks, the capacity region for an arbitrary set of sources and destinations can be found by maximizing a linear function over the set of channel-constrained normalized entropic vectors, subject to additional linear constraints. While not necessarily making the problem simpler, this approach circumvents the ``infinite-letter characterization'' as well as the nonconvexity of earlier formulations, and exposes the core of the problem as that of determining the space of normalized entropic vectors. Much of the research therefore focuses on constructing computable inner and outer bounds to this space using tools from group theory, lattice theory, non-Shannon-type information inequalities, and others.
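To make the formulation concrete, the following sketch records the definition and the resulting optimization; the symbols (in particular $\bar{\Omega}^*_n$ for the closure of the space, and $w$, $A$, $b$ for the objective and constraints) are chosen here for illustration and are not notation fixed by the text above. For $n$ discrete random variables $X_1,\dots,X_n$, each taking values in an alphabet of size $N$, the normalized entropic vector collects the joint entropies of all nonempty subsets, each divided by $\log N$:
\[
h_\alpha \;=\; \frac{H(X_\alpha)}{\log N},
\qquad X_\alpha = (X_i : i \in \alpha),
\qquad \emptyset \neq \alpha \subseteq \{1,\dots,n\},
\]
so that $h \in \mathbb{R}^{2^n-1}$. In these terms, the capacity-region computation described above takes the schematic form
\[
\max_{h} \; w^{\top} h
\quad \text{subject to} \quad
h \in \bar{\Omega}^*_n, \;\; A h \le b,
\]
where $w$ encodes the rate objective and $A h \le b$ collects the topology and channel constraints. The objective and constraints are linear, so the difficulty is concentrated entirely in the feasible set: $\bar{\Omega}^*_n$ is convex and compact but is not fully characterized for $n \ge 4$, which is what motivates the inner and outer bounds pursued in this research.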