This research aims to develop an optimization-based approach to network information theory that goes well beyond current networking theory and practice. There is a great deal of recent interest in the problem of simultaneous information transmission among many users over wired and wireless networks. Information theory is well positioned to influence how such future networks are designed and maintained, both because wired networks are ripe for applications such as network coding (where information streams are actually combined rather than simply routed) and because wireless networks cannot be handled satisfactorily with conventional networking tools. The challenge is that even the simplest network information theory problems are notoriously difficult; as a result, information theory has so far provided few tools to network practitioners. This research aims to remedy the situation by developing tools for more effective network design.
While, in principle, the information-theoretic rates of wired networks can be obtained via convex optimization over the space of entropy vectors, this effort is severely hampered by the fact that an explicit characterization of the entropic space appears to be out of reach. To circumvent this, the research will consider frameworks that, while possibly suboptimal, apply to arbitrary networks, have reasonable complexity, and lend themselves to distributed implementation. The mathematical approach is four-fold, making use of the representation theory of matroids (to design linear network codes), Markov chain Monte Carlo methods (to design "good" network codes in a distributed fashion), group-theoretic techniques (to construct nonlinear network codes from non-Abelian groups), and determinantal inequalities (to study the entropic space).
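To make the idea of network coding concrete, the classic butterfly network illustrates why combining streams beats pure routing: a bottleneck edge carries the XOR (addition over GF(2)) of two source bits, and each sink recovers both bits at a rate unachievable by routing alone. The following sketch (not taken from the project itself; the function name and setup are illustrative) simulates one use of this simplest linear network code:

```python
# Illustrative sketch of a linear network code on the butterfly network.
# The bottleneck edge carries b1 XOR b2 (addition over GF(2)) instead of
# routing only one of the two bits; each sink then decodes both bits.

def butterfly_decode(b1: int, b2: int) -> tuple:
    """Simulate one channel use of the butterfly network for bits b1, b2."""
    coded = b1 ^ b2              # bottleneck edge: b1 + b2 over GF(2)
    sink1 = (b1, b1 ^ coded)     # sink 1 sees b1 directly, recovers b2
    sink2 = (b2 ^ coded, b2)     # sink 2 sees b2 directly, recovers b1
    return sink1, sink2

# Exhaustive check: both sinks recover (b1, b2) for every input pair.
for b1 in (0, 1):
    for b2 in (0, 1):
        s1, s2 = butterfly_decode(b1, b2)
        assert s1 == (b1, b2) and s2 == (b1, b2)
```

With routing alone, the bottleneck edge could serve only one sink per use; the single XOR operation doubles the simultaneously achievable multicast rate.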
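The "space of entropy vectors" mentioned above collects, for a set of random variables, the joint entropies of all subsets; every entropic vector must satisfy the Shannon-type inequalities, such as submodularity, though these do not characterize the space for four or more variables. A minimal sketch, assuming nothing about the project's actual formulation, computes such a vector for three binary variables and checks one submodularity inequality:

```python
# Illustrative sketch: the entropy vector of three binary random variables
# (X, Y independent fair bits, Z = X XOR Y) and a check of one Shannon-type
# inequality, H(XY) + H(YZ) >= H(XYZ) + H(Y), which all entropic vectors obey.
from itertools import product
from math import log2

def entropy(joint, subset):
    """Entropy in bits of the marginal on the variable indices in `subset`."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Joint distribution of (X, Y, Z) with Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

hXY = entropy(joint, (0, 1))    # 2 bits
hYZ = entropy(joint, (1, 2))    # 2 bits
hXYZ = entropy(joint, (0, 1, 2))  # 2 bits (Z is determined by X, Y)
hY = entropy(joint, (1,))       # 1 bit

# Submodularity holds: 2 + 2 >= 2 + 1.
assert hXY + hYZ >= hXYZ + hY - 1e-12
```

For n variables the entropy vector has 2^n - 1 coordinates, and optimizing rates over the closure of all such vectors is the convex program whose intractable feasible set motivates the suboptimal-but-practical frameworks pursued here.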