The interplay between information theory (IT) and computer science (CS) dates back to the founding father of information theory, Claude E. Shannon. Ever since Shannon's pioneering work in both fields, research at the interface of IT and CS has continued and expanded in many exciting ways. In 2003 the first NSF-sponsored Workshop on the IT and CS Interface was held in Chicago, and in 2004 a graduate course on analytic methods in information theory and analysis of algorithms was organized at MSRI, Berkeley. We build on this momentum and propose to work on problems of information theory, combinatorics, and analysis of algorithms. Following Knuth's and Hadamard's precept, we study such problems using techniques of complex analysis. This program, which applies complex-analytic tools to information theory, constitutes ``analytic information theory''.
This research focuses on several facets of source coding and related problems: the redundancy rate problem, the method of types, entropy evaluation, channel capacity, and joint source-channel coding. The redundancy rate problem for a class of sources asks how far the actual code length exceeds the optimal (ideal) code length, while the method of types is a powerful technique in information theory, large deviations, and analysis of algorithms. We argue that types can be counted efficiently by enumerating Eulerian paths (for Markov types) or binary trees with a given path length (for universal types), as sketched below. Likewise, the analysis of the redundancy rate problem for memoryless and Markov sources leads to interesting generating functions, such as the tree generating functions arising in counting rooted labeled trees, which are studied extensively in computer science.
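To fix notation for the redundancy rate problem (the formalization is standard; the symbols below are ours, not fixed by the text above): given a source distribution $P$ from a class $\mathcal{S}$ and a prefix code $C_n$ assigning length $L(C_n,x_1^n)$ to the sequence $x_1^n$, the pointwise redundancy and the minimax worst-case redundancy are
\[
R_n(C_n,P;x_1^n) = L(C_n,x_1^n) + \log P(x_1^n),
\qquad
R_n^*(\mathcal{S}) = \min_{C_n}\, \sup_{P\in\mathcal{S}}\, \max_{x_1^n} \bigl[ L(C_n,x_1^n) + \log P(x_1^n) \bigr],
\]
where $\log$ is to base $2$. Since the ideal (self-information) code length is $-\log P(x_1^n)$, the redundancy is precisely the excess described above.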
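As a small illustration of the Eulerian-path view of Markov types, the sketch below (our own illustrative code, restricted to a binary alphabet for brevity, not code from the proposal) enumerates transition-count matrices, keeps those whose multigraph admits an Eulerian path, and checks the answer against brute-force enumeration of all binary sequences:
\begin{verbatim}
from itertools import product

def markov_type(seq):
    """Transition-count matrix k[i][j] = #{t : x_t = i, x_{t+1} = j}."""
    k = [[0, 0], [0, 0]]
    for a, b in zip(seq, seq[1:]):
        k[a][b] += 1
    return tuple(tuple(row) for row in k)

def types_by_enumeration(n):
    """All Markov types of binary sequences of length n (brute force)."""
    return {markov_type(s) for s in product((0, 1), repeat=n)}

def eulerian_candidates(n):
    """Count matrices summing to n-1 whose multigraph (k[i][j] edges
    from i to j) admits an Eulerian path: connected on vertices of
    positive degree, and out-degree minus in-degree is 0 everywhere,
    or +1 at one vertex and -1 at another."""
    found = set()
    m = n - 1  # number of transitions in a sequence of length n
    for k00 in range(m + 1):
        for k01 in range(m + 1 - k00):
            for k10 in range(m + 1 - k00 - k01):
                k11 = m - k00 - k01 - k10
                # flow condition at vertex 0: out - in = k01 - k10
                # (self-loops k00, k11 do not affect the balance)
                if abs(k01 - k10) > 1:
                    continue
                # connectivity: with no cross edges, all edges must be
                # self-loops at a single vertex
                if k01 == k10 == 0 and min(k00, k11) > 0:
                    continue
                found.add(((k00, k01), (k10, k11)))
    return found

for n in range(2, 12):
    assert types_by_enumeration(n) == eulerian_candidates(n)
print("Eulerian-path characterization matches brute force for n = 2..11")
\end{verbatim}
The point of the characterization is visible here: counting type classes reduces from scanning all $2^n$ sequences to scanning polynomially many candidate count matrices.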
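The prototypical tree generating function alluded to above is the classical tree function (standard background, recalled for concreteness):
\[
T(z) = \sum_{n\ge 1} n^{n-1}\,\frac{z^n}{n!}, \qquad T(z) = z\, e^{T(z)},
\]
whose $n$th coefficient records Cayley's count of $n^{n-1}$ rooted labeled trees on $n$ nodes; singularity analysis of such implicitly defined functions is exactly the kind of complex-analytic technique this program brings to bear on redundancy.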