Using extremal entropy properties to characterize the fundamental performance limits of network communication is a long-standing tradition in information theory. Most historical successes, however, relied on one particular extremal entropy inequality: the entropy-power inequality of Shannon and Stam, which, though powerful, applies mainly to networks with a certain degradedness structure. Moreover, wireless features such as multiple-input multiple-output (MIMO) communications, channel uncertainty incurred by fading, and secrecy constraints arising from the broadcast nature of radio communication pose new challenges that cannot be overcome by the entropy-power inequality of Shannon and Stam alone. This situation calls for an in-depth investigation of the interplay between converse problems in network information theory and extremal entropy properties in statistics, drawing on powerful statistical tools to solve important communication engineering problems.
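For reference, the entropy-power inequality mentioned above can be stated in its standard form (included here only for context; the proposal itself does not fix a particular formulation): if $X$ and $Y$ are independent continuous random vectors in $\mathbb{R}^n$ with differential entropies $h(X)$ and $h(Y)$, then
\[
  e^{\frac{2}{n} h(X+Y)} \;\ge\; e^{\frac{2}{n} h(X)} + e^{\frac{2}{n} h(Y)},
\]
with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.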
The specific goals of this research are: 1) to examine systematic ways of establishing extremal entropy properties through links between information theory and statistics; 2) to establish channel enhancement as a general framework for solving converse problems for MIMO downlink communication; and 3) to identify general frameworks for solving converse problems for collaborative communication in cognitive wireless networks.
Recent years have seen substantial efforts in designing new coding schemes to achieve better performance in wireless networks. A fundamental understanding of the limits of these coding schemes is thus extremely important from an engineering viewpoint: it directs future research, prevents over-engineering, and builds confidence in simple, structured coding schemes. The intellectual results of this research will also be disseminated through course development on network information theory and wireless communications at Texas A&M and the University of Hawaii.