Most coding schemes developed in network information theory are built from a handful of basic components using Shannon's random coding technique. With the goal of advancing our understanding of these random coding schemes and making them applicable in practice, this research explores three important problems in network information theory. First, this project investigates a simultaneous decoding rule that is easy to analyze, yet powerful enough to achieve the maximum rates achievable by random coding over interference channels. Second, this project applies the insights thus gained on random coding and simultaneous decoding to the index coding problem, in which multiple messages are communicated over a single, noise-free broadcast link to multiple receivers, each with a different piece of side information. Third, this project develops a concatenated coding architecture, based on random coding, product codes, and iterative decoding, that can provide a systematic method for translating random coding schemes into practical, implementable coding techniques for real-world networks.
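To make the index coding setting concrete, consider a standard two-receiver toy instance (an illustrative example chosen here, not a result of this project): receiver 1 wants message x1 and already knows x2, while receiver 2 wants x2 and already knows x1. Broadcasting the single XOR x1 ⊕ x2 over the noise-free link lets both receivers decode, using one transmission instead of two. The short Python sketch below (with hypothetical variable names) verifies this.

```python
# Toy index coding instance with two receivers (illustrative example):
# receiver 1 wants x1 and knows x2; receiver 2 wants x2 and knows x1.
# One broadcast of x1 XOR x2 satisfies both demands.

x1, x2 = 0b1011, 0b0110          # two 4-bit messages held by the sender

broadcast = x1 ^ x2              # the single symbol sent over the noise-free link

# Each receiver recovers its desired message from the broadcast and its side information.
decoded_at_rx1 = broadcast ^ x2  # receiver 1 cancels x2
decoded_at_rx2 = broadcast ^ x1  # receiver 2 cancels x1

assert decoded_at_rx1 == x1 and decoded_at_rx2 == x2
print("Both receivers decoded correctly from a single transmission.")
```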
Playing an ever-increasing role in our networked society, network information theory studies the fundamental limits on information flow over networks and the optimal coding techniques, protocols, and architectures that achieve these limits. This research investigates canonical problems in network information theory that involve interference and broadcast, offering fresh insights and new mathematical tools for optimal information flow in several important applications, such as network coding, wireless communication, peer-to-peer networking, and content broadcasting. The concatenated coding architecture developed in this research has the potential to provide a new framework for transforming theoretical concepts in network information theory into practical algorithms for these applications.