This project concerns the class of Low Density Parity Check (LDPC) codes. LDPC codes come equipped with an iterative message-passing algorithm which operates on a certain bipartite graph associated to the code. The algorithm acts locally on the graph, and this locality is the source of both its greatest strength (low complexity) and its greatest weakness (suboptimality). The PI's focus is on understanding this weakness. Because the algorithm acts locally, it cannot distinguish whether it is acting on the graph itself or on some finite unramified cover of the graph. This leads to the notion of "pseudo-codewords", which arise from codewords of the codes corresponding to these covers and which compromise the decoder. Thus, to understand the performance of LDPC codes, we must understand these pseudo-codewords; most of the problems the PI considers stem from the desire to understand pseudo-codewords of LDPC codes. In previous joint work, the PI has given two characterizations of the pseudo-codewords of an LDPC code: via the so-called "fundamental cone" and via the edge zeta function of a certain graph attached to the code. While the fundamental cone characterization is valid for all LDPC codes, the zeta function characterization is satisfactory only in the special case of cycle codes. The PI will continue her study of the fundamental cone and develop a zeta function characterization of pseudo-codewords for general LDPC codes. Additional targets of study are non-binary LDPC codes and their pseudo-codewords, as well as the connections between LDPC codes and another class of graph-based codes: turbo codes.
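
For illustration (this sketch is not part of the award text), the fundamental cone of a parity-check matrix H is commonly described by linear inequalities: a vector with nonnegative entries lies in the cone if, for every check and every position involved in that check, the entry at that position is at most the sum of the entries at the other positions of that check. The short Python sketch below, with a hypothetical helper named in_fundamental_cone, checks these inequalities for a toy parity-check matrix; it is only a sketch of that inequality description, not the methods of the project.

import numpy as np

def in_fundamental_cone(H, w, tol=1e-12):
    """Check the fundamental-cone inequalities for parity-check matrix H and vector w.

    Illustrative sketch, assuming the standard inequality description of the cone.
    """
    H = np.asarray(H)
    w = np.asarray(w, dtype=float)
    if np.any(w < -tol):                    # every coordinate must be nonnegative
        return False
    for row in H:
        support = np.flatnonzero(row)       # positions involved in this check
        total = w[support].sum()
        for i in support:                   # each position vs. the rest of the check
            if w[i] > total - w[i] + tol:
                return False
    return True

# Toy example: a parity-check matrix for the length-3 repetition code.
H = [[1, 1, 0],
     [0, 1, 1]]
print(in_fundamental_cone(H, [1, 1, 1]))   # True: codewords always lie in the cone
print(in_fundamental_cone(H, [2, 1, 1]))   # False: w_1 = 2 exceeds w_2 = 1 at the first check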

Whenever information is transmitted across a channel, errors are bound to occur. By adding redundancy to the data, many of these errors can be corrected. If the information is thought of as strings of 0's and 1's of fixed length, then the codewords are strings of 0's and 1's of some longer length, where the difference in lengths represents the amount of redundancy that was added. A collection of codewords is called a code. A large part of classical coding theory is concerned with finding the trade-offs between three fundamental parameters of a code: its length, its number of codewords, and its minimum Hamming distance, i.e., the minimum number of positions in which any two distinct codewords differ. While any code can correct all errors of weight at most roughly half its minimum distance, most codes can correct many error patterns of substantially higher weight. It is the goal of modern coding theory to find those representations of codes that admit decoding algorithms allowing correction of all the error patterns that the code can correct, not only those of weight at most roughly half the minimum distance. One of the greatest achievements of modern coding theory so far is the discovery and subsequent development of the class of Low Density Parity Check (LDPC) codes. The usefulness of these codes stems from the fact that they come equipped with a very efficient decoding algorithm which operates on a certain bipartite graph associated to the code. The main goal of this project is to further the understanding of the theoretical performance of this decoding algorithm, especially through the study of the so-called "pseudo-codewords" which arise from codes associated to finite covers of the bipartite graph.
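
As a concrete illustration of these parameters (a sketch added here, not part of the award text), the binary [7,4] Hamming code has length 7, 16 codewords, and minimum distance 3, so it is guaranteed to correct any single error. The Python sketch below, using one standard choice of generator matrix for that code (an assumption for illustration), computes the minimum distance by brute force.

from itertools import product
import numpy as np

# One standard generator matrix of the binary [7,4] Hamming code (assumed here).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# Enumerate all 2^4 = 16 codewords and take the minimum pairwise Hamming distance.
codewords = [tuple(np.dot(m, G) % 2) for m in product([0, 1], repeat=4)]
dmin = min(sum(a != b for a, b in zip(u, v))
           for i, u in enumerate(codewords) for v in codewords[i + 1:])
print(len(codewords), dmin)   # 16 codewords, minimum distance 3 -> corrects any 1 error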

Agency: National Science Foundation (NSF)
Institute: Division of Mathematical Sciences (DMS)
Type: Standard Grant (Standard)
Application #: 0602332
Program Officer: Tomek Bartoszynski
Budget Start: 2006-07-01
Budget End: 2010-06-30
Fiscal Year: 2006
Total Cost: $147,651
Name: University of Nebraska-Lincoln
City: Lincoln
State: NE
Country: United States
Zip Code: 68588