Information theory celebrates its 50th birthday. Although it is a mature area of research by any standard, new challenges arise from new applications and new theoretical developments. (For example, there is a resurgence of interest in source coding for multimedia applications.) In his 1997 Shannon Lecture, Jacob Ziv presented compelling arguments for "backing off" to a certain degree from the (first-order) asymptotic analysis of information systems in order to predict the behavior of real systems, where we always face finite (and often small) lengths of sequences, files, codes, etc. One way to overcome these difficulties is to increase the accuracy of asymptotic analysis by replacing first-order analyses (e.g., a leading term of the average code length) with full asymptotic expansions and more refined analyses (e.g., large deviations, central limit laws). We propose to accomplish this goal by exploring problems of information theory with analytic methods, that is, methods in which complex analysis plays a pivotal role. Among others, we propose research on lossless Lempel-Ziv schemes, lossy extensions of Lempel-Ziv schemes (based on approximate pattern matching), context quantization (which aims at extending context-tree weighting to the lossy environment), prediction schemes based on pattern matching, and a hierarchy of redundancy rates. The analytic methods discussed here are: asymptotic analysis of functional-differential equations, poissonization and depoissonization, and complex asymptotics.
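To make the combinatorial object behind the lossless Lempel-Ziv analysis concrete, here is a minimal sketch of the LZ78 incremental parsing rule: the text is cut into phrases, each being the shortest prefix of the remaining input not yet seen as a phrase. (This is an illustrative sketch using a simple set-based dictionary, not a full compressor; the function name `lz78_phrases` is our own.)

```python
def lz78_phrases(text):
    """Parse `text` into its LZ78 phrase sequence.

    Each phrase is the shortest prefix of the remaining input
    that has not yet been entered into the phrase dictionary.
    """
    seen = set()      # dictionary of phrases produced so far
    phrases = []
    cur = ""
    for ch in text:
        cur += ch
        if cur not in seen:      # shortest new phrase found
            seen.add(cur)
            phrases.append(cur)
            cur = ""
    if cur:                      # trailing, possibly repeated, phrase
        phrases.append(cur)
    return phrases


# Example: the classic 7-phrase parse of "aaabbabaabaaabab"
print(lz78_phrases("aaabbabaabaaabab"))
# -> ['a', 'aa', 'b', 'ba', 'baa', 'baaa', 'bab']
```

The number of phrases produced for a random input of length n is exactly the kind of quantity whose average-case and limiting-distribution behavior the analytic methods above (poissonization, complex asymptotics) are designed to pin down beyond the leading term.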

Budget Start: 1998-07-01
Budget End: 2003-06-30
Fiscal Year: 1998
Total Cost: $292,194
Name: Purdue Research Foundation
City: West Lafayette
State: IN
Country: United States
Zip Code: 47907