This research addresses the theory and design of algorithms by which multiple network terminals can efficiently compute, each locally, shared functions of all their observed correlated data. Efficient communication among the terminals facilitates efficient computation. Applications include computing the average, variance, maximum, minimum, and parity of observed data in a colocated network of wireless sensors that make correlated measurements. This objective is closely connected to the design of algorithms for the efficient compression of data for storage and transmission, as well as of algorithms for assuring data security. A main goal of the project is to characterize these connections explicitly, thereby leading to new and efficient algorithms for data compression, function computation, and network security.
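As a point of reference for the functions named above, here is a minimal sketch (not part of the project) that computes them centrally over a list of sensor readings; the research itself concerns computing such functions with efficient communication, without gathering all raw data at one node. The function name and sample readings are illustrative assumptions.

```python
# Illustrative sketch only: the shared functions named in the abstract,
# computed centrally over hypothetical integer sensor readings.
from statistics import mean, pvariance


def shared_functions(readings):
    """Return (average, variance, maximum, minimum, parity) of the readings."""
    parity = sum(readings) % 2  # parity of the sum of the observations
    return mean(readings), pvariance(readings), max(readings), min(readings), parity


avg, var, hi, lo, par = shared_functions([3, 7, 7, 5])
# avg = 5.5, var = 2.75, hi = 7, lo = 3, par = 0
```

The point of the research is that, when the readings are correlated, these values can be obtained at every terminal with far fewer transmitted bits than naively forwarding all the raw data.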

The technical approach formulates and analyzes the underlying problems in an information-theoretic framework. This will enable the development of a principle of "entropy decomposition of total shared randomness" in a network model, to address difficult problems in multiuser information theory, of which rate-efficient function computation is a leading example. In particular, the application of source-coding algorithms to distributed function computation will be studied. Specific groups of open problems chosen for investigation address a general class of multiterminal models for function computation and data compression. This choice is motivated by, and is of compelling interest to, the theory and engineering practice of network function computation and source coding, as well as network security.
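To illustrate the flavor of entropy decomposition, the sketch below verifies the textbook chain rule H(X,Y) = H(X) + H(Y|X) for a made-up joint distribution of two correlated binary sources; this is standard information theory, not the project's new principle, and the joint pmf is an assumption chosen for the example.

```python
# Minimal illustration of decomposing total joint entropy (textbook chain
# rule, not the project's principle). Hypothetical joint pmf of two
# correlated binary sources X and Y.
from math import log2

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}


def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)


# Marginal pmf of X.
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Conditional entropy H(Y|X) = -sum p(x,y) log2 p(y|x), computed directly.
h_y_given_x = -sum(p * log2(p / px[x]) for (x, _), p in joint.items() if p > 0)

# Chain rule: the total randomness H(X,Y) splits into H(X) + H(Y|X).
assert abs(entropy(joint) - (entropy(px) + h_y_given_x)) < 1e-12
```

Here H(X) = 1 bit while H(Y|X) < 1 bit because Y is correlated with X; it is this kind of split of total shared randomness that underlies rate-efficient distributed computation.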

Project Start:
Project End:
Budget Start: 2011-09-01
Budget End: 2016-08-31
Support Year:
Fiscal Year: 2011
Total Cost: $416,188
Indirect Cost:
Name: University of Maryland College Park
Department:
Type:
DUNS #:
City: College Park
State: MD
Country: United States
Zip Code: 20742