Wireless sensor networks have the potential to deliver significant new capabilities in various applications ranging from environmental health and safety to homeland security. However, these networks are severely constrained by limited bandwidth and power. Further, in many applications, life-critical anomalies must be detected and acted upon immediately, i.e., system delay must be kept at a minimum. To address these constraints, the investigator studies data processing and compression techniques in detail within a low-delay framework. These techniques yield higher throughput and longer network lifetime, which is especially valuable when the sensors are deployed in a one-time fashion and their batteries cannot be replaced.

In contrast with popular recent methods based on turbo and low-density parity-check codes, this research focuses on efficient distributed source coding algorithms that operate with very low delay. The building blocks for this purpose are coding schemes based on scalar quantization followed by scalar codeword assignment. This scalar coding methodology is then to be extended for protection against channel noise, channel loss, and unreliable sensors that either measure the data incorrectly or fail to function altogether, and to be integrated into distributed predictive and transform coding schemes. In contrast with the traditional use of prediction and transforms, coding performance cannot be maximized by removing all the correlation within each observed data sequence. Thus, a considerable portion of the overall effort is directed towards understanding and characterizing optimal prediction filters and transforms. Practical implementation issues, such as the tradeoff between increased stability and optimality in prediction filter design, are also to be studied.
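As a concrete illustration of this building block, the following sketch (Python/NumPy; the step size, number of cosets, and noise level are illustrative assumptions, not the project's design) shows a scalar quantizer followed by a modulo codeword assignment, with the decoder resolving the resulting ambiguity using a correlated side-information sample.

```python
import numpy as np

# Toy parameters (illustrative assumptions): quantizer step and number of cosets.
STEP = 0.25        # scalar quantizer step size
NUM_COSETS = 8     # distinct codewords per sample: log2(8) = 3 bits transmitted

def encode(x):
    """Scalar-quantize x, then send only the quantizer index modulo NUM_COSETS."""
    q_index = int(np.round(x / STEP))       # scalar quantization
    return q_index % NUM_COSETS             # scalar codeword (coset) assignment

def decode(coset, side_info):
    """Pick the reconstruction in the received coset closest to the side information."""
    base = int(np.round(side_info / STEP))  # index suggested by the side information
    candidates = [k for k in range(base - NUM_COSETS, base + NUM_COSETS + 1)
                  if k % NUM_COSETS == coset]
    best = min(candidates, key=lambda k: abs(k * STEP - side_info))
    return best * STEP

# Example: Sensor A's reading x and Sensor B's strongly correlated reading y.
rng = np.random.default_rng(0)
x = rng.normal()
y = x + 0.05 * rng.normal()
print(f"x = {x:.3f}, reconstructed at B = {decode(encode(x), y):.3f}")
```

Because only the coset index is transmitted, each sample costs log2(NUM_COSETS) bits regardless of the source's dynamic range, and decoding succeeds as long as the two sensors' readings stay within roughly half the coset spacing (NUM_COSETS * STEP / 2) of each other.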

Project Report

Wireless sensor networks have the potential to deliver significant new capabilities in various applications ranging from environmental health and safety to homeland security. However, these networks are severely constrained by bandwidth and power. In this project, the Principal Investigator and his research team conducted theoretical and experimental research to develop efficient data processing and compression techniques that reduce sensor communication, which, in turn, lowers bandwidth and power requirements and extends battery life. Efficient usage of the available bandwidth and communication power in wireless sensor networks leads to higher system throughput and a longer network lifetime. That, in turn, immensely benefits life-critical missions where the sensors are deployed in a one-time fashion and their batteries cannot be replaced. Unlike recent methods based on highly efficient but very delay-intensive coding algorithms, the distributed source coding methods developed in this project introduce very low delay and are therefore more applicable in scenarios involving delay-sensitive information, e.g., where some action must be taken in real time in response to a detected anomaly.

The following is a more detailed description of the specific project outcomes. We dealt mainly with the basic but important scenario in which Sensor A communicates its measurements to Sensor B, which has made its own prior measurements. The measurements are inherently correlated because of the relative proximity of the sensors.

1) Under mild assumptions, we showed that the optimal distributed source coding strategy is to apply periodic quantization, which reduces the problem to that of regular non-distributed coding.

2) The developed scalar coding methods were then integrated into distributed predictive and transform coding schemes. In contrast with the traditional use of prediction and transforms, coding performance cannot be maximized by removing all the correlation within each observed data sequence. Rather, optimal prediction filters and transforms may have to leave some correlation within each sequence for the sake of higher correlation between the prediction error or transform-domain signals (a small simulation of this tradeoff is sketched below). In fact, even for the simple model of Gauss-Markov sources, the optimal first-order prediction coefficient tends to be much smaller than the time-correlation coefficient. This is in contrast with non-distributed coding, where the prediction coefficient matches the correlation coefficient. The reason for this mismatch is twofold: first, a low prediction coefficient keeps the correlation between Sensor A and Sensor B intact, and second, it prevents the system from propagating the inevitable error that results from encoder-decoder mismatch.

3) Finally, we developed joint source-channel coding schemes that impose zero delay on the system. Since linear transmission is no longer optimal, even for Gaussian signals, when correlated measurements are available at the sensors, we developed highly nonlinear mappings directly from the source to the channel input (also illustrated below). These mappings are parametric and, when optimized, outperform the best known alternatives developed by researchers at the University of California, Santa Barbara.
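To make the tradeoff in outcome 2 concrete, the sketch below (Python/NumPy; the AR(1) coefficient, observation-noise level, and sample count are illustrative assumptions rather than the project's exact setup) measures, for a few candidate first-order prediction coefficients, both the residual variance at Sensor A and the correlation that the residual retains with Sensor B's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative model: Sensor A observes a unit-variance AR(1) Gauss-Markov
# process X, and Sensor B observes a noisy version Y of the same process.
rho, noise_std, n = 0.9, 0.3, 200_000
z = rng.normal(scale=np.sqrt(1 - rho**2), size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + z[t]            # AR(1) source (unit variance after transient)
y = x + noise_std * rng.normal(size=n)      # Sensor B's correlated observation

for a in (0.0, 0.5, rho):                   # candidate first-order prediction coefficients
    e = x[1:] - a * x[:-1]                  # prediction residual at Sensor A
    corr = np.corrcoef(e, y[1:])[0, 1]      # correlation left with Sensor B's data
    print(f"a = {a:.1f}: residual variance = {e.var():.3f}, "
          f"correlation with Sensor B = {corr:.3f}")
```

Choosing the coefficient equal to the correlation coefficient (0.9) minimizes the residual variance, as classical prediction theory dictates, but it also destroys most of the cross-sensor correlation; a smaller coefficient keeps that correlation available to the distributed decoder, which is consistent with the smaller optimal coefficients described in outcome 2.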
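Outcome 3 asserts that linear transmission is suboptimal once the decoder holds a correlated measurement. The sketch below (Python/NumPy; all parameters are invented, and the sawtooth-type mapping is a generic illustration rather than the project's optimized parametric mappings) compares plain amplify-and-forward transmission against a simple periodic mapping whose ambiguity is resolved at the decoder using Sensor B's measurement as side information.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative zero-delay setup: each source sample maps to one channel use,
# and the decoder also sees Sensor B's correlated measurement Y.
n       = 200_000
sigma_v = 0.05     # std of the A-B measurement difference
sigma_w = 0.1      # AWGN channel noise std
power   = 1.0      # channel input power constraint
delta   = 0.4      # period of the sawtooth (nonlinear) mapping

x = rng.normal(size=n)                       # unit-variance Gaussian source at Sensor A
y = x + sigma_v * rng.normal(size=n)         # side information at the decoder
w = sigma_w * rng.normal(size=n)             # channel noise

# --- Linear (amplify-and-forward) transmission with MMSE decoding ---
r_lin = np.sqrt(power) * x + w
num = np.sqrt(power) * r_lin / sigma_w**2 + y / sigma_v**2
den = 1.0 + power / sigma_w**2 + 1.0 / sigma_v**2
mse_linear = np.mean((num / den - x) ** 2)

# --- Nonlinear periodic (sawtooth) mapping ---
frac = (x + delta / 2) % delta - delta / 2   # periodic "fractional" part of x
gain = np.sqrt(power / np.mean(frac ** 2))   # scale to meet the power constraint
r_nl = gain * frac + w
frac_hat = r_nl / gain                       # noisy estimate of the fractional part
x_hat = frac_hat + np.round((y - frac_hat) / delta) * delta  # unwrap using Y
mse_sawtooth = np.mean((x_hat - x) ** 2)

print(f"linear   MSE: {mse_linear:.2e}")
print(f"sawtooth MSE: {mse_sawtooth:.2e}")
```

With strong inter-sensor correlation, the periodic mapping spends the entire power budget on resolving the fine detail within one period and lets the side information recover the coarse part, so its end-to-end MSE falls well below that of the linear scheme.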

Agency
National Science Foundation (NSF)
Institute
Division of Computing and Communication Foundations (CCF)
Application #
0643695
Program Officer
Phillip Regalia
Project Start
Project End
Budget Start
2007-07-01
Budget End
2013-06-30
Support Year
Fiscal Year
2006
Total Cost
$400,000
Indirect Cost
Name
University of California Riverside
Department
Type
DUNS #
City
Riverside
State
CA
Country
United States
Zip Code
92521