The goal of this project is to develop foundations for message-passing algorithms in the realm of probabilistic graphical models. The project will focus on three classes of problems in graphical models: (a) learning graphical models from observed data, (b) computing the mode of a graphical model, and (c) computing marginals of variables in a graphical model. The project will identify the strengths and limitations of the popular belief propagation algorithm for all three problems. In addition, the project will develop a new class of efficient message-passing algorithms with provable performance guarantees by exploiting the geometry of the graphical model. An illustrative sketch of belief propagation for marginal computation appears below.
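The following is a minimal sketch, not the project's own algorithm, of the sum-product form of belief propagation for problem (c), computing marginals. It runs on a small chain-structured pairwise model with binary variables; the node and edge potentials are hypothetical placeholders chosen only for illustration. On a tree-structured graph such as this chain, the resulting beliefs are the exact marginals.

```python
import numpy as np

# Hypothetical pairwise model on a chain x0 - x1 - x2 with binary variables.
node_pot = {0: np.array([1.0, 2.0]),
            1: np.array([1.0, 1.0]),
            2: np.array([3.0, 1.0])}
edge_pot = {(0, 1): np.array([[2.0, 1.0], [1.0, 2.0]]),
            (1, 2): np.array([[2.0, 1.0], [1.0, 2.0]])}
neighbors = {0: [1], 1: [0, 2], 2: [1]}

def pairwise(i, j):
    # Pairwise potential oriented so rows index x_i and columns index x_j.
    return edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T

# Messages m[(i, j)] from node i to node j, initialized uniformly.
msgs = {(i, j): np.ones(2) for i in neighbors for j in neighbors[i]}

for _ in range(10):  # a few sweeps suffice for convergence on a tree
    new = {}
    for (i, j) in msgs:
        others = [msgs[(k, i)] for k in neighbors[i] if k != j]
        incoming = np.prod(others, axis=0) if others else np.ones(2)
        # m_{i->j}(x_j) = sum_{x_i} psi_i(x_i) psi_{ij}(x_i, x_j) * incoming(x_i)
        m = pairwise(i, j).T @ (node_pot[i] * incoming)
        new[(i, j)] = m / m.sum()
    msgs = new

# Beliefs: normalized product of local potential and incoming messages.
for i in neighbors:
    b = node_pot[i] * np.prod([msgs[(k, i)] for k in neighbors[i]], axis=0)
    print(i, b / b.sum())
```

The same message-update rule applies on graphs with cycles, where it becomes the approximate "loopy" belief propagation whose strengths and limitations this project studies.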

Probabilistic graphical models have become a standard way to represent uncertainty succinctly in a wide variety of applications: communication and signal processing, computation, vision and image processing, bioinformatics, natural language processing, and more. The problems faced in these applications are largely

The research outcomes of this project will be folded into the course "Algorithms for Inference (6.438)" taught by the PI. The course is a popular entry-level graduate course that also disseminates material online through MIT OpenCourseWare; going forward, it may explore the possibility of wider dissemination through MITx/edX.

Project Report

In modern times, access to very large amounts of data has become possible. Making effective decisions using such data requires solving complex computational problems. In many scenarios, the problem boils down to performing computation over `network structured' data. With easy access to vast computational resources, a large amount of computation can be performed on this data; however, the computation has to be `localized' with respect to the `network structure' of the data, since all of the data cannot be accessed simultaneously by a single computational unit. To address this timely challenge, this project has developed a `local' computation algorithm for estimating the so-called stationary probability distribution of large Markov chains. This, in effect, is the problem that needs to be solved to search the world-wide web effectively.

Intellectually, this work has advanced the state of the art in local computation algorithms as well as efficient methods for computing the stationary distribution of large Markov chains. It has also advanced the understanding of `message-passing' algorithms, an emerging paradigm for computation in the presence of large amounts of data. The problem of computing the stationary distribution of a large Markov chain is central to a large number of disciplines, including statistics and mathematics, computer science and electrical engineering, operations research, experimental physics, and economics. In that sense, this work is likely to have a broad impact.
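To make the idea of `local' computation of a stationary probability concrete, the sketch below is one simple illustration, not the project's exact algorithm. It uses Kac's formula, pi(i) = 1 / E[return time to i], and estimates the expected return time by simulating truncated random walks started at the state of interest; the transition matrix, number of walks, and truncation length are hypothetical placeholders. The walk only ever touches states reachable from i, which is what makes the computation local.

```python
import random

def sample_next(state, P):
    """Sample the next state from a row-stochastic transition dict P."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against floating-point round-off

def local_stationary_estimate(P, i, num_walks=20000, max_len=1000):
    """Estimate pi(i) as 1 / (average truncated return time to state i)."""
    total = 0
    for _ in range(num_walks):
        state, t = i, 0
        while True:
            state = sample_next(state, P)
            t += 1
            if state == i or t >= max_len:  # truncation keeps each walk local
                break
        total += t
    return num_walks / total

if __name__ == "__main__":
    # Small 3-state chain used purely for illustration; its true pi(0) = 0.2.
    P = {0: {0: 0.5, 1: 0.5},
         1: {0: 0.25, 1: 0.25, 2: 0.5},
         2: {1: 0.5, 2: 0.5}}
    print(local_stationary_estimate(P, 0))
```

Truncating the walks at max_len biases the return-time estimate slightly downward, so in practice the truncation length is chosen large relative to the expected return time of the state being queried.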

Project Start:
Project End:
Budget Start: 2012-07-01
Budget End: 2014-06-30
Support Year:
Fiscal Year: 2012
Total Cost: $110,000
Indirect Cost:
Name: Massachusetts Institute of Technology
Department:
Type:
DUNS #:
City: Cambridge
State: MA
Country: United States
Zip Code: 02139