The gist of the proposal is the development of new Monte Carlo sampling methods in which the density to be sampled is preconditioned by a nested sequence of its marginals. The probability densities of the marginals are to be determined as the sampling proceeds, using an expansion in successive linkages similar to the one used in Kadanoff's real-space renormalization. Two implementations will be explored. In the first, the target density and a series of its marginals are sampled in parallel, with occasional swaps among the parallel computations; the shorter correlation times of the marginals are relied upon to accelerate convergence. In the second, a single sweep is performed from the smallest to the largest subset for which marginals are available, followed by a correction step based on an assignment of weights; this implementation has exactly zero temporal correlation time. It is not yet clear which of the two is more efficient; this presumably depends on the application. The first application will be a computer study of the three-dimensional Edwards-Anderson nearest-neighbor spin glass model. The second will be to filtering and data assimilation for stochastic partial differential equations. A more distant goal is the development of more efficient training techniques for neural networks.
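As a concrete illustration of the second implementation, the Python sketch below draws a sample of the coarsest marginal, extends it subset by subset with conditional proposals, and corrects the result with importance weights. It is a minimal sketch of sequential importance sampling over a nested marginal hierarchy under stated assumptions, not the proposal's method itself: the names weighted_sweep, sample_coarse, and extend_steps are hypothetical, and the marginal and conditional densities, which the proposal would determine adaptively through the linkage expansion, are assumed here to be supplied as callables.

import numpy as np

def weighted_sweep(sample_coarse, extend_steps, log_target, n_samples, seed=0):
    # One-sweep sampler over a nested sequence of marginals (hypothetical
    # interface).  sample_coarse(rng) draws the smallest variable subset
    # and returns (x, log_q), where log_q is the log-density of the draw.
    # Each entry of extend_steps takes (x, rng) and returns (x_extended,
    # log_q_step): a conditional proposal for the next subset of variables
    # together with its log-density.  log_target(x) is the log of the
    # full, possibly unnormalized, target density.
    rng = np.random.default_rng(seed)
    samples, log_w = [], []
    for _ in range(n_samples):
        x, log_q = sample_coarse(rng)
        for step in extend_steps:            # sweep: smallest -> largest subset
            x, lq = step(x, rng)
            log_q += lq
        samples.append(x)
        log_w.append(log_target(x) - log_q)  # weight-based correction step
    log_w = np.asarray(log_w)
    w = np.exp(log_w - log_w.max())          # stabilize before normalizing
    return samples, w / w.sum()              # self-normalized weights

Because each sweep starts from an independent coarse draw, successive weighted samples are mutually independent, which is the sense in which the correlation time is exactly zero; the price is weight variance, which the quality of the learned marginals controls. The first implementation would instead resemble replica exchange (parallel tempering) across the levels of the marginal hierarchy.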

In many problems of physics and statistics it is necessary to sample complicated probability distributions with a very large number of variables. Current methods often fail because the successive samples they produce are not sufficiently independent. The present proposal suggests solving this problem by creating a sequence of successively simpler problems, arranged so that sampling each one makes it easier to sample the next, harder one; the heart of the proposal is a methodology for making this procedure self-consistent. The first application of the idea, if it is successful, will be to the analysis of a spin glass model; this is a problem of great interest in materials science, and because it is known to be very hard to sample, it is a good testing ground for the methods proposed here. The next application will be to data assimilation, the problem that arises when one tries to make predictions on the basis of a partial theory and noisy observations, as one must in many fields, for example weather forecasting and economics; the difficulty of sampling large arrays of data is often a major roadblock in such situations. The spin glass model is closely related to models used in neural networks and in neurology, and a more distant goal is to apply the methods developed here in those areas.

Agency: National Science Foundation (NSF)
Institute: Division of Mathematical Sciences (DMS)
Application #: 0705910
Program Officer: Junping Wang
Budget Start: 2007-09-01
Budget End: 2011-08-31
Fiscal Year: 2007
Total Cost: $443,929
Name: University of California Berkeley
City: Berkeley
State: CA
Country: United States
Zip Code: 94704