Mathematical models based on probability and statistics are used in a wide variety of contexts, for example in artificial intelligence, finance and operations research, and the physical sciences. A central question of interest is how much trust can be placed in the predictions of a given model. This is crucial when, as often happens, there are significant uncertainties associated with the nature of the model itself. In this project, the investigators aim to develop a systematic mathematical framework based on the theory of information to address these issues. One focus of the research will be the prediction of the probability of rare (but potentially catastrophic) events. For example, if a given model predicts a catastrophic event to be a 100-year event, how do uncertainties in the model potentially change this prediction? The investigators will build corresponding stress tests to assess the effects of these uncertainties. The research will also provide systematic tools to train new statistical learning models with data and to provide performance guarantees.

This research will focus on the development of the probabilistic foundations of uncertainty quantification for complex systems and on related questions in statistical learning. The overarching goals are to provide computable performance guarantees when there is uncertainty in the model itself and to develop trustworthy and reliable inference algorithms. Many different metrics and information-theoretic measures are available to compare probability distributions (for example, the Kullback-Leibler divergence); a unifying theme of the project is to determine, in a principled manner, which of them is most appropriate for a specific task. In this context, a task consists of extracting information from the model by evaluating certain quantities of interest, such as average values, variances, or the probability of a rare event. Using variational principles, new optimal information inequalities will be derived to address these issues. From a robustness perspective, this allows the design of finely tuned stress tests: one builds neighborhoods of models around a given baseline model and computes worst-case scenarios, in the spirit of the stress tests used by financial institutions to protect against sudden changes under alternative scenarios. In the context of statistical learning, and especially approximate inference, the central challenges are 1) to select the right divergence to minimize as a means of learning probabilistic models, and 2) to provide performance guarantees for the learning process. The investigators will study these questions with emphasis on the case where the quantities of interest are rare (but potentially catastrophic) events. They will assess the impact of model uncertainty on these catastrophic events and on models with heavy tails. The project aims to provide mathematical foundations for performance guarantees in probabilistic algorithms used in a wide array of problems, from materials science and operations research to machine learning and artificial intelligence. The focus on reliable predictions of extreme and rare events makes the project timely and widely applicable. For example, the robust uncertainty quantification perspective provides worst-case solutions, stress tests, and bias control for safety-critical problems (such as rogue waves in the ocean or power grid failures). Furthermore, probabilistic performance guarantees for approximate inference can make existing black-box inference algorithms more trustworthy and transparent in a mathematically systematic manner.
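To make the variational-principle idea concrete, the following is a minimal sketch (not taken from the award itself) of the type of goal-oriented information inequality alluded to above. It starts from the Donsker-Varadhan/Gibbs variational formula for the Kullback-Leibler divergence and yields a computable bound on how much a quantity of interest f can change when the baseline model P is replaced by any alternative model Q in a Kullback-Leibler neighborhood of radius eta:

\[
D_{\mathrm{KL}}(Q\,\|\,P) \;=\; \sup_{g}\Big\{\,\mathbb{E}_{Q}[g] \;-\; \log \mathbb{E}_{P}\big[e^{g}\big]\Big\},
\qquad
\sup_{Q:\,D_{\mathrm{KL}}(Q\|P)\le \eta}\ \mathbb{E}_{Q}[f]
\;\le\;
\inf_{c>0}\left\{\frac{1}{c}\log \mathbb{E}_{P}\big[e^{c f}\big]+\frac{\eta}{c}\right\}.
\]

When f is the indicator of a rare event, the right-hand side reduces to a one-dimensional optimization, which is the essence of a model stress test: it bounds how likely the event could be under any model within the chosen neighborhood of the baseline. Below is a minimal Python sketch under these assumptions (a baseline event probability p and a Kullback-Leibler radius eta); the function name and parameters are illustrative and not part of the project.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def worst_case_rare_event_prob(p, eta):
    """Upper bound on the probability of a rare event under any alternative
    model within Kullback-Leibler divergence eta of a baseline model that
    assigns the event probability p.  Illustrative sketch only: it evaluates
    inf_{c>0} [log(1 + (e^c - 1) * p) + eta] / c, the bound obtained from the
    variational formula when f is the indicator of the event."""
    def bound(c):
        # log E_P[exp(c * f)] = log(1 + (e^c - 1) * p) for an indicator f
        return (np.log1p((np.exp(c) - 1.0) * p) + eta) / c
    res = minimize_scalar(bound, bounds=(1e-6, 200.0), method="bounded")
    return res.fun

# Example: a baseline "100-year event" (p = 0.01) under model uncertainty eta = 0.1
print(worst_case_rare_event_prob(0.01, 0.1))  # roughly 0.08, i.e. closer to a 12-year event
```

Optimizing over the single scalar c is what makes such a stress test cheap to evaluate: the only model-dependent ingredient is the cumulant generating function of the quantity of interest under the baseline model.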

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Mathematical Sciences (DMS)
Type: Standard Grant (Standard)
Application #: 2008970
Program Officer: Eun Heui Kim
Project Start:
Project End:
Budget Start: 2020-09-01
Budget End: 2023-08-31
Support Year:
Fiscal Year: 2020
Total Cost: $370,000
Indirect Cost:
Name: University of Massachusetts Amherst
Department:
Type:
DUNS #:
City: Hadley
State: MA
Country: United States
Zip Code: 01035