The investigator will develop computational methods that seek optimal decisions when not all information is known exactly. This situation arises when future events cannot be predicted with high accuracy or when quantities can only be estimated from limited data. A similar setting occurs when computer programs are employed to simulate processes; in that case, the computer output can be noisy, due to limited precision of the underlying numerical algorithms or due to randomness inherent in the simulation. An illustrative example is the deployment of solar or wind energy: electricity demand estimates based on past observations are uncertain, and weather forecast models are started from random perturbations of the initial conditions. Computational optimization algorithms exist that address data uncertainty, but current methods are not able to handle difficult nonlinear relationships in the mathematical optimization model. This research will overcome this limitation on two fronts. The first research project will result in optimization algorithms for problems in which constraints need to be satisfied only with a given probability. These methods will handle a much wider range of such constraints than the present state of the art. The second project will produce methods that optimize computer simulations by explicitly addressing the nature of the output noise. In contrast to existing approaches, these algorithms will not stagnate at spurious solutions induced by the noise. All new methods will be implemented in software and evaluated on real-life problems, and new mathematical theory will be developed to prove the convergence of these methods.

In this project, new algorithms for continuous chance-constrained optimization will be developed. In current approaches, the objective and constraint functions are required to be linear or convex, and the nonconvexity induced by the chance constraints is handled either by conservative convex approximations or by the global solution of discrete formulations via combinatorial branch-and-bound enumeration. The new methods will permit problem statements with nonlinear and nonconvex objective functions and with joint chance constraints defined by nonconvex constraint functions. This is made possible by seeking only local optima, which can be found more easily than global minima but are still highly valuable in practice. As a result, established techniques from nonconvex nonlinear optimization can be built upon and extended. The new sequential quadratic chance-constrained programming framework requires new chance-constrained trust-region subproblem solvers and convergence theory for chance-constrained penalty functions, both of which will be developed in this project.

The PI will also develop a derivative-free optimization method for objective functions with deterministic noise caused by numerical error in computer simulations. The approach is based on a smoothed objective function obtained via convolution with a Gaussian kernel. The convolution integral in the new objective is approximated by a Monte Carlo sample average approximation. Adaptive multiple importance sampling permits the reuse of the expensive function evaluations computed in all previous iterations.
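For concreteness, the problem class targeted by the first project can be written as a generic joint chance-constrained program; the symbols below (f, c_i, xi, alpha) are illustrative placeholders rather than notation taken from the proposal:

\[
  \min_{x \in \mathbb{R}^n} \; f(x)
  \qquad \text{subject to} \qquad
  \mathbb{P}\bigl[\, c_i(x,\xi) \le 0, \;\; i = 1,\dots,m \,\bigr] \;\ge\; 1 - \alpha,
\]

where \(\xi\) is a random vector, the constraints must hold jointly with probability at least \(1-\alpha\), and, in the setting considered here, \(f\) and the functions \(c_i\) may be nonlinear and nonconvex.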
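The smoothing idea behind the second project can be illustrated with a minimal sketch, written here in Python under simplifying assumptions: the smoothed objective is the convolution of the noisy objective with a Gaussian kernel, and the resulting expectation is replaced by a Monte Carlo sample average. The function names, the fixed sample size, and the test function are illustrative only, and the sketch redraws a fresh sample at every call instead of reusing past evaluations via adaptive multiple importance sampling as the proposed method would.

import numpy as np

def smoothed_objective(f, x, sigma=0.1, num_samples=200, rng=None):
    """Monte Carlo sample-average approximation of the Gaussian-smoothed
    objective f_sigma(x) = E[f(x + sigma * u)] with u ~ N(0, I).

    Illustrative sketch: a fresh sample is drawn at every call; the
    proposed method instead reuses earlier evaluations through adaptive
    multiple importance sampling (not shown here).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    # Draw Gaussian perturbations around the current point.
    u = rng.standard_normal((num_samples, x.size))
    # Average the (noisy) objective over the perturbed points.
    values = np.array([f(x + sigma * ui) for ui in u])
    return values.mean()

# Hypothetical test problem: a smooth quadratic corrupted by deterministic,
# high-frequency "numerical noise" of the kind produced by inexact simulations.
def noisy_quadratic(x):
    return float(np.dot(x, x) + 1e-3 * np.sin(1e3 * np.sum(x)))

print(smoothed_objective(noisy_quadratic, [1.0, -2.0]))

Minimizing such a smoothed surrogate rather than the raw simulation output reduces the risk of stagnating at the spurious local minima created by the noise, which is the failure mode the second project aims to avoid.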

Agency: National Science Foundation (NSF)
Institute: Division of Mathematical Sciences (DMS)
Type: Standard Grant (Standard)
Application #: 1522747
Program Officer: Leland Jameson
Budget Start: 2015-09-15
Budget End: 2019-08-31
Fiscal Year: 2015
Total Cost: $210,000
Name: Northwestern University at Chicago
City: Chicago
State: IL
Country: United States
Zip Code: 60611