The proposed research plan combines automatic differentiation (AD) and parallel computing, targeting large-scale inverse problems in engineering. AD is concerned with the computation of exact derivatives, up to arbitrary order. Recent AD software packages are just beginning to bring the practical benefits of AD into the mainstream of computational science. Nevertheless, despite significant progress on the efficient computation of sparse Jacobian and Hessian matrices, the reverse mode of AD remains impractical for large-scale problems involving dense derivative matrices, primarily because of its excessive memory demands. Inverse problems typically fall into this category. The research plan focuses on the efficient automatic generation of gradients for inverse problems in engineering in a parallel computing setting. Reverse mode is in principle optimally efficient for computing gradient vectors; in practice, however, its memory requirements (and the resulting swapping, etc.) are problematic. The PIs will develop the AD methodology in the parallel computing setting, where the available memory is much greater. Moreover, they will investigate the technique known as recursive checkpointing to greatly reduce memory requirements.
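To make the memory issue concrete, the following is a minimal, hypothetical sketch of tape-based reverse-mode AD in Python; the class and function names are illustrative and do not come from any AD package described above. The reverse sweep must retain the entire computation graph ("tape") built during the forward pass, which is exactly the storage cost that recursive checkpointing trades for recomputation.

```python
# Minimal tape-based reverse-mode AD sketch (illustrative names, not a real package).
class Var:
    """A scalar value that records its parents and local partial derivatives."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # tuples of (parent Var, local partial derivative)
        self.grad = 0.0

    def __mul__(self, other):
        # d(x*y)/dx = y, d(x*y)/dy = x
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        # d(x+y)/dx = 1, d(x+y)/dy = 1
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))


def backward(output):
    """Propagate adjoints from the output back to the inputs (the reverse sweep).

    Note that the whole parents graph must stay in memory until this sweep
    finishes -- the storage cost that checkpointing reduces by recomputing
    segments of the forward pass instead of storing them.
    """
    output.grad = 1.0
    order, seen = [], set()

    def visit(v):  # depth-first topological ordering of the tape
        if id(v) in seen:
            return
        seen.add(id(v))
        for parent, _ in v.parents:
            visit(parent)
        order.append(v)

    visit(output)
    for v in reversed(order):
        for parent, local in v.parents:
            parent.grad += local * v.grad


# f(x, y) = x*y + x  =>  df/dx = y + 1, df/dy = x
x, y = Var(3.0), Var(4.0)
z = x * y + x
backward(z)
print(x.grad, y.grad)  # -> 5.0 3.0
```

One reverse sweep yields the full gradient at a cost proportional to a single function evaluation, which is why reverse mode is in principle optimal for gradients; the tape it must store is what grows prohibitively for the large-scale problems targeted here.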

Agency: National Science Foundation (NSF)
Institute: Division of Advanced CyberInfrastructure (ACI)
Type: Standard Grant (Standard)
Application #: 9704685
Program Officer: John Van Rosendale
Project Start:
Project End:
Budget Start: 1997-05-01
Budget End: 1999-04-30
Support Year:
Fiscal Year: 1997
Total Cost: $46,200
Indirect Cost:
Name: Cornell University
Department:
Type:
DUNS #:
City: Ithaca
State: NY
Country: United States
Zip Code: 14850