Multi-fidelity computer modeling is emerging as a popular method for studying complex systems in engineering. It combines a large number of approximate (low-fidelity) simulations with a smaller number of detailed (high-fidelity) simulations to build prediction models and to support calibration and optimization. The research objective of this award is to develop a new statistical framework for the design, sampling, and modeling of such simulations. The research will result in tools and techniques general enough to apply to a broad array of engineering problems in which two computer models are available and one is more accurate but more expensive than the other. The research effort focuses on three developments: a new sampling scheme for estimating the expected values of the outputs of a set of multi-fidelity computer simulations; a new type of statistical design for efficiently running multi-fidelity computer simulations; and novel statistical methods for modeling multi-fidelity computer simulations with qualitative and quantitative factors. Challenging real-world problems from industry and national laboratories will be used to test and validate the developed results.
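As background on the core idea of combining a cheap code with an expensive one, the following is a minimal sketch, not the framework developed under this award. The two model functions, the linear discrepancy form, and all variable names are illustrative assumptions: the expensive code is emulated by rescaling the cheap code's output and adding a fitted low-degree polynomial correction.

```python
import numpy as np

# Hypothetical toy codes (illustrative only): the "cheap" code is a
# crude approximation of the "expensive" code.
def cheap_model(x):
    return np.sin(8.0 * x)                    # low-fidelity approximation

def expensive_model(x):
    return np.sin(8.0 * x) + 0.3 * x**2       # high-fidelity code

rng = np.random.default_rng(0)
x_lo = rng.uniform(0.0, 1.0, size=50)                     # many cheap runs
x_hi = np.sort(rng.choice(x_lo, size=8, replace=False))   # few expensive runs

# Fit a simple correction: expensive ~ rho * cheap + delta(x), with the
# discrepancy delta modeled here as a quadratic polynomial for simplicity.
y_lo_at_hi = cheap_model(x_hi)
y_hi = expensive_model(x_hi)
A = np.column_stack([y_lo_at_hi, np.ones_like(x_hi), x_hi, x_hi**2])
coef, *_ = np.linalg.lstsq(A, y_hi, rcond=None)

def predict(x):
    """Multi-fidelity prediction: scaled cheap output plus fitted discrepancy."""
    y = cheap_model(x)
    return coef[0] * y + coef[1] + coef[2] * x + coef[3] * x**2
```

The design choice illustrated here is that the expensive runs are placed at a subset of the cheap-run inputs, so both codes are observed at the same points when the correction is fitted; the nested designs discussed below formalize exactly this kind of run arrangement.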
If successful, the results of this research will provide engineers with a statistics-guided framework for efficiently conducting multi-fidelity computer simulations. Example applications include conceptual design, electronic cooling, hydrology, impact dynamics, material design, nanotechnology, oil reserve management, polymer electrolyte fuel cell manufacturing, thermal dynamics, and vehicle multi-body dynamics. Computer simulations are now widely used to address pressing issues facing the U.S. and the world, such as climate change, energy conservation, and renewable/clean energy innovation. The developed results will potentially enable researchers in these critical fields to use simulations to tackle problems of much larger scale. The research will be disseminated as open-source software to directly benefit users of multi-fidelity computer experiments and make a long-term impact. Graduate and undergraduate statistics and engineering students will benefit through involvement in the research and new course offerings.
This project has made significant contributions to the design and analysis of computer experiments, with special focus on experiments with multi-fidelity computer codes. Major contributions include: (1) a simple approach to emulation for computer models with qualitative and quantitative factors, with application to multistage assembly processes using integrated emulation; (2) approaches to constructing Sudoku-based and Samurai-based space-filling designs by exploiting three types of uniformity of Sudoku Latin squares; (3) construction of a series of new types of nested designs, including nested orthogonal array-based Latin hypercube designs, nested (nearly) orthogonal designs, and asymmetric nested lattice samples, intended for multi-fidelity computer experiments; (4) a framework for sequential design and analysis of a pair of high-accuracy and low-accuracy computer codes; (5) an experimental design approach that borrows Latin hypercube designs to reduce the variability of multifold cross-validation; (6) construction of a new type of space-filling design, called a correlation-controlled Latin hypercube design, intended for numerical integration; (7) methods for constructing several new classes of nested space-filling designs based on a new group projection and other algebraic techniques; and (8) a central limit theorem for orthogonal array-based designs that possess stratification in arbitrary multi-dimensions associated with orthogonal arrays of general strength. The effectiveness of these methods has been illustrated using various examples. Material from these methods has also been used to enrich several statistics courses at the University of Wisconsin-Madison.
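Several of the contributions above build on Latin hypercube designs. As background only, here is a minimal sketch of an ordinary (randomized, unoptimized) Latin hypercube sample; it does not implement the correlation-controlled or nested constructions developed in the project, and the function name and arguments are assumptions for illustration.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """A basic n-point Latin hypercube design in [0, 1]^d.

    Each dimension is cut into n equal strata, and each stratum
    receives exactly one point, placed uniformly at random within it.
    """
    rng = np.random.default_rng(seed)
    design = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                        # stratum labels 0..n-1
        design[:, j] = (perm + rng.uniform(size=n)) / n  # one point per stratum
    return design

X = latin_hypercube(10, 2, seed=0)
```

The one-point-per-stratum property is what gives the design its one-dimensional uniformity; the nested designs listed above additionally require that a small subdesign (for the expensive code) sit inside a larger one (for the cheap code) while both retain such stratification.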