Systematic methods are needed to quantify uncertainty in nanomanufacturing processes and, subsequently, to design processes that are robust to these uncertainties. Nanoscale phenomena present a new challenge for manufacturing due to their inherently stochastic dynamics, in addition to sensitivities to macroscopic process inputs such as temperature and pressure. Nevertheless, processes are being developed to take advantage of new discoveries and advances in nanoscience, and cost-effective engineering approaches and tools are needed to explore the design space more efficiently and develop nanotechnology-enabled products. In this work the PIs focus on the synthesis of metal nanoparticles, which are used in a wide range of applications from energy to medicine. For example, nanoparticles of controlled size and size distribution are needed to create high-performance catalysts for NOx treatment in diesel engines, which produce lower CO2 emissions than gasoline engines. However, developing a high-throughput manufacturing process that creates durable supported catalysts cost-effectively has been elusive, in part because of design tradeoffs such as higher performance but lower durability at smaller nanoparticle sizes. Moreover, significant variability exists both within a single batch of nanoparticles, due to the inherent distribution of particle nucleation times, and between batches, due to drift in operating conditions and noise variables.

This project develops a comprehensive methodology for robust optimization of a batch process, using various sources of information integrated by a rigorous Bayesian method. First, mechanistic models of mean process behavior, as is common in the engineering disciplines, will be developed. Since models of nanoscale phenomena are typically not accurate within manufacturing tolerances, the mechanistic models will be supplemented with stochastic components that link within- and between-batch variation to controllable process parameters and noise variables for robust process design. Expert opinions will be elicited on trends and expected variance to upgrade these models into a stochastic-mechanistic simulation tool. The simulated data it generates will be used to build a statistical-mechanistic model that is less complex than the simulation model and thus suitable for efficient exploration of process recipes. Physical data will then be collected according to optimal experimental design plans developed to validate and improve the statistical-mechanistic model. Finally, the refined model will be used to search cost-effectively for the optimized process recipe: the desired nanoparticle size with a narrow size distribution and minimal batch-to-batch variation.
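
As a rough illustration of the final recipe-search step, the sketch below fits a simple quadratic surrogate for the mean and batch-to-batch variance of particle size as functions of a single temperature variable, then minimizes a squared-bias-plus-variance robust-design loss. All data values, the surrogate form, and the one-variable recipe are illustrative assumptions, not the project's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical output of a stochastic-mechanistic batch simulator: mean particle
# size (nm) and batch-to-batch variance (nm^2) at candidate temperatures (C).
T = np.linspace(40, 100, 13)
mean_size = 15 + 0.45 * T + rng.normal(0.0, 1.0, T.size)        # placeholder trend
var_size = 4 + 0.002 * (T - 70) ** 2 + rng.normal(0.0, 0.2, T.size)

# Statistical-mechanistic surrogate: cheap quadratic models for mean and variance.
mean_coef = np.polyfit(T, mean_size, 2)
var_coef = np.polyfit(T, var_size, 2)

# Robust recipe search: penalize squared deviation of the predicted mean from
# the 40 nm target plus the predicted batch-to-batch variance.
target = 40.0
grid = np.linspace(T.min(), T.max(), 601)
loss = (np.polyval(mean_coef, grid) - target) ** 2 + np.polyval(var_coef, grid)
T_star = grid[np.argmin(loss)]
print(f"robust recipe: T = {T_star:.1f} C, "
      f"predicted mean size = {np.polyval(mean_coef, T_star):.1f} nm")
```

The variance term in the loss is what makes the search "robust": a recipe that hits the target on average but varies widely between batches is penalized relative to a slightly biased but more repeatable one.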

Intellectual merit. This methodology will bridge the current disconnect between robust design in statistics and mechanistic modeling in engineering. Incorporating all sources of information on mean behavior and variance requires domain-specific knowledge and mechanistic understanding, and this modeling of the mean and variance of process variables is what makes it possible to derive the recipe for a robust optimal process.

Broader impact. The PI team is uniquely equipped to develop this new methodology for robust process optimization, combining expertise in experiments, mechanistic modeling, process control, and experimental design with close industry collaborations. The diverse team of faculty and students (graduate, undergraduate, and high school) who participate in the project will gain experience and insight that will allow them to work in interdisciplinary nanomanufacturing environments.

Project Report

Designing a process for a new advanced technology is challenging because the process is not yet well understood and experiments are often costly in both time and money. The engineer initially performs a set of experiments to learn about the process, then uses the knowledge gained to perform additional experiments at modified conditions that should yield a more economical process. This research program focused on high-pressure nanoparticle deposition processes. New statistical methods were developed in the context of this application to design efficient experiments, guiding the engineer toward the best processing conditions with a minimum number of experiments.

The new methods for experimental design treat two distinct scenarios. The first is the case in which no experiments have yet been performed. Expert knowledge can still be used to design the first batch of experiments, bringing in hypotheses as well as past data on similar systems. In the second case, an initial set of experimental data is already available, and the goal is to design new batches of experiments sequentially, each time learning from the past data to zoom in on the optimal process conditions.

In nanoparticle deposition, the process conditions include temperature, pressure, the concentrations of multiple compounds, and the type of wafer and its surface treatment. The goal is to achieve a desired nanoparticle size distribution, tailored to the intended application. Two material systems and applications were studied in this project: silver nanoparticles for chemical sensing, and copper-zinc-tin-sulfide nanoparticles and films for solar energy applications. In both cases, carbon dioxide at high pressure was used to enable a high fluid concentration of the deposition species; the carbon dioxide also replaces organic solvents that are environmentally unfriendly. This type of high-pressure process has not been heavily studied in the past and its chemical mechanisms are not well understood, so it is a good case study for our experimental design methods for advanced technology.

The new method for initial experimental design was demonstrated experimentally, using past information from the silver nanoparticle deposition process. Four experts in the field were surveyed, giving predictions on iridium nanoparticle deposition for an otherwise similar process. A unified statistical model was constructed from the survey data and used to design a set of six experiments at various temperatures, with the goal of finding the temperature that achieves an iridium nanoparticle size of 40 nm. The method performed as intended, clustering experiments near the desired nanoparticle size. However, an unexpected new phase also appeared: larger structures composed of smaller particles. The appearance of a new structure was not addressed by the survey questions, which highlighted the importance of survey design in this approach.
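
One simple way to turn such a survey into a unified model is a precision-weighted opinion pool, sketched below with made-up elicitation numbers. The pooling rule, the candidate settings, and the design score are assumptions for illustration; the project's actual unified statistical model and design criterion are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical elicited survey: each of four experts predicts the mean iridium
# particle size (nm) and gives an uncertainty (sd) at each candidate temperature.
temps = np.arange(40.0, 121.0, 10.0)                     # candidate settings (C)
expert_means = 10 + 0.35 * temps + rng.normal(0, 3, (4, temps.size))
expert_sds = rng.uniform(2, 6, (4, temps.size))

# Precision-weighted pool: combine the four predictions at each setting into one
# predictive mean and sd (treating the experts as independent sources).
w = 1.0 / expert_sds**2
pooled_mean = (w * expert_means).sum(axis=0) / w.sum(axis=0)
pooled_sd = np.sqrt(1.0 / w.sum(axis=0))

# Initial six-run design: favor settings whose pooled prediction lies near the
# 40 nm target, standardized by the pooled uncertainty.
target = 40.0
score = np.abs(pooled_mean - target) / pooled_sd
batch1 = np.sort(temps[np.argsort(score)[:6]])
print("first batch temperatures (C):", batch1)
```
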
The method for sequential experimental design was demonstrated using the silver nanoparticle process. Based on an initial set of six experiments, multiple models were fit to the data to describe the relationship between temperature and mean nanoparticle size. A second batch of six experiments was then designed, balancing the need to spread experiments out against the need to cluster them near the desired nanoparticle size of 40 nm. This batch spanned the desired nanoparticle size but did not predict the size within the desired level of confidence, 5 nm. A third batch of six experiments was therefore designed, focused further around the target size; after these data were collected, the mean size was predicted with the desired level of confidence.
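
A minimal sketch of one such sequential step is below, assuming a straight-line model for size versus temperature and made-up first-batch data; the actual study fit multiple models and used a more refined criterion for trading spread against clustering near the target.

```python
import numpy as np

def predict_with_se(T_obs, y_obs, T_new, deg=1):
    """OLS polynomial fit; returns predicted mean size and its standard error."""
    X = np.vander(T_obs, deg + 1)
    beta, res, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
    s2 = res[0] / (len(y_obs) - (deg + 1))       # residual variance estimate
    cov = s2 * np.linalg.inv(X.T @ X)
    Xn = np.vander(T_new, deg + 1)
    return Xn @ beta, np.sqrt(np.einsum("ij,jk,ik->i", Xn, cov, Xn))

# Hypothetical first batch: six silver nanoparticle runs (temperature C, size nm).
T_obs = np.array([40.0, 55.0, 70.0, 85.0, 100.0, 115.0])
y_obs = np.array([22.0, 29.0, 34.0, 45.0, 51.0, 61.0])

target, tol = 40.0, 5.0
grid = np.linspace(T_obs.min(), T_obs.max(), 301)
mean, se = predict_with_se(T_obs, y_obs, grid)

# Locate the estimated 40 nm crossing and check the confidence goal there.
i_star = int(np.argmin(np.abs(mean - target)))
half_width = 1.96 * se[i_star]                   # ~95% CI half-width at target
print(f"estimated target temperature: {grid[i_star]:.1f} C, "
      f"CI half-width: {half_width:.2f} nm (goal < {tol} nm)")

# Next batch: cluster runs around the estimated crossing, with spread scaled to
# the remaining uncertainty so the model can still be checked off-target.
batch2 = np.clip(grid[i_star] + 4 * se[i_star] * np.linspace(-1.5, 1.5, 6),
                 T_obs.min(), T_obs.max())
print("second batch temperatures (C):", np.round(batch2, 1))
```

Repeating this fit-then-design loop until the CI half-width at the target falls below 5 nm mirrors the three-batch progression described above.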

Budget Start: 2009-09-01
Budget End: 2013-08-31
Fiscal Year: 2009
Total Cost: $415,011
Organization: Georgia Tech Research Corporation
City: Atlanta
State: GA
Country: United States
Zip Code: 30332