Replication of prior findings and results is a fundamental feature of science and is part of the logic supporting the claim that science is self-correcting. However, there is little prior research on the methodology for studying replication. Meta-analyses and systematic reviews that summarize collections of research studies are more common, but the question of whether the findings from a set of experimental studies replicate one another has received less attention. There is no clearly defined and widely accepted definition of a successful replication study, nor a statistical literature providing methodological guidelines on how to design a single replication study or a set of replication studies. The research proposed here builds this much-needed methodology. This project is funded by the Discovery Research PreK-12 Program, which funds research and development on STEM innovations and approaches.

The goal of this project is to formalize subjective ideas about the important concept of replication, provide statistical analyses for evaluating replication studies, establish properties for evaluating the conclusiveness of replication studies, and provide principles for designing conclusive and efficient programs of replication studies. It addresses three fundamental problems. The first is how to define replication: What, precisely, should it mean to say that the results in a collection of studies replicate one another? Second, given a definition of replication, what statistical analyses should be done to decide whether the studies in a collection replicate one another, and what are the properties of these analyses (e.g., sensitivity or statistical power)? Third, how should one or more replication studies be designed to provide conclusive answers to questions of replication? The project has the potential for impact on a range of empirical sciences by providing statistical tools to evaluate the replicability of experimental findings, assessing the conclusiveness of replication attempts, and developing software to help plan programs of replication studies that can provide conclusive evidence of the replicability of scientific findings.
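The abstract does not specify which statistical analyses the project will develop, but one conventional way to operationalize "do these studies replicate one another?" is to test the homogeneity of their effect-size estimates, for instance with Cochran's Q statistic. The sketch below is only illustrative, using hypothetical effect estimates and sampling variances rather than anything from the project itself.

```python
import math

def cochran_q(effects, variances):
    """Cochran's Q statistic for heterogeneity across k effect estimates.

    Q = sum_i w_i * (y_i - ybar)^2, where w_i = 1/v_i and ybar is the
    inverse-variance-weighted mean. Under the null hypothesis that all
    studies estimate a common effect, Q follows a chi-square distribution
    with k - 1 degrees of freedom.
    """
    weights = [1.0 / v for v in variances]
    ybar = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    return sum(w * (y - ybar) ** 2 for w, y in zip(weights, effects))

# Hypothetical standardized effect estimates and sampling variances
# from three studies of the same intervention.
effects = [0.42, 0.38, 0.45]
variances = [0.010, 0.015, 0.012]

q = cochran_q(effects, variances)
# The chi-square critical value with k - 1 = 2 df at alpha = 0.05 is 5.991;
# failing to reject homogeneity is one (debatable) criterion for replication.
consistent = q < 5.991
```

Note that "non-significant heterogeneity" is exactly the kind of informal replication criterion the project aims to scrutinize: with small samples the Q test has low power, so a non-rejection may reflect an inconclusive design rather than genuine replication.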

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Project Start:
Project End:
Budget Start: 2018-09-01
Budget End: 2021-08-31
Support Year:
Fiscal Year: 2018
Total Cost: $1,103,188
Indirect Cost:
Name: Northwestern University at Chicago
Department:
Type:
DUNS #:
City: Chicago
State: IL
Country: United States
Zip Code: 60611