When faced with a choice among alternative theories compatible with current experience, scientists, standard statistical techniques, and the most recent data-mining software all tend to side with the simplest theory, where simplicity has something to do with minimizing independent entities, principles, causes, or equational coefficients. This systematic bias toward simplicity has a name: Ockham's razor.

There are numerous explanations of the appeal of simple theories: they are pretty, unified, symmetrical, explanatory, severely testable, reduce the variance of empirical estimates, and are more probable with respect to off-the-shelf prior probability measures. What has proven much harder to explain is how favoring simple theories advances the search for a correct theory, compatible with all future observations. Of course, if the totality of experience were as simple as possible in light of current experience, Ockham's razor would point at an empirically adequate theory, but that is a circular appeal to Ockham's razor: what competent shaman's oracle fails to insist upon its own reliability? Indeed, every standard account of simplicity either fails to address this obvious concern or does so via a direct or tacit appeal to circularity.

This proposal concerns a new, non-circular justification, according to which Ockham's razor does not point at or indicate an empirically adequate theory but, rather, keeps inquiry on the straightest possible route to such a theory. 'Straightness' of convergence is measured by diachronic cognitive costs incurred prior to convergence to the true theory, such as the number of times the empirical method reverses its opinion as the data accumulate, how late these reversals occur, and the total number of times a false theory is produced.
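The retraction-counting idea admits a simple illustration. The sketch below uses a hypothetical "counting effects" toy problem (the scenario, learner names, and parameters are illustrative assumptions, not taken from the proposal): new empirical effects appear in the data stream one by one, the true theory is the final count of effects, an Ockham learner always conjectures the simplest count consistent with the data, and a rash learner gratuitously posits one unseen effect, retracting it only after the data have been quiet for a while. Over the same stream, the rash learner incurs strictly more reversals of opinion:

```python
def ockham(seen, t, last_new):
    # Ockham strategy: conjecture the simplest theory consistent with
    # the data, i.e. that exactly the effects seen so far exist.
    return len(seen)

def rash(seen, t, last_new, patience=3):
    # Non-Ockham strategy: gratuitously posit one extra, as-yet-unseen
    # effect, retracting it only after `patience` quiet stages.
    return len(seen) + (1 if t - last_new < patience else 0)

def run(learner, effect_times, horizon):
    """Feed a learner a stream in which new effects appear at the given
    stages; tally reversals of opinion (retractions) and the number of
    stages at which a false theory is output."""
    seen, last_new = set(), 0
    retractions = errors = 0
    prev = None
    truth = len(effect_times)  # the empirically adequate answer
    for t in range(horizon):
        if t in effect_times:
            seen.add(t)
            last_new = t
        guess = learner(seen, t, last_new)
        if prev is not None and guess != prev:
            retractions += 1
        if guess != truth:
            errors += 1
        prev = guess
    return retractions, errors

if __name__ == "__main__":
    stream = {2, 7, 13}  # stages at which new effects appear
    print("Ockham (retractions, errors):", run(ockham, stream, 30))
    print("Rash   (retractions, errors):", run(rash, stream, 30))
```

On this stream the Ockham learner reverses itself only when nature forces it to (once per new effect), while the rash learner also retracts its gratuitous extra posit, doubling its reversals without producing a false theory any less often.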

Some rigorous optimality theorems of the sort just described have already been established. The proposal is to generalize these results in various essential dimensions, to study more deeply their concrete consequences for practice, both contemporary and historical, and to perform diagnostic computer simulations to study the convergent efficiency of standard statistical and machine learning strategies for model selection. The proposed work, therefore, involves a balanced mix of conceptual refinement, mathematics, methodology, and simulation studies based on computer-generated random data.

The intellectual merit of developing a new, truth-directed explanation for one of the outstanding mysteries concerning scientific method is evident. No question could be more generally or deeply relevant to the mission of the National Science Foundation. Regarding broader impact, Ockham's razor is as much a puzzle for scientists, science students, and the broader public as it is for the philosophy of science. The proposed explanation is simple and inspiring enough, in its outlines, to serve as a standard, textbook explanation of the role of simplicity in science.

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Application #: 0750681
Program Officer: Frederick M Kronz
Budget Start: 2008-03-15
Budget End: 2011-02-28
Fiscal Year: 2007
Total Cost: $103,503
Name: Carnegie-Mellon University
City: Pittsburgh
State: PA
Country: United States
Zip Code: 15213