Decision makers in fields as diverse as business, industry, law enforcement, and military/political intelligence rely on expert forecasts to help make important decisions. The purpose of these forecasts is to communicate information about a target situation in a format that is useful for decision makers. Unfortunately, many expert forecasts are probabilistic in nature and rife with analytic uncertainty. For instance, an analyst may be uncertain about the accuracy and reliability of available evidence, uncertain about how the evidence fits together into a clear picture of the current situation, and uncertain about the likelihood of future developments based on this situation assessment. Surprisingly, there has been relatively little research focused on how best to communicate these forecasts to decision makers. One issue of particular concern is how experts should represent analytic uncertainty and how uncertainty information is understood in the context of supporting narrative information. In some contexts, decision makers have been found to rely heavily on numbers when they should not, and in other contexts they appear to be over-reliant on verbal or narrative information. In fact, warnings about over-reliance on particular information sources are captured in many popular sayings (e.g., "Don't believe everything you hear" and "Lies, damned lies, and statistics"). In reality, decision makers rely on both numerical and verbal information to the extent that they perceive the information to be diagnostic and accurate.

This research comprises nine experiments that examine the best ways of presenting analytic uncertainty, assess the reasoning strategies used by decision makers, and investigate how individual differences in numeracy and cognitive style affect the use of expert forecasts. With a better understanding of how decision makers reason about and use expert forecasts, we can identify ways to improve communication from experts to decision makers. For instance, these findings will allow us to identify the best ways of formatting narrative and numerical uncertainty so that decision makers can make optimal use of this information. These findings will have broad applications for communicating risk and probabilistic forecasts in many different domains.

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Type: Standard Grant (Standard)
Application #: 0925008
Program Officer: Robert E. O'Connor
Budget Start: 2009-09-15
Budget End: 2012-08-31
Fiscal Year: 2009
Total Cost: $279,541
Name: Decision Science Research Institute
City: Eugene
State: OR
Country: United States
Zip Code: 97401