The research team investigates several critical and previously unexplored issues in the cognitive processing and communication of weather forecast uncertainty. The project takes a novel experimental approach, directly comparing weather-related decisions made with and without uncertainty forecasts. The goal is to determine the circumstances under which uncertainty estimates are advantageous, even when they lead to decisions that are suboptimal from a rational perspective. These studies use actual forecasts and forecast expressions in complex, realistic decision tasks, testing specific advantages of uncertainty forecasts such as individualized decision making and increased trust in the forecast. The project makes important theoretical contributions by exploring the psychological processes that underlie "risk seeking" choices, the relationship between forecast error and loss of trust, and the tendency to "simplify" uncertainty forecasts by misinterpreting them as deterministic quantities.

Many decisions with important economic and safety consequences (such as whether to protect crops against frost damage or whether to evacuate communities threatened by floods or hurricanes) are based on forecasts that are inherently uncertain. Although it is now possible to assess that uncertainty, little of this information reaches the end user, in part because it was previously unknown whether lay users could make good use of uncertainty information. The project therefore makes practical contributions by exploring methods for communicating forecast uncertainty that overcome these obstacles. For instance, it tackles the largely untested question of whether visualization actually helps people understand uncertainty. These practical contributions are especially significant in the context of weather warnings, as the nation faces the consequences of climate change.

Project Report

This project investigated everyday users’ understanding of numeric uncertainty information (e.g., a 30% chance of a nighttime low temperature below freezing) and how it informs decision making. In general, numeric uncertainty information allowed people to make better weather-related decisions than "deterministic" forecasts implying a single outcome (e.g., the nighttime low temperature will be 32°F). A series of experimental studies demonstrated that people understood several forms of numeric uncertainty information: probabilities at a given threshold, as in the example above; predictive intervals, which give a range of values (e.g., temperatures) within which the observed value is expected with a specified probability (e.g., an 80% chance that the nighttime low temperature will be between 28 and 32 degrees); and odds ratios, which compare the odds of a weather event on a particular date to the climatological odds (e.g., subzero temperatures are 6 times more likely tonight than on a typical winter night). Although advantages were found for all of these expressions, evidence suggested that some situations favor specific expressions. For instance, with rare but destructive events, odds ratios tend to encourage people to take precautionary action whenever the odds are elevated. These experiments demonstrated that people understood numeric uncertainty estimates in the sense that they could use the information to better discriminate between situations that did and did not require precautionary action. This is not to say that study participants made economically optimal decisions or that they could have explained probability theory had we asked them to. Nonetheless, their decisions improved when forecasts included an indication of the amount of uncertainty involved.
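The odds-ratio expression described above can be made concrete with a small arithmetic sketch. The probabilities below are hypothetical values chosen only to reproduce the six-fold example from the report; they are not data from the study.

```python
# Illustrative arithmetic for the odds-ratio forecast expression.
# The probabilities here are hypothetical, not values from the study.

def odds(p):
    """Convert a probability to odds (p : 1 - p)."""
    return p / (1.0 - p)

def odds_ratio(p_event, p_climatology):
    """Compare the forecast odds of an event to the climatological odds."""
    return odds(p_event) / odds(p_climatology)

# A typical winter night: 5% chance of subzero temperatures.
# Tonight's forecast: 24% chance.
ratio = odds_ratio(0.24, 0.05)
print(f"Subzero temperatures are {ratio:.0f} times more likely (in odds) tonight.")
```

Note that a six-fold increase in odds is not the same as a six-fold increase in probability: here the probability rises from 5% to 24%, slightly less than a factor of five, while the odds rise exactly six-fold.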
Other benefits of numeric uncertainty information included identifying situations with greater uncertainty and maintaining greater trust in the forecast. Thus, this research suggests that in some cases more information is better than less. Indeed, the advantages of uncertainty forecasts were not diminished as decision complexity increased, so concerns about information overload, at least within the boundaries tested in these experiments, are ill-founded. We did note one error in interpretation related to visualizations of the predictive interval: people thought predictive interval visualizations depicted diurnal fluctuation, i.e., a deterministic forecast describing the range of temperatures over the course of the day. A number of different visualizations were tested, with the same results. We attributed this misinterpretation to a general psychological tendency to regard forecasts as deterministic whenever there is an opportunity to do so. Visualizations appear to provide such an opportunity, perhaps because people assumed they understood the graphic without bothering to read the key. When the visualization was removed, the error virtually disappeared. We concluded that there are some situations in which visualizations do more harm than good, particularly when uncertainty is involved. The only expression that was not well understood by study participants was the "return period" (e.g., the "100-year flood"). Results suggested that people thought it meant a single instance of the event was expected over the period described: they anticipated a lower likelihood of the forecasted event (e.g., a flood) if a similar event had recently occurred and a higher likelihood if it had not. However, participants given a probabilistic expression (a 1% chance per year) realized that the likelihood was the same regardless of the recency of a similar event, eliminating the bias.
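The contrast between the "return period" and the equivalent probabilistic expression can be sketched in a few lines, under the standard interpretation that a 100-year flood has a 1% chance of occurring in any given year, independently of whether a similar flood occurred recently:

```python
# A "100-year flood" means a 1% exceedance probability in any given year,
# independent of whether a similar flood occurred recently (the standard
# interpretation; year-to-year independence is assumed below).

def annual_probability(return_period_years):
    """Convert a return period to an annual exceedance probability."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years, horizon_years):
    """Chance of at least one such event over a horizon, assuming
    independent years: 1 minus the chance of zero events."""
    p = annual_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

p_year = annual_probability(100)     # 1% every year, regardless of history
p_30yr = prob_at_least_one(100, 30)  # roughly a 26% chance over 30 years
print(f"{p_year:.0%} per year; {p_30yr:.0%} chance over 30 years")
```

The second function makes the point behind the observed bias explicit: because each year's 1% chance is independent, a recent flood does not lower (or raise) next year's probability.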
In sum, we found that the majority of numeric uncertainty expressions were well understood and remarkably helpful to non-expert end users. Importantly, these advantages were observed without any special training or explanations; the information was presented in a simple format that could be used in a web-based display. Moreover, studies suggested that the advantages of numeric uncertainty estimates increased as error in the single-value forecast increased and were maintained in the face of specific errors such as misses and false alarms. We believe this is because people have intuitions about the uncertainty inherent in weather forecasts and therefore place greater trust in a forecast that acknowledges the uncertainty up front. In addition, this research suggests that people are more likely to comply with specific advice (such as evacuation) when it includes an uncertainty estimate. As a result, this research has important implications for weather warnings: it is better to provide accurate and truthful information to the public than to overstate the case, as is often done in emergencies. People are more likely to believe it and take appropriate precautions.

Agency
National Science Foundation (NSF)
Institute
Division of Social and Economic Sciences (SES)
Type
Standard Grant (Standard)
Application #
1023354
Program Officer
Robert E. O'Connor
Project Start
Project End
Budget Start
2010-09-15
Budget End
2014-08-31
Support Year
Fiscal Year
2010
Total Cost
$369,976
Indirect Cost
Name
University of Washington
Department
Type
DUNS #
City
Seattle
State
WA
Country
United States
Zip Code
98195