The past 30 years have seen major advances in many aspects of hurricane forecasting, but there has been little systematic research on the way coastal populations interpret the weather information that is communicated to them. To date, most evaluations of hurricane information have used reaction criteria (asking whether potential users like a display) rather than learning criteria (testing whether users understand a display) or performance criteria (testing whether a display changes users' decisions). However, there is a growing body of anecdotal evidence that many people misunderstand the displays meteorologists are providing. To better understand how people interpret hurricane forecasts and the uncertainties in those forecasts, this research will systematically examine the cognitive processes involved in hurricane tracking by conducting an evaluation of existing and novel hurricane information displays. The first task will assess the ways in which users interpret three basic elements of storm track information: the trailing track (where the storm has been), the forecast track (where it is most likely to go), and track uncertainty (how likely it is to deviate from the forecast track). Participants in different experimental conditions will observe simulated hurricanes described by these three basic elements. Some participants will be given information about only one basic element (e.g., forecast track only), others will be given information about two basic elements in combination (e.g., forecast track and uncertainty cone), and some will be given information about all three basic elements. By comparing participants across these information conditions, the research team will be able to gain insight into how each of the three basic elements affects people's expectations about storm tracks over time.
The second task will focus on the third basic element of hurricane track information (track uncertainty) by comparing a conventional uncertainty cone with five alternative track uncertainty displays: numeric probabilities, color-coded probabilities, terrain-coded probabilities, arrow glyphs, and dynamic tracks. Comparing the data from these six information conditions will allow us to determine whether any of the alternative displays provides a better way of conveying track uncertainty. Finally, the third task will use the findings from the first two tasks to design and evaluate new ways of visualizing storm information.
The proposed research will provide a rigorous assessment of the cognitive processes involved in hurricane tracking. Accordingly, it has implications for the cognitive psychology (especially judgment and decision making) of complex dynamic tasks. In addition, the project will have implications for instruction because there is very little research that addresses the problems of training adults to perform rarely performed, but critical, judgment tasks such as hurricane tracking. The project will provide meteorologists with a better understanding of the ways in which people interpret hurricane forecasts and the uncertainties in those forecasts. This improved understanding will allow them to communicate more effectively with coastal populations and reduce the probability that lives will be lost in hurricanes that deviate from their forecast tracks.
This project aimed to enhance public safety and promote better emergency response to hurricanes. We looked specifically at how public officials and other responders use predictive information in deciding on appropriate responses to an impending hurricane event. While this report describes the work conducted at Clemson, the project was a collaboration across two universities. Texas A&M was responsible for investigating the human response factors. Clemson was responsible for developing software tools for use in human studies and for exploring new ways of using computer graphics to enhance hurricane prediction displays. The Clemson team developed a new web-based software tool, called DynaSearch, for developing and running experiments designed to observe how people make use of various types of information when deciding how to respond to an approaching hurricane. The Texas A&M team has now used this software in a laboratory environment and is preparing a web-based field study using the new tool. DynaSearch is now accessible over the web and, after sufficient field testing, will be made available for general use. The Clemson team also developed and tested a new hurricane track prediction visualization tool, called the Track Forecast Ensemble display, which is meant to supplement the familiar National Hurricane Center "error cone" often seen in media weather forecasts. Instead of presenting a hurricane prediction as a single track surrounded by an "error cone", this new method attempts to give a clearer idea of the uncertainty underlying a multiple-day hurricane prediction. To do this, it continuously updates a computer display with possible hurricane tracks.
These tracks overlay each other and fade out with time, so the net effect is that the viewer sees a strongly drawn-over region of tracks where the hurricane is most likely to go, but also sees many possible tracks that deviate from the prediction. Our experiments have shown that, with our method, people are just as accurate in determining the most likely path of the hurricane as they are with the "error cone", but they also gain a stronger appreciation of the likelihood that the storm could deviate significantly from this prediction. Overall, the project has the potential to influence the way information about an oncoming hurricane is presented, and to affect the way that public officials are trained to make appropriate response decisions.
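The core idea of the display described above, an ensemble of plausible tracks drawn on top of one another, with each track's opacity decreasing with its age, can be illustrated in code. The sketch below is illustrative only: the report does not specify the implementation, so the random-walk track model, the linear fade, and all function names (`simulate_track`, `fade_alpha`) are assumptions, not the project's actual DynaSearch or Track Forecast Ensemble code.

```python
import math
import random

def simulate_track(start, heading, steps, jitter, rng):
    """One possible storm track: a walk whose heading drifts randomly,
    so ensemble spread (track uncertainty) grows with lead time.
    Purely illustrative; not the project's actual track model."""
    lon, lat = start
    track = [(lon, lat)]
    for _ in range(steps):
        heading += rng.gauss(0.0, jitter)
        lon += 0.5 * math.cos(heading)
        lat += 0.5 * math.sin(heading)
        track.append((lon, lat))
    return track

def fade_alpha(age, fade_steps):
    """Opacity for a track drawn `age` frames ago: 1.0 for the newest
    track, fading linearly to 0.0 after `fade_steps` frames."""
    return max(0.0, 1.0 - age / fade_steps)

rng = random.Random(42)
tracks = [simulate_track((-80.0, 25.0), math.pi / 4, 20, 0.15, rng)
          for _ in range(30)]

# At any moment the display shows the most recent `fade_steps` tracks,
# each paired with an opacity that decreases with its age. Overlapping
# bright tracks form the dense "most likely" region; faded outliers
# convey how far the storm could deviate.
fade_steps = 10
frame = [(track, fade_alpha(age, fade_steps))
         for age, track in enumerate(reversed(tracks[-fade_steps:]))]
```

In a real renderer, each `(track, alpha)` pair would be drawn as a polyline with that opacity every frame, so new tracks continuously appear while old ones fade, which matches the continuously updating behavior the report describes.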