Although comprehension is well recognized as a critical component of the survey question response process, much about it remains unknown. Past research has shown that ambiguous or vague concepts can be clarified through the use of definitions, instructions, or examples, but respondents do not necessarily attend to these clarifications. The aim of this doctoral dissertation research project is to investigate where and how to present clarifying information so that respondents will recognize it as essential to answering survey questions correctly. A key issue is whether sensory channels (aural versus visual) make different demands on comprehension. The answer to this question is largely unknown because sensory channel is confounded with the presence of an interviewer in many of the relevant studies. This project will help to establish whether respondents anticipate the end of a question and are therefore more likely to interrupt clarifying information placed after the question than before it, and whether this harms survey estimates. It will also help to determine whether incorporating the clarifications into the questions and asking a series of simpler questions, as suggested by many researchers, is even more effective. Finally, the project will shed light on whether respondents comprehend complex clarifications better in the visual channel than in the aural channel, and whether channel interacts with the method of clarification.

The goal of this project, to gain a better understanding of how to reduce ambiguity and vagueness in survey questions across survey modes, is especially relevant given the current debate over how to design questions for mixed-mode surveys. The project has the potential to lead to more accurate survey estimates, better descriptions of the nation based on survey data, and better policy decisions. As a Doctoral Dissertation Research Improvement award, this grant provides support to enable a promising student to establish a strong independent research career. The project is supported by the Methodology, Measurement, and Statistics Program and a consortium of federal statistical agencies as part of a joint activity to support research on survey and statistical methodology.

Project Report

Researchers Study Better Ways of Asking Questions for Federal Surveys

Each year millions of Americans are asked important questions in federal surveys that are meant to measure and monitor key characteristics of the nation. For example, the decennial census asks respondents how many people are living or staying in their household, and the NSF is interested in finding out how many graduate students are engaged in science and engineering activities. Such questions often require further clarification (in the form of definitions, instructions, or examples) for respondents to understand them correctly. However, it is unclear how to provide this clarification effectively. This project investigated whether clarifying information should come before or after a question, or whether it should instead be incorporated into a series of questions. In addition, surveys increasingly mix modes of administration because some people prefer to read and answer the survey for themselves (for example, in Web surveys), while others prefer to listen to the questions being read to them (for example, in telephone surveys). We therefore also investigated whether clarifying information should be presented differently for readers and listeners.

We found that in both modes of administration, respondents were least likely to follow the clarifications when they were presented after the question, more likely to follow them when they were placed before the question, and most likely to follow them when they were incorporated into a series of questions. An important implication of this finding is that the current practice in some surveys of asking multiple questions in the aural mode but a single question in the visual mode may not be the most effective approach; instead, the questions should be posed similarly in both modes. Ultimately, this research should lead to more accurate survey estimates, better descriptions of the nation based on survey data, and better decision making based on surveys.

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Type: Standard Grant
Application #: 1024244
Program Officer: Cheryl Eavey
Budget Start: 2010-09-01
Budget End: 2011-08-31
Fiscal Year: 2010
Total Cost: $12,000
Name: University of Maryland College Park
City: College Park
State: MD
Country: United States
Zip Code: 20742