The rapid acceptance of the World Wide Web as a vehicle for survey data collection raises important questions about how the new method works. Key features of Web surveys include the rich visual presentation of questions and the capability for interaction with the respondent. The rapid growth of the Web makes a close examination of these issues even more urgent. Neither set of features has been explored thoroughly even for earlier modes, and the Web offers vastly expanded resources for both visual display (Web questionnaires can readily incorporate still pictures or video clips) and interaction (such as floating screens and scrolling for help with definitions). Our application outlines a set of studies designed to address key questions about these issues. The studies focus on Web surveys, but we believe that the results would generalize to other modes of data collection that rely on visual presentation or incorporate interactive design features.

Experiments 1-5 examine how respondents interpret the visual cues in Web questionnaires. These studies test the general proposition that incidental features of the presentation of the questions (for example, the spacing of the response options or the colors assigned to different response options) can give rise to unintended inferences about their meaning. The studies test predictions derived from a theoretical framework that assumes respondents use simple interpretive heuristics to assign meaning to visual features of the questions. The next two experiments examine the effects of including images as a supplement to the text of a question. Images are necessarily concrete, and Experiment 6 tests the hypothesis that this concreteness may lead respondents to interpret questions more narrowly when they are accompanied by images. Experiment 7 tests the idea that the item depicted in an image may serve as a standard of comparison for respondents' judgments. The results of these studies will lead to practical guidelines about the dangers involved in using images as an adjunct to verbal questions.

The final series of studies examines when respondents are likely to take advantage of interactive features of a questionnaire. These experiments test three general hypotheses: respondents are more likely to use the information available to them interactively when (1) the information is easy to obtain, (2) it is clearly helpful, and (3) they are highly motivated to seek help. These six experiments would yield a better understanding of methods for getting respondents to use features that could produce better survey data.

Agency
National Institutes of Health (NIH)
Institute
Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD)
Type
Research Project (R01)
Project #
5R01HD041386-02
Application #
6743701
Study Section
Special Emphasis Panel (ZRG1-RPHB-4 (01))
Program Officer
Bachrach, Christine
Project Start
2003-05-01
Project End
2006-04-30
Budget Start
2004-05-01
Budget End
2005-04-30
Support Year
2
Fiscal Year
2004
Total Cost
$201,780
Indirect Cost
Name
University of Michigan Ann Arbor
Department
Biostatistics & Other Math Sci
Type
Organized Research Units
DUNS #
073133571
City
Ann Arbor
State
MI
Country
United States
Zip Code
48109
Couper, Mick P; Zhang, Chan (2016) Helping Respondents Provide Good Answers in Web Surveys. Surv Res Methods 10:49-64
Couper, Mick P; Tourangeau, Roger; Conrad, Frederick G et al. (2013) The Design of Grids in Web Surveys. Soc Sci Comput Rev 31:322-345
Couper, Mick P; Kennedy, Courtney; Conrad, Frederick G et al. (2011) Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys. J Off Stat 27:65-85
Peytchev, Andy; Conrad, Frederick G; Couper, Mick P et al. (2010) Increasing Respondents' Use of Definitions in Web Surveys. J Off Stat 26:633-650
Conrad, Frederick G; Couper, Mick P; Tourangeau, Roger et al. (2010) The impact of progress indicators on task completion. Interact Comput 22:417-427
Galesic, Mirta; Tourangeau, Roger; Couper, Mick P et al. (2008) Eye-Tracking Data: New Insights on Response Order Effects and Other Cognitive Shortcuts in Survey Responding. Public Opin Q 72:892-913