The goal of this project is to discover principles for improving interface designs for web surveys and similar applications. Although we focus on web surveys, we believe our studies will have broader implications for the design of web pages that convey information as well as capture it, and for other interfaces as well. Our proposed studies examine three areas: screen design, methods for conveying definitions, and differences between experts and novices. All three areas raise important theoretical and practical questions, and all three have received relatively little prior research attention.
Our first aim is to determine the principles governing the best arrangement of material on a web page. Users prefer intermediate levels of visual complexity, but other features, such as a logical arrangement of the visual elements, are more critical than sheer complexity.
Our second aim is to determine the most effective methods for conveying definitions and other instructions. Prior research suggests that respondents often ignore definitional material for a variety of reasons: they are reluctant to expend the effort to attend to it, they may not think they need it, and they may be unwilling to apply the definition when, in fact, it would be useful. There has been little work on how to improve the effectiveness of definitions. We examine several approaches: stating a general rule, mentioning specific examples of the category, and avoiding definitions altogether by asking very specific questions.
Our third aim is to investigate differences between expert and novice users. Many interfaces are designed to accommodate novice or even first-time users and thus may include features that are inefficient for experts. In addition, experts may fall into habits that are reasonably effective but not optimal. We propose four experiments to contrast the performance of expert and novice users, to demonstrate the poor performance of experts with many interfaces, and to develop methods for accommodating both types of users.

Our basic method for exploring each of these issues is to conduct experiments embedded in large-scale, realistic web surveys. In addition to the answers to the survey questions, we plan to collect various types of "paradata", such as response times and patterns of mouse clicks. We also propose several follow-up experiments that use eye-tracking equipment to clarify what information respondents actually attended to as they completed the surveys.

This project is motivated by the explosive growth of the web as a vehicle for collecting and disseminating public health information. Our studies will help public health researchers gather more accurate information; in addition, our results will be useful to health workers who are using the web to convey public health information or to deliver health interventions.
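To make the paradata concrete, the sketch below shows one way the two streams named in the abstract (response times and mouse-click patterns) could be captured on the client side of a web survey page. It is a minimal illustration only, not the project's instrumentation; the names (collectParadata, SurveyParadata) and the /paradata endpoint are hypothetical.

```typescript
// Hypothetical client-side paradata capture for one survey page.
// Records time from page display to submission and the pattern of
// mouse clicks, then sends them separately from the survey answers.

interface ClickRecord {
  x: number;       // viewport x-coordinate of the click
  y: number;       // viewport y-coordinate of the click
  target: string;  // tag name of the clicked element
  t: number;       // milliseconds since the page was shown
}

interface SurveyParadata {
  pageId: string;
  responseTimeMs: number | null;  // filled in when the page is submitted
  clicks: ClickRecord[];
}

function collectParadata(pageId: string): () => SurveyParadata {
  const shownAt = performance.now();
  const data: SurveyParadata = { pageId, responseTimeMs: null, clicks: [] };

  const onClick = (e: MouseEvent) => {
    data.clicks.push({
      x: e.clientX,
      y: e.clientY,
      target: (e.target as Element | null)?.tagName ?? "UNKNOWN",
      t: Math.round(performance.now() - shownAt),
    });
  };
  document.addEventListener("click", onClick);

  // Returned function: call when the respondent submits the page.
  return function finish(): SurveyParadata {
    data.responseTimeMs = Math.round(performance.now() - shownAt);
    document.removeEventListener("click", onClick);
    // sendBeacon delivers the paradata even if the page unloads next.
    navigator.sendBeacon("/paradata", JSON.stringify(data));
    return data;
  };
}

// Usage (hypothetical): start capture when the question page renders,
// finish when the respondent submits the form.
// const finish = collectParadata("Q12");
// form.addEventListener("submit", () => finish());
```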

Agency
National Institutes of Health (NIH)
Institute
Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD)
Type
Research Project (R01)
Project #
5R01HD041386-07
Application #
7858185
Study Section
Social Psychology, Personality and Interpersonal Processes Study Section (SPIP)
Program Officer
Bures, Regina M
Project Start
2007-05-01
Project End
2012-04-30
Budget Start
2010-05-01
Budget End
2012-04-30
Support Year
7
Fiscal Year
2010
Total Cost
$263,809
Indirect Cost
Name
University of Michigan Ann Arbor
Department
Biostatistics & Other Math Sci
Type
Organized Research Units
DUNS #
073133571
City
Ann Arbor
State
MI
Country
United States
Zip Code
48109
Couper, Mick P; Zhang, Chan (2016) Helping Respondents Provide Good Answers in Web Surveys. Surv Res Methods 10:49-64
Couper, Mick P; Tourangeau, Roger; Conrad, Frederick G et al. (2013) The Design of Grids in Web Surveys. Soc Sci Comput Rev 31:322-345
Couper, Mick P; Kennedy, Courtney; Conrad, Frederick G et al. (2011) Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys. J Off Stat 27:65-85
Peytchev, Andy; Conrad, Frederick G; Couper, Mick P et al. (2010) Increasing Respondents' Use of Definitions in Web Surveys. J Off Stat 26:633-650
Conrad, Frederick G; Couper, Mick P; Tourangeau, Roger et al. (2010) The impact of progress indicators on task completion. Interact Comput 22:417-427
Galesic, Mirta; Tourangeau, Roger; Couper, Mick P et al. (2008) Eye-Tracking Data: New Insights on Response Order Effects and Other Cognitive Shortcuts in Survey Responding. Public Opin Q 72:892-913