The goal of this project is to discover principles for improving interface designs for web surveys and similar applications. Although we focus on web surveys, we believe our studies will have broader implications for the design of web pages that convey information as well as capture it, and for other interfaces as well. Our proposed studies examine three areas: screen design, methods for conveying definitions, and differences between experts and novices. All three areas raise important theoretical and practical questions, and all three have been the subject of relatively little prior research.
Our first aim is to determine the principles governing the best arrangement of material on a web page. Users prefer intermediate levels of visual complexity, but other features (such as a logical arrangement of the visual elements) are more critical than sheer complexity.
Our second aim is to determine the most effective methods for conveying definitions and other instructions. Prior research suggests that respondents often ignore definitional material for a variety of reasons: they are reluctant to expend the effort to attend to it, they may not think they need it, and they may be unwilling to apply the definition when, in fact, it would be useful. There has been little work on how to improve the effectiveness of definitions. We examine several approaches: stating a general rule, mentioning specific examples of the category, and avoiding definitions altogether by asking very specific questions.
Our third aim is to investigate differences between expert and novice users. Many interfaces are designed to accommodate novice or even first-time users and thus may include features that are inefficient for experts. In addition, experts may fall into habits that are reasonably effective but not optimal. We propose four experiments to contrast the performance of expert and novice users, to demonstrate the poor performance of experts with many interfaces, and to develop methods for accommodating both types of users.

Our basic method for exploring each of these issues is to conduct experiments embedded in large-scale, realistic web surveys; in addition to the answers to the survey questions, we plan to collect various types of "paradata" (such as response times and patterns of mouse clicks). We also propose several follow-up experiments that use eye-tracking equipment to clarify what information respondents actually attended to as they completed the surveys.

This project is motivated by the explosive growth of the web as a vehicle for collecting and disseminating public health information. Our studies will help public health researchers gather more accurate information; in addition, our results will be useful to health workers who are using the web to convey public health information or to deliver health interventions.
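The "paradata" mentioned above (response times and patterns of mouse clicks) are typically captured client-side as timestamped event logs. The sketch below is purely illustrative and assumes nothing about the project's actual instrumentation; the `ParadataRecorder` name and its methods are invented for this example. An injectable clock keeps the recorder testable outside a browser.

```javascript
// Minimal sketch of client-side paradata capture for a web survey.
// All names here are illustrative assumptions, not the project's actual code.

class ParadataRecorder {
  // `now` is an injectable millisecond clock, so the recorder can be
  // exercised outside a browser; defaults to wall-clock time.
  constructor(now = () => Date.now()) {
    this.now = now;
    this.start = now();   // page-load reference point
    this.events = [];     // { type, detail, elapsedMs } entries
  }

  // Log one event (e.g. 'click', 'answer-change') with elapsed time
  // since the recorder was created.
  record(type, detail = {}) {
    this.events.push({ type, detail, elapsedMs: this.now() - this.start });
  }

  // Response time proxy: elapsed ms of the first event of a given type,
  // or null if no such event was recorded.
  firstElapsed(type) {
    const ev = this.events.find(e => e.type === type);
    return ev ? ev.elapsedMs : null;
  }
}

// In a real survey page, one would wire this to DOM listeners, e.g.:
//   const rec = new ParadataRecorder();
//   document.addEventListener('click',
//     e => rec.record('click', { x: e.clientX, y: e.clientY }));
// and ship rec.events to the server alongside the survey answers.
```

The design choice of storing elapsed times rather than absolute timestamps sidesteps clock-skew issues across respondents' machines, since only durations relative to page load are analyzed.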

National Institutes of Health (NIH)
Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD)
Research Project (R01)
Study Section: Social Psychology, Personality and Interpersonal Processes Study Section (SPIP)
Program Officer: Bures, Regina M
University of Michigan Ann Arbor
Biostatistics & Other Math Sci
Organized Research Units
Ann Arbor
United States
Couper, Mick P; Tourangeau, Roger; Conrad, Frederick G et al. (2013) The Design of Grids in Web Surveys. Soc Sci Comput Rev 31:322-345
Couper, Mick P; Kennedy, Courtney; Conrad, Frederick G et al. (2011) Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys. J Off Stat 27:65-85
Conrad, Frederick G; Couper, Mick P; Tourangeau, Roger et al. (2010) The impact of progress indicators on task completion. Interact Comput 22:417-427
Peytchev, Andy; Conrad, Frederick G; Couper, Mick P et al. (2010) Increasing Respondents' Use of Definitions in Web Surveys. J Off Stat 26:633-650
Galesic, Mirta; Tourangeau, Roger; Couper, Mick P et al. (2008) Eye-Tracking Data: New Insights on Response Order Effects and Other Cognitive Shortcuts in Survey Responding. Public Opin Q 72:892-913