Standardized interviewing procedures require survey interviewers to read questions exactly as worded and to offer only neutral, non-directive probes when respondents ask for clarification. Although many major surveys in the government, non-profit, and private sectors use standardized interviewing to minimize the effects of interviewers on data quality, a substantial body of research indicates that interviewers using standardized interviewing still influence the responses that individuals provide. This variability among interviewers (interviewer variance) in the survey responses collected reduces the precision of survey estimates and therefore has direct cost implications for survey data collection. Conversational interviewing addresses respondent requests for clarification more effectively: interviewers are trained to read each question as worded initially and then to say whatever is required to help the respondent understand it. Although existing research shows that conversational interviewing produces noticeable decreases in the measurement error bias of survey estimates, survey researchers (and government agencies in particular) have been hesitant to adopt it in practice, partly because of increased questionnaire administration time and partly out of concern that the conversational style will increase interviewer effects on the survey data.

This research project will compare the interviewer variance, bias, and mean squared error of a variety of survey estimates produced by these two face-to-face interviewing techniques. It will decompose the total interviewer variance introduced by each technique into measurement error variance (i.e., variance among interviewers in systematic measurement errors) and nonresponse error variance (i.e., variance among interviewers in the types of individuals recruited for the survey). To meet these objectives, the study will select a large random sample of persons from a unique economic database that contains known values for selected economic characteristics of interest. Random subsets of this sample will be assigned to professional interviewers who have themselves been randomly assigned to receive training in one of the two interviewing techniques. The two groups of interviewers will then administer a face-to-face survey to their assigned subsamples, collecting information on the sampled persons' economic characteristics.

Analyses of the collected data will employ innovative statistical modeling techniques that allow the interviewer variance, bias, and mean squared error of a variety of economic survey estimates to be compared across the two groups of interviewers, and that allow the total interviewer variance in each group to be decomposed into the two sources of variance described above. These analyses will provide survey researchers with the first empirical evidence of how the two techniques differ in the overall quality of the survey estimates they produce and will identify sources of the interviewer effects that can arise under each interviewing style.
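As a rough illustration of the quantities involved, the sketch below (Python) shows one way that measurement-error bias, between-interviewer variance, and the mean squared error of an estimated mean might be summarized when validated ("true") values are available for each respondent, as they are in this study's design. The column names, toy data, and simple one-way ANOVA variance-component estimator are illustrative assumptions only, not the project's actual estimators, which will rely on more elaborate statistical models.

```python
# Minimal sketch (assumed column names and a crude ANOVA-type estimator, not the
# project's actual methodology) of summarizing bias, between-interviewer variance
# in measurement errors, and MSE when validated true values are available.
import numpy as np
import pandas as pd

def summarize_technique(df: pd.DataFrame) -> pd.Series:
    """Summarize measurement-error bias and interviewer variance for one technique."""
    err = df["y_reported"] - df["y_true"]          # respondent-level measurement error
    bias = err.mean()                              # average measurement error (bias)

    # Crude one-way ANOVA (method-of-moments) estimate of the between-interviewer
    # variance in measurement errors; assumes roughly equal workloads per interviewer.
    groups = err.groupby(df["interviewer"])
    n_bar = groups.size().mean()                   # average workload per interviewer
    msw = groups.var(ddof=1).mean()                # within-interviewer mean square
    msb = n_bar * groups.mean().var(ddof=1)        # between-interviewer mean square
    sigma2_int = max((msb - msw) / n_bar, 0.0)     # between-interviewer variance component

    # Approximate MSE of the mean of reported values as an estimator of the true mean:
    # squared bias plus the sampling variance of the mean.
    n = len(df)
    var_mean = df["y_reported"].var(ddof=1) / n
    mse = bias**2 + var_mean
    return pd.Series({"bias": bias, "sigma2_interviewer": sigma2_int, "mse_of_mean": mse})

# Example usage with a toy dataset (two techniques, 10 interviewers each):
rng = np.random.default_rng(0)
toy = pd.DataFrame({
    "technique": np.repeat(["standardized", "conversational"], 200),
    "interviewer": np.tile(np.repeat(np.arange(10), 20), 2),
    "y_true": rng.normal(50, 10, 400),
})
toy["y_reported"] = toy["y_true"] + rng.normal(1.0, 3.0, 400)
print(toy.groupby("technique").apply(summarize_technique))
```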

Survey methodology, the scientific study of survey data collection, is still a relatively young field, yet billions of dollars in public resources are dedicated to survey data collection every year. This study will provide survey researchers with a more complete set of empirical evidence for making informed decisions about which face-to-face interviewing style yields survey estimates of the highest overall quality. Knowing which interviewing technique produces higher-quality estimates will therefore lead to higher-quality information being collected in surveys that employ face-to-face interviewing and to higher-quality decisions by policy makers who use survey data.

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Type: Standard Grant (Standard)
Application #: 1324689
Program Officer: Cheryl Eavey
Project Start:
Project End:
Budget Start: 2013-09-15
Budget End: 2017-08-31
Support Year:
Fiscal Year: 2013
Total Cost: $374,151
Indirect Cost:
Name: Regents of the University of Michigan - Ann Arbor
Department:
Type:
DUNS #:
City: Ann Arbor
State: MI
Country: United States
Zip Code: 48109