Probability-based internet survey panels can potentially provide survey responses from a sample representative of the general population at low cost, with the ability to deliver complex information treatments. Although panelists are recruited through random digit dialing, the multiple stages of panelist recruitment and retention raise concerns about the representativeness of survey responses. Relatively little is also known about how the mode of administration, such as taking a survey on a computer, affects survey response.

This research project will pursue answers to the following questions: (1) What nonresponse biases are introduced by mode-of-administration and sampling strategies? Specifically, do probability-based internet panels introduce more bias than other modes and sampling strategies? (2) After controlling for sample selection, do the mail and internet modes of administration affect the variable of interest (willingness to pay) and question response patterns? (3) Does the survey perform equally well for each mode and sampling strategy, as gauged by standard tests of validity?

To address these questions, five treatments will be used to administer a survey on willingness to pay for ecosystem improvements in the Southern Appalachian Mountains: four that cross two sampling strategies (panel, random digit dialing) with two survey modes (paper, computer), plus a fifth in which respondents recruited through random digit dialing may choose to take the survey by internet or by mail. If people who self-select into a mail (or internet) survey answer differently from people who are assigned to that mode, this would imply that respondent characteristics can bias willingness-to-pay estimates depending on mode. State-of-the-art techniques will be used to evaluate non-ignorable nonresponse (selection on unobservables), drawing on a rich variety of frame variables, including Census tract characteristics. The project will provide a greater understanding of the factors affecting response propensity and of whether response propensity is correlated with survey outcomes, focusing on the internet/panel combination.
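As one illustration of the kind of analysis such a design supports, the sketch below simulates a two-by-two mode-by-sampling experiment, fits a probit response-propensity model on frame variables and treatment cells, and checks whether estimated propensity predicts the outcome among respondents. All variable names, coefficients, and data are hypothetical, and this sketch handles only selection on observables; the project's techniques for non-ignorable nonresponse (selection on unobservables) are more involved.

```python
# Minimal sketch (not the project's actual analysis): simulate a 2x2
# mode-by-sampling design, fit a probit response-propensity model, and
# check whether estimated propensity correlates with the outcome (WTP).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 4000

# Hypothetical frame variables (stand-ins for Census tract characteristics).
age = rng.normal(45, 12, n)
urban = rng.binomial(1, 0.6, n)

# Treatment cells: sampling strategy (panel vs. RDD) crossed with mode.
panel = rng.binomial(1, 0.5, n)
internet = rng.binomial(1, 0.5, n)

# Response depends on observables and treatment (illustrative coefficients).
latent = -0.5 + 0.01 * age + 0.3 * urban - 0.2 * internet + 0.1 * panel
respond = (latent + rng.normal(size=n) > 0).astype(int)

# Stated WTP; here mode shifts responses directly (illustrative).
wtp = 50 + 5 * urban + 3 * internet + rng.normal(0, 10, n)

# (1) Response-propensity model on frame variables and treatment cells.
X = sm.add_constant(np.column_stack([age, urban, panel, internet]))
propensity = sm.Probit(respond, X).fit(disp=False)
phat = propensity.predict(X)

# (2) Among respondents, does propensity predict the outcome? A strong
# association suggests nonresponse bias in unweighted WTP estimates.
mask = respond == 1
Xo = sm.add_constant(np.column_stack([phat[mask], internet[mask]]))
outcome = sm.OLS(wtp[mask], Xo).fit()
print(outcome.params)
```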

The U.S. Office of Management and Budget has recently issued guidance requiring a nonresponse bias analysis for surveys expecting a response rate below 80 percent. The guidance does not specify how such analyses should be conducted, however, and there is relatively little survey methodology research on how to conduct them or on the factors that affect survey response. This project proposes and evaluates methods for testing the effects of nonresponse on sample representativeness. These issues are also of concern to the U.S. Environmental Protection Agency, which co-funded this project through a competitive grant from the National Center for Environmental Economics.
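As a hedged illustration (not a method prescribed by the OMB guidance or by this project), one common ingredient of a nonresponse bias analysis is to compare respondents with the full sample frame on frame variables such as Census tract characteristics. The sketch below computes standardized mean differences on simulated data; the variables and response pattern are hypothetical.

```python
# Minimal sketch of one common nonresponse bias check: compare respondents
# to the full sample frame on frame variables via standardized differences.
import numpy as np

def standardized_diff(frame_values, respondent_mask):
    """Standardized mean difference between respondents and the full frame."""
    full = frame_values
    resp = frame_values[respondent_mask]
    pooled_sd = np.sqrt((full.var(ddof=1) + resp.var(ddof=1)) / 2)
    return (resp.mean() - full.mean()) / pooled_sd

rng = np.random.default_rng(1)
n = 5000
median_income = rng.normal(60000, 15000, n)  # hypothetical tract variable
pct_college = rng.uniform(0.1, 0.6, n)       # hypothetical tract variable

# Illustrative response pattern correlated with education.
respond = rng.random(n) < (0.2 + 0.5 * pct_college)

for name, var in [("median_income", median_income),
                  ("pct_college", pct_college)]:
    print(f"{name}: std. diff = {standardized_diff(var, respond):.3f}")
```

Large standardized differences on variables like pct_college would flag dimensions on which respondents diverge from the frame; such checks detect only selection on observed frame variables.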

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Type: Standard Grant (Standard)
Application #: 0720101
Program Officer: Cheryl L. Eavey
Project Start:
Project End:
Budget Start: 2007-10-01
Budget End: 2012-09-30
Support Year:
Fiscal Year: 2007
Total Cost: $45,714
Indirect Cost:
Name: Resources for the Future Inc
Department:
Type:
DUNS #:
City: Washington
State: DC
Country: United States
Zip Code: 20036