The Implications of Alternative Allocation Criteria in Adaptive Design for Panel Surveys

Abstract: Adaptive survey designs can be used to allocate sample elements to alternative data collection protocols in order to achieve a desired balance between some quality measure and survey costs. We compare four alternative methods for allocating sample elements to one of two data collection protocols. The methods differ in the quality measure that they aim to optimize: response rate, R-indicator, coefficient of variation of the participation propensities, or effective sample size. Costs are also compared for a range of sample sizes. The data collection protocols considered are CAPI single-mode and web-CAPI sequential mixed-mode. We use data from a large experiment with random allocation to one of these two protocols. For each allocation method we predict outcomes in terms of several quality measures and costs. Although allocating the whole sample to single-mode CAPI produces a higher response rate than allocating the whole sample to the mixed-mode protocol, we find that two of the targeted allocations achieve a better response rate than single-mode CAPI at a lower cost. We also find that all four of the targeted designs outperform both single-protocol designs in terms of representativity and effective sample size. For all but the smallest sample sizes, the adaptive designs bring cost savings relative to CAPI-only, though these savings are fairly modest in magnitude.
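The three propensity-based quality measures named in the abstract can be illustrated with a short sketch. This is not the authors' code: it uses the standard definitions of the R-indicator (one minus twice the standard deviation of the response propensities), the coefficient of variation of the propensities, and a common approximation of effective sample size under inverse-propensity weighting. The propensity values are invented for illustration only.

```python
import numpy as np

def r_indicator(rho):
    """R-indicator: R = 1 - 2 * S(rho), where S is the sample
    standard deviation of the estimated response propensities.
    Values near 1 indicate a more representative response."""
    return 1.0 - 2.0 * np.std(rho, ddof=1)

def cv_propensities(rho):
    """Coefficient of variation of the propensities: S(rho) / mean(rho).
    Lower values indicate less variation in response behaviour."""
    return np.std(rho, ddof=1) / np.mean(rho)

def effective_sample_size(n_respondents, rho):
    """Approximate effective sample size under inverse-propensity
    weighting: n_eff ~ r / (1 + CV^2), a common rule of thumb."""
    cv = cv_propensities(rho)
    return n_respondents / (1.0 + cv ** 2)

# Illustrative propensities for six sample elements (hypothetical values).
rho = np.array([0.55, 0.60, 0.70, 0.65, 0.50, 0.75])
print(round(r_indicator(rho), 3))        # R-indicator
print(round(cv_propensities(rho), 3))    # CV of propensities
print(round(effective_sample_size(6, rho), 2))
```

An adaptive design in the spirit of the paper would estimate such propensities per sample element under each protocol, then allocate elements so as to optimize one of these measures subject to a cost constraint.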
