Investigating recruitment and completion mode biases in online and door-to-door electronic surveys

Abstract: Electronically assisted survey techniques offer several advantages over traditional survey techniques, but they can also introduce biases, such as coverage bias and measurement error. The current study compares the relative merits of two survey distribution and completion modes: email recruitment with internet completion, and door-to-door recruitment with either tablet or internet completion. Presentation mode is held constant so that we can separate the effects of recruitment mode and completion mode on responses. Recruitment mode appeared to influence both response rates and which socio-demographic groups were represented, although the difference between the two recruitment modes was relatively small. Completion mode appeared to have little or no impact on responses, although it did influence completion times. The email-distributed survey performed better with regard to time costs and the number of respondents obtained. Overall, differences between the two survey modes appeared to be due largely to recruitment method rather than completion mode.
