A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe

Inferential statistics teach us that a random probability sample is needed to draw inferences from a sample to the general population. In online survey research, however, volunteer access panels, in which respondents self-select into the sample, dominate the landscape. Such panels are attractive because of their low costs. In recent years, however, debates about the quality of such panels, in particular about errors in representativeness and measurement, have intensified. In this article, we describe four probability-based online and mixed-mode panels for the general population: the Longitudinal Internet Studies for the Social Sciences (LISS) Panel in the Netherlands, the German Internet Panel (GIP) and the GESIS Panel in Germany, and the Longitudinal Study by Internet for the Social Sciences (ELIPSS) Panel in France. We compare them in terms of sampling strategies, offline recruitment procedures, and panel characteristics. Our aim is threefold: to give the scientific community an overview of the availability of such data sources, to demonstrate to practitioners potential strategies for recruiting and maintaining probability-based online panels, and to direct analysts of the comparative data collected across these panels to methodological differences that may affect comparative estimates.
