Asking Probing Questions in Web Surveys

Cognitive interviewing is a well-established method for evaluating and improving a questionnaire before it is fielded. Its traditional implementation, however, faces challenges, notably small sample sizes and the possibility of interviewer effects. In this study, the authors test web surveys administered through nonprobability online panels as a supplemental means of implementing cognitive interviewing techniques in order to address these challenges. The article focuses on methodological features that pave the way for a successful implementation of category-selection probing in web surveys. The study reports results from 1,023 respondents in Germany. To identify implementation features that yield a high number of meaningful answers, the authors explore the effects of (1) different panels, (2) different probing variants, and (3) different numbers of preceding probes on answer quality. The overall results suggest that category-selection probing can indeed be implemented in web surveys. Using data from two panels (a community panel whose members can actively get involved, for example by creating their own polls, and a "conventional" panel where answering surveys is the members' only activity), the authors find that high community involvement does not increase the likelihood of answering probes or the length of the resulting statements. Testing three probing variants that differ in wording and in the context provided, the authors find that presenting the context of the probe (i.e., the probed item and the respondent's answer) produces a higher number of meaningful answers. Finally, the likelihood of answering a probe decreases with the number of preceding probes; however, among respondents who do answer, word count increases slightly as the number of preceding probes grows.
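As a rough illustration of the context-providing probing variant described above (not the authors' actual instrument), the following sketch shows how a web survey could assemble a category-selection probe that repeats the probed item and the respondent's own answer before asking for an explanation. The item wording, answer text, and function name are hypothetical; the variants without context would simply omit the item and answer from the probe wording.

```python
# Hypothetical sketch of a category-selection probe with context.
# The item text, answer category, and probe wording are illustrative only
# and are not taken from the study's questionnaire.

def build_probe_with_context(item_text: str, respondent_answer: str) -> str:
    """Assemble probe text that repeats the probed item and the respondent's answer."""
    return (
        f'You answered "{respondent_answer}" to the statement "{item_text}". '
        "Please explain why you selected this answer."
    )

if __name__ == "__main__":
    # Example: the probe would be shown on the page following the closed item.
    probe_text = build_probe_with_context(
        item_text="I am interested in politics.",
        respondent_answer="Somewhat agree",
    )
    print(probe_text)
```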
