Open-ended versus Closed Probes: Assessing Different Formats of Web Probing

The method of web probing integrates cognitive interviewing techniques into web surveys and is increasingly used to evaluate survey questions. In a typical web probing scenario, probes are administered immediately after the question being tested (concurrent probing), usually in an open-ended format. Alternatively, probes can be administered in a closed format, with the response categories for the closed probes developed from previously conducted qualitative cognitive interviews. Closed probes offer several benefits, such as reduced costs and greater time efficiency, because they do not require manual coding of open-ended responses. In this article, we investigate whether closed probes yield insights into item functioning comparable to those gained from open-ended probes, and whether closed probes are equally suitable for capturing the cognitive processes that open-ended probes are traditionally intended to elicit. The findings reveal statistically significant differences with regard to the variety of themes, the patterns of interpretation, the number of themes per respondent, and nonresponse. No differences in the number of themes across formats were found by sex or educational level.