Optimal Response Formats for Online Surveys: Branch, Grid, or Single Item?

This article reports the results of an experiment comparing branch, grid, and single-item question formats in an internet survey with a nationally representative probability sample. We compare the three formats in terms of administration time, item nonresponse, survey breakoff rates, response distributions, and criterion validity. On average, the grid format obtained the fastest answers, the single-item format was intermediate, and the branch format took the longest. Item nonresponse rates were lowest for the single-item format, intermediate for the grid, and highest for branching, but these differences were not statistically significant when the full experimental design was modeled. Survey breakoff rates across the formats were not statistically distinguishable. Criterion validity was weakest in the branching format, with no significant difference between the grid and single-item formats. This evidence indicates that the branching format is not well suited to internet data collection and that both single items and short, well-constructed grids are better question formats.
