Drop Downs and Scroll Mice: The Effect of Response Option Format and Input Mechanism Employed on Data Quality in Web Surveys

Online researchers face the choice of whether to use radio buttons or drop downs when presenting respondents with “select one answer from many” questions, yet the empirical evidence on response effects offers little direction for favoring one format over the other. Using data collected in a New Zealand general-population Web survey of 2,400 people, this study contributes to that decision by investigating format response effects at multiple levels and by exploring the potential for input mechanisms to interfere with drop-down answer selection. Format choice did not significantly affect survey completions, the number of nonsubstantive answers, or time to completion. However, drop downs led to higher item nonresponse and longer response times. Furthermore, the 76% of respondents who used scroll mice to complete the survey were prone to accidentally changing an answer when presented with drop-down questions. This increased average response times and skewed the distribution of responses to the drop-down treatment questions toward the bottom of the response list. Implications of these findings for Web-based data collection are discussed.
