Exploring the Effects of Removing “Too Fast” Responses and Respondents from Web Surveys

This paper addresses speeding, that is, “too fast” responses, in web surveys. Relying on the response process model, we argue that very short response times indicate low data quality, stemming from a lack of attention on the part of respondents. To identify speeding, prior research employed case-wise procedures. Using data from nine online surveys, we demonstrate that the response behavior of individual respondents varies considerably during a survey. Thus, we use case-wise and page-wise procedures to capture speeding behaviors that tap different, although related, phenomena. Moreover, page-specific speeding measures capture aspects of data quality that traditional quality measures do not cover. Employing both page-specific and case-wise speeding measures, we examine whether removing speeders makes a difference in substantive findings. The evidence indicates that removing “too fast” responses does not alter marginal distributions, irrespective of which speeder-correction technique is employed. Moreover, explanatory models yield, by and large, negligible coefficient differences (on average about one standard error). Only in exceptional cases do differences exceed two standard errors. Our findings suggest that speeding primarily adds some random noise to the data and attenuates correlations, if it makes a difference at all. The paper concludes by discussing implications and limitations.
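To make the distinction between page-wise and case-wise speeder identification concrete, the sketch below flags “too fast” responses from per-page response times. It is only an illustrative reading of the approach, not the authors' actual procedure: the threshold choices (30% of a page's median response time, and a respondent counted as a speeder if half of their pages are flagged) are hypothetical placeholders, and the data are made up.

```python
# Illustrative sketch (not the paper's exact procedure): flag "too fast"
# responses page-wise and respondents case-wise from per-page response times.
# Assumption: `times` is a respondents-by-pages table of seconds per page;
# the cutoffs below are hypothetical, not the ones used in the study.
import pandas as pd


def flag_speeding(times: pd.DataFrame,
                  page_fraction: float = 0.3,
                  case_share: float = 0.5) -> tuple[pd.DataFrame, pd.Series]:
    # Page-wise: a response is "too fast" if it falls below a fixed fraction
    # of that page's median response time across all respondents.
    page_thresholds = times.median(axis=0) * page_fraction
    page_flags = times.lt(page_thresholds, axis=1)

    # Case-wise: a respondent counts as a speeder if a large share of
    # their pages were answered too fast.
    case_flags = page_flags.mean(axis=1) >= case_share
    return page_flags, case_flags


# Example with made-up data: rows are respondents, columns are survey pages.
times = pd.DataFrame(
    {"page_1": [12.0, 3.1, 15.4], "page_2": [20.5, 4.0, 18.2]},
    index=["r1", "r2", "r3"],
)
page_flags, case_flags = flag_speeding(times)
print(page_flags)   # per-page "too fast" indicators
print(case_flags)   # respondent-level speeder indicator
```

Under this kind of setup, page-wise flags can be used to weight or exclude individual responses, while the case-wise flag mirrors the traditional approach of removing whole respondents.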
