Insufficient effort responding: examining an insidious confound in survey data.

Insufficient effort responding (IER; Huang, Curran, Keeney, Poposki, & DeShon, 2012) to surveys has largely been assumed to be a source of random measurement error that attenuates associations between substantive measures. The current article, however, illustrates how and when the presence of IER can produce a systematic bias that inflates observed correlations between substantive measures. Noting that inattentive responses as a whole generally congregate around the midpoint of a Likert scale, we propose that M_attentive, defined as the mean score of attentive respondents on a substantive measure, will be negatively related to IER's confounding effect on substantive measures (i.e., correlations between IER and a given substantive measure will become less positive [or more negative] as M_attentive increases). Results from a personality questionnaire (Study 1) and a simulation (Study 2) consistently support the hypothesized confounding influence of IER. Using an employee sample (Study 3), we demonstrated how IER can confound bivariate relationships between substantive measures. Together, these studies indicate that IER can inflate the strength of observed relationships when scale means depart from the scale midpoints, resulting in an inflated Type I error rate. This challenges the traditional view that IER attenuates observed bivariate correlations. These findings highlight situations where IER may be a methodological nuisance, while underscoring the need for survey administrators and researchers to deter and detect IER in surveys. The current article serves as a wake-up call for researchers and practitioners to more closely examine IER in their data.
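The mechanism can be illustrated with a minimal simulation in the spirit of Study 2 (the code below is an illustrative sketch, not the authors' simulation; all sample sizes, means, and standard deviations are assumptions). Two traits that are truly uncorrelated among attentive respondents, but whose means sit above the midpoint of a 1–5 Likert scale, acquire a spurious positive correlation once inattentive respondents clustered near the midpoint are mixed into the sample:

```python
import numpy as np

rng = np.random.default_rng(0)

n_attentive, n_ier = 1000, 300
midpoint = 3.0  # midpoint of a 1-5 Likert scale

# Attentive respondents: two independent traits (true r ~ 0),
# with scale means departing from the midpoint (M_attentive = 4.0)
x_att = np.clip(rng.normal(4.0, 0.5, n_attentive), 1, 5)
y_att = np.clip(rng.normal(4.0, 0.5, n_attentive), 1, 5)

# Inattentive (IER) respondents: responses scattered around the midpoint
x_ier = np.clip(rng.normal(midpoint, 1.0, n_ier), 1, 5)
y_ier = np.clip(rng.normal(midpoint, 1.0, n_ier), 1, 5)

x = np.concatenate([x_att, x_ier])
y = np.concatenate([y_att, y_ier])

r_attentive = np.corrcoef(x_att, y_att)[0, 1]  # near zero
r_combined = np.corrcoef(x, y)[0, 1]           # inflated, clearly positive

print(f"attentive-only r = {r_attentive:.3f}")
print(f"combined r       = {r_combined:.3f}")
```

The inflation arises purely from the mixture structure: the IER cluster at (3, 3) and the attentive cluster at (4, 4) form a diagonal pattern, so a nonzero correlation appears even though neither group shows one internally. If M_attentive were instead at the midpoint, the two clusters would overlap and no such bias would emerge.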
