Response Style Contamination of Student Evaluation Data

Student evaluation surveys provide instructors with feedback on development opportunities, and they form the basis of promotion and tenure decisions. Student evaluations have been extensively studied, but one hitherto neglected dimension is the measurement aspect itself: which questions to ask, how to ask them, and what answer options to offer students to obtain the most valid results. This study investigates whether cross-cultural response styles affect the validity of student evaluations. If they do, then the student mix in a class can affect an instructor's evaluation, potentially producing biased feedback and prompting inappropriate decisions by university committees. This article discusses two main response styles, uses simulated data to demonstrate the nature of the bias they can cause in student evaluation surveys, and illustrates three cases based on real student evaluation data in which marketing instructors' teaching quality assessments may be heavily biased by response styles. The authors propose a simple method to check for response style contamination in student evaluation data and discuss some practical implications.
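The mechanism described above can be illustrated with a small simulation. This is a hypothetical sketch, not the authors' actual method or data: two classes share identical latent opinions of an instructor, but one class has a higher proportion of extreme responders (students who push answers to the scale endpoints). The resulting mean evaluations diverge, and a simple contamination check (the share of endpoint answers, one common extreme-response-style index) flags the difference. All names and parameter values are invented for illustration.

```python
import random

random.seed(42)

def rate(latent, extreme_prob):
    """Map a latent opinion (1-5 Likert) to a reported rating.

    With probability `extreme_prob` the student exhibits extreme
    response style and pushes the answer to the nearest endpoint.
    """
    if random.random() < extreme_prob:
        return 5 if latent >= 3 else 1
    return latent

# Two classes with identical true opinions of the same instructor,
# but a different cultural mix and hence different ERS prevalence.
true_opinions = [3, 4, 4, 3, 4] * 40
class_a = [rate(x, extreme_prob=0.1) for x in true_opinions]  # few extreme responders
class_b = [rate(x, extreme_prob=0.6) for x in true_opinions]  # many extreme responders

mean_a = sum(class_a) / len(class_a)
mean_b = sum(class_b) / len(class_b)

# Simple contamination check: proportion of endpoint (1 or 5) answers.
ers_a = sum(r in (1, 5) for r in class_a) / len(class_a)
ers_b = sum(r in (1, 5) for r in class_b) / len(class_b)

print(f"class A: mean={mean_a:.2f}, endpoint share={ers_a:.2f}")
print(f"class B: mean={mean_b:.2f}, endpoint share={ers_b:.2f}")
```

Because the latent opinions are mostly above the scale midpoint, extreme responding inflates class B's mean even though the underlying teaching quality is identical, which is exactly the kind of bias a promotion committee comparing raw means would miss. Comparing endpoint shares across classes is one cheap way to spot such contamination before interpreting the means.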
