Stakeholder Input and Test Design: A Case Study on Changing the Interlocutor Familiarity Facet of the Group Oral Discussion Test

Test takers should have a voice in testing practices (Mathew, 2004). When their input is incorporated, however, systematic processes must be followed to ensure that the validity of the testing practices is preserved. Such processes make test development more democratic (Shohamy, 2001) without sacrificing the value of the inferences drawn from test scores. This article describes a case study in which test takers' preferences led to a change in the procedures of a group oral discussion test in a university English as a Foreign Language program. A study was designed to determine the extent to which the proposed change would threaten the validity of the testing process. Specifically, the procedures for the group oral were altered to investigate the effect of interlocutor familiarity: students were randomly assigned to class-familiar (n = 146) and class-unfamiliar (n = 159) groups to identify the extent to which group familiarity affected test takers' scores in the four assessed categories of pronunciation, fluency, lexis and grammar, and communication skills. No statistically significant difference in scores was found between the two groups, and score reliability estimates were similar. The implications of the findings are discussed in terms of recommendations for using stakeholder input in the assessment design process.
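The analysis described above compares scores between two independently sampled groups and examines score reliability. The abstract does not specify the statistical procedures used, so the following is only an illustrative sketch: it simulates rating data on an assumed 1-5 scale with the study's group sizes, runs Welch's t-test (which does not assume equal group variances), and computes a Cronbach's alpha reliability estimate across the four rating categories. All data, scale choices, and the alpha procedure are assumptions for illustration, not the study's actual method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated ratings (assumed 1-5 scale) for two independent groups,
# mirroring the study's group sizes: class-familiar (n = 146)
# and class-unfamiliar (n = 159). Real data would come from raters.
familiar = rng.normal(loc=3.4, scale=0.6, size=146).clip(1, 5)
unfamiliar = rng.normal(loc=3.4, scale=0.6, size=159).clip(1, 5)

# Welch's t-test: compares group means without assuming equal
# variances (an illustrative choice, not necessarily the study's).
t_stat, p_value = stats.ttest_ind(familiar, unfamiliar, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal-consistency reliability across rating categories.

    item_scores: shape (n_test_takers, n_categories).
    """
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    k = item_scores.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Four correlated category scores per test taker (pronunciation,
# fluency, lexis and grammar, communication skills) -- simulated.
base = rng.normal(3.4, 0.5, size=(146, 1))
categories = (base + rng.normal(0, 0.3, size=(146, 4))).clip(1, 5)
print(f"alpha = {cronbach_alpha(categories):.2f}")
```

A non-significant p-value in such a comparison would parallel the study's finding that familiarity did not significantly affect scores, and comparable alpha values computed separately for each group would parallel the similar reliability estimates reported.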

[1] Zhengdong Gan et al. Interaction in group oral assessment: A case study of higher- and lower-scoring students, 2010.

[2] J. P. Morgan et al. Design and Analysis: A Researcher's Handbook, 2005, Technometrics.

[3] Dorry M. Kenyon et al. The Rating of Direct and Semi-Direct Oral Proficiency Interviews: Comparing Performance at Lower Proficiency Levels, 2000.

[4] Rama Mathew. Stakeholder Involvement in Language Assessment: Does It Improve Ethicality?, 2004.

[5] Richard Young & Agnes Weiyun He (Eds.). Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency. Amsterdam: Benjamins, 1998. (Reviewed by Jeff Connor-Linton, 2000, Studies in Second Language Acquisition.)

[6] E. Shohamy. Affective Considerations in Language Testing, 1982.

[7] Gary J. Ockey et al. A many-facet Rasch analysis of the second language group oral discussion task, 2003.

[8] Jo A. Lewkowicz. Authenticity in language testing: some outstanding questions, 2000.

[9] Steven J. Ross et al. The Discourse of Accommodation in Oral Proficiency Interviews, 1992, Studies in Second Language Acquisition.

[10] G. Ockey. The effects of group members' personalities on a test taker's L2 group oral discussion test scores, 2009.

[11] Lyle F. Bachman et al. Language Testing in Practice, 1998.

[12] Elana Shohamy et al. From Testing Research to Educational Policy: Introducing a New Comprehensive Test of Oral Proficiency, 1986.

[13] Annie Brown et al. Interviewer variation and the co-construction of speaking proficiency, 2003.

[14] Noriko Iwashita et al. Estimating the difficulty of oral proficiency tasks: what does the test-taker have to offer?, 2002.

[15] Marjorie C. Kirkland. The Effects of Tests on Students and Schools, 1971.

[16] Anne Lazaraton et al. Interlocutor support in oral proficiency interviews: the case of CASE, 1996.

[17] J. Norris. Validity Evaluation in Language Assessment, 2008.

[18] A. Brown. The role of test-taker feedback in the test development process: test-takers' reactions to a tape-mediated test of proficiency in spoken Japanese, 1993.

[19] T. Haladyna et al. Construct-Irrelevant Variance in High-Stakes Testing, 2005.

[20] C. Scott. Stakeholder perceptions of test impact, 2007.

[21] Guoxing Yu et al. Students' voices in the evaluation of their written summaries: Empowerment and democracy for test takers?, 2007.

[22] Lyle F. Bachman et al. Language Assessment in Practice: Developing Language Assessments and Justifying Their Use in the Real World, 2010.

[23] Andrea Tyler et al. Re-analyzing the OPI: How Much Does It Look Like Natural Conversation?, 1998.

[24] P. Rea-Dickins. So, why do we need relationships with stakeholders in language testing? A view from the UK, 1997.

[25] J. Algina et al. Univariate and Multivariate Omnibus Hypothesis Tests Selected to Control Type I Error Rates When Population Variances Are Not Necessarily Equal, 1996.

[26] Liying Cheng et al. Voices From Test-Takers: Further Evidence for Language Assessment Validation and Use, 2011.

[27] E. Shohamy. Democratic assessment as an alternative, 2001.

[28] Charles W. Stansfield. A Comparative Analysis of Simulated and Direct Oral Proficiency Interviews, 1990.

[29] Leo Van Lier. Reeling, Writhing, Drawling, Stretching, and Fainting in Coils: Oral Proficiency Interviews as Conversation, 1989.

[30] Larry Davis et al. The influence of interlocutor proficiency in a paired oral assessment, 2009.

[31] S. Puntanen et al. A study of the statistical foundations of group conversation tests in spoken English, 1983.

[32] Constant Leung et al. Expanding Horizons and Unresolved Conundrums: Language Testing and Assessment, 2006.

[33] James Dean Brown et al. Testing in Language Programs: A Comprehensive Guide to English Language Assessment, 2005.

[34] Judit Kormos. Simulating conversations in oral-proficiency assessment: a conversation analysis of role plays and non-scripted interviews in language exams, 1999.

[35] Evelina D. Galaczi et al. Peer–Peer Interaction in a Speaking Test: The Case of the First Certificate in English Examination, 2008.

[36] Barry O'Sullivan et al. Learner acquaintanceship and oral proficiency test pair-task performance, 2002.

[37] Lindsay Brooks et al. Interacting in pairs in a test of oral proficiency: Co-constructing a better performance, 2009.

[38] A. V. Moere. Validity Evidence in a University Group Oral Test, 2006.

[39] Volker Hegelheimer et al. Validation of a web-based ESL test, 2003.