Abstract: Communicative approaches to teaching language have emphasized the centrality of oral proficiency in the language acquisition process, but research investigating oral proficiency has been surprisingly limited, yielding an incomplete understanding of spoken language development. This study investigated the development of spoken language at the high school level over five consecutive years, involving more than 1,500 students representing 23 school districts. Quantitative Standards-Based Measurement of Proficiency speaking scores and student-produced qualitative spoken samples (n > 6,000 samples) contributed to an understanding of the development of spoken language. Hierarchical linear modeling (HLM) revealed a consistent growth trajectory of spoken language development, and results indicated that 18.30% of the variance in student outcomes may be attributed to the teacher variable.

Key words: classroom-based research, longitudinal study, mixed methods, oral language development

The rise of communicative language learning has led to widespread acceptance of communicative competence as a primary goal of language education and, as such, central to good classroom practice (Savignon, 1997). This approach to language instruction emphasizes the ability to communicate in a second language in real-life situations both inside and beyond the classroom. Instead of measuring language learning in terms of seat time, test scores, or number of credit hours, this approach asks students to demonstrate communicative skills through task-based activities. As a result of this emphasis on oral communication, proficiency has emerged as central to communicative language learning and teaching. However, there is little classroom-level research revealing what students are able to do with oral language after one, two, three, and four years of language study, and a particular paucity of research on the development of spoken language at the secondary level (Tschirner & Heilenman, 1998). Although several studies have offered a glimpse of classroom-based proficiency ratings for the high school language learner (for examples, see Glisan & Foltz, 1998; Huebner & Jensen, 1992; Moeller & Reschke, 1993; Steinmeyer, 1984), the data have been strictly quantitative, and the studies have been conducted within educational systems with no consideration of related and potentially confounding factors, such as the teacher, in the exploration of oral language production.

Due to such limitations, as well as substantial differences in results, particularly at the beginning levels of language learning in secondary classrooms, this study explored students' progress toward proficiency over a period of five years using a combination of qualitative methods, including thematic coding and organization, to reveal overarching trends in spoken language, and quantitative methods, including the Standards-Based Measurement of Proficiency (STAMP) test, a teacher-independent, computer-mediated measure of oral language proficiency.1 Purposefully integrating mixed methods offers "a very powerful mix" (Miles & Huberman, 1994, p. 42) that develops "a complex" picture of oral language development (Greene & Caracelli, 1997, p. 7). In choosing this integrative data design, the researchers' purpose was one of complementarity, a design element used to measure overlapping, but distinct, facets of a phenomenon under investigation (Caracelli & Greene, 1993).
Results from one method (in this case, qualitative data) were used to enhance, illustrate, or clarify results from the other method (in this case, quantitative data) (Greene & McClintock, 1985).

Quantitative research questions for this study investigated the growth trajectory of spoken Spanish over four consecutive years of high school study. Quantitative questions also examined the variance in spoken production scores that was attributable to teacher differences or to individual student differences. …
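To make the variance attribution concrete, the sketch below shows how a growth model of the general kind described above might be specified: repeated speaking scores nested within students, and students nested within teachers. This is a minimal illustration using Python's statsmodels with hypothetical column names (score, year, student, teacher) and a hypothetical data file; it is not a reproduction of the study's actual HLM specification or data.

```python
# Minimal sketch of a growth-curve HLM with a teacher-level variance share,
# assuming a long-format data set with one row per student per year.
# All column names and the file name are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stamp_scores.csv")  # hypothetical: score, year, student, teacher

# Random intercept for teacher (grouping factor) plus a variance component
# for students nested within teachers; 'year' models the growth trajectory.
model = smf.mixedlm(
    "score ~ year",
    data=df,
    groups="teacher",
    re_formula="~1",
    vc_formula={"student": "0 + C(student)"},
)
result = model.fit()
print(result.summary())

# Proportion of variance attributable to the teacher level (an intraclass
# correlation), analogous in spirit to the 18.30% reported in the abstract.
teacher_var = result.cov_re.iloc[0, 0]   # teacher random-intercept variance
student_var = result.vcomp[0]            # student variance component
residual_var = result.scale              # within-student residual variance
icc_teacher = teacher_var / (teacher_var + student_var + residual_var)
print(f"Teacher-level share of variance: {icc_teacher:.2%}")
```

In this parameterization, the teacher share is simply the teacher random-intercept variance divided by the sum of the teacher, student, and residual variances; a more faithful replication would follow whatever covariates and random slopes the original HLM included.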
[1] Jennifer Caroline Greene, et al. Data Analysis Strategies for Mixed-Method Evaluation Designs, 1993.
[2] Andrew D. Cohen, et al. Strategies in Learning and Using a Second Language, 1998.
[3] Eileen W. Glisan, et al. Guest Editors' Message, 2012.
[4] S. Savignon. Communicative Competence: Theory and Classroom Practice, 1997.
[5] Antony John Kunnan, et al. Diagnostic Feedback in Language Assessment, 2009.
[6] Isabelle M. Kaplan. Oral Proficiency Testing and the Language Curriculum: Two Experiments in Curricular Design for Conversation Courses, 1984.
[7] J. Sanford Dugan. Standardized Tests as an Alternative to the Oral Interview, 1988.
[8] Michael H. Long. Methodological Principles for Language Teaching, 2009.
[9] Jennifer Caroline Greene, et al. Defining and Describing the Paradigm Issue in Mixed-Method Evaluation, 1997.
[10] J. Norris, et al. Exploring the Uses and Usefulness of ACTFL Oral Proficiency Ratings and Standards in College Foreign Language Departments, 2003.
[11] John W. Creswell, et al. Designing and Conducting Mixed Methods Research, 2006.
[12] Jennifer Caroline Greene, et al. Triangulation in Evaluation, 1985.
[13] Irene Thompson, et al. Assessing Foreign Language Skills: Data from Russian, 1996.
[14] Barbara F. Freed. Preliminary Impressions of the Effects of a Proficiency-Based Language Requirement, 1987.
[15] Lina Lee. Digital News Stories: Building Language Learners' Content Knowledge and Speaking Skills, 2014.
[16] Eileen W. Glisan, et al. Assessing Students' Oral Proficiency in an Outcome-Based Curriculum: Student Performance and Teacher Intuitions, 1998.
[17] M. Sandelowski, et al. On Quantitizing, Journal of Mixed Methods Research, 2009.
[18] J. Charles Alderson, et al. Language Test Construction and Evaluation, 1995.
[19] Tracey M. Derwing, et al. Oral Fluency: The Neglected Component in the Communicative Language Classroom, 2010.
[20] 周彬彬, et al. Interlanguage: Forty Years Later, 2014.
[21] Andrew D. Cohen, et al. The Impact of Strategies-Based Instruction on Speaking a Foreign Language. Research Report, 1995.
[22] A. Onwuegbuzie, et al. Toward a Definition of Mixed Methods Research, 2007.
[23] Sally Sieloff Magnan. Assessing Speaking Proficiency in the Undergraduate Curriculum: Data from French, 1986.
[24] Erwin Tschirner, et al. Reasonable Expectations: Oral Proficiency Goals for Intermediate-Level Students of German, 1998.
[25] Matthew B. Miles, et al. Qualitative Data Analysis: An Expanded Sourcebook, 1994.
[26] Brian K. Lynch, et al. Investigating Variability in Tasks and Rater Judgements in a Performance Test of Foreign Language Speaking, 1995.
[27] Aleidine J. Moeller, et al. A Second Look at Grading and Classroom Performance: Report of a Research Study, 1993.
[28] Paul Whitney, et al. Developing L2 Oral Proficiency Through Synchronous CMC: Output, Working Memory, and Interlanguage Development, 2013.
[29] T. Huebner, et al. A Study of Foreign Language Proficiency-Based Testing in Secondary Schools, 1992.