Implementing the Interactive Response System in a High School Physics Context: Intervention and Reflections

The interactive response system (IRS) has been widely used to promote student learning since 2003. It is an electronic system connected to handheld devices that allows students to transmit their responses by pressing buttons, while allowing the teacher to monitor and track individual students' answers anonymously and statistically. However, limited research has examined the challenges teachers encounter when designing IRS-based questions, or the kinds of mediation that may help them develop quality questions. The purpose of this study is to address this research gap by investigating one high school teacher's IRS implementation, drawing on both the teacher's and the students' teaching and learning experiences, and by presenting an intervention to help the teacher develop higher quality IRS questions. High quality questions denote questions that help students engage in deeper thinking and eventually lead to a comprehensive understanding of the concepts learned. The data sources consist of tests, classroom observations, interviews, face-to-face meetings, and email correspondence. The findings disclose that enhancing the teacher's content knowledge and ability to recognize the students' learning pitfalls is the foundation for developing quality IRS questions. The collaboration established between the teacher and a university physics education expert appears to have effectively helped both participants gain insight into designing quality questions aimed at identifying the students' learning bottlenecks.
