A quasi-experimental assessment of interactive student response systems on student confidence, effort, and course performance

Abstract The interactive student response system (SRS), commonly referred to as ‘clickers,’ is an alternative learning method with the potential to improve student course (i.e., quiz/examination) performance. Prior SRS studies, both within accounting and in other academic disciplines, have reported conflicting results regarding its influence on student course performance. This quasi-experimental study re-examines the relationship between the use of an SRS and course performance. We also investigate how using an SRS influences student confidence and time spent studying outside of class. Unlike prior SRS-related studies, we tested both our SRS class and our control class (with no SRS) in the same academic semester with the same instructor to provide a higher degree of experimental control. In doing so, we compared the benefit of the immediate feedback provided by an SRS with the delayed feedback of traditional assessment formats. Students using the SRS scored higher on in-class multiple-choice quiz items than those who did not; however, no significant differences in examination performance or overall course performance were found between the two groups. Students using the SRS also reported greater confidence in their abilities and spent less time preparing for the course outside of class, while maintaining overall course performance comparable to that of students who did not use the SRS. We conclude by identifying meaningful areas for future research related to the use of SRS.
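For illustration, the between-group comparisons summarized above could be carried out along the following lines. This is a minimal sketch only: the abstract does not specify which statistical tests were used, so the choice of a nonparametric Mann-Whitney U test and the variable names (srs_quiz_scores, control_quiz_scores) are assumptions for the example, not the authors' stated procedure.

    # Minimal sketch of a two-group comparison of quiz scores (illustrative only;
    # the test choice and the data are assumptions, not the study's actual analysis).
    from scipy.stats import mannwhitneyu

    # Hypothetical per-student quiz scores (percentage correct) for each group.
    srs_quiz_scores = [82, 75, 90, 68, 88, 79, 85, 73, 91, 77]
    control_quiz_scores = [70, 65, 84, 60, 78, 72, 69, 74, 66, 71]

    # Nonparametric test of whether the two groups differ in quiz performance.
    stat, p_value = mannwhitneyu(srs_quiz_scores, control_quiz_scores,
                                 alternative="two-sided")
    print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.3f}")

The same comparison could be repeated for examination scores, overall course grades, self-reported confidence, and out-of-class study time, which is the pattern of analysis the abstract describes.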
