Unpacking and repacking the factors affecting students' perceptions of the use of classroom communication systems (CCS) technology

This exploratory study investigated the relationships between students' perceptions of their classroom experiences and the instructional and contextual factors involved in the use of Classroom Communication Systems (CCS) technology. A mixed methods approach was employed to examine these relationships using data collected from 931 students enrolled in one public university. Thematic analysis explored students' perceptions of the use of CCS. Three logit models with sound predictive ability and model fit were established using sequential logistic regression. These models identified crucial instructional and contextual factors and examined the degree to which each of these factors was associated with students' perceptions of their classroom experiences with CCS. This study found that positive student perceptions of the classroom experience with CCS were closely associated with the use of specific types of questions, formative feedback and assessment approaches, and the pedagogical training of the classroom instructors. These findings elucidate specific aspects of CCS use that are related to how students perceive the effectiveness of this technology, and may ultimately assist instructional designers and faculty developers in designing and implementing CCS to enhance students' classroom experiences.

Highlights:
- A mixed methods design embedding thematic and logistic regression analyses was used.
- Instructors' pedagogical training was identified as the most important factor.
- Instructors' clicker questions were related to students' positive experiences.
- Clickers' reliability and affordability were identified as negative factors.