The potential for student performance prediction in small cohorts with minimal available attributes

The measurement of student performance during progress through university study provides academic leadership with critical information on each student's likelihood of success. Academics have traditionally used their interactions with individual students, through class activities and interim assessments, to identify those 'at risk' of failure or withdrawal. However, modern university environments, with their easy online availability of course material, may see reduced lecture and tutorial attendance, making such identification more challenging. Modern data mining and machine learning techniques provide increasingly accurate predictions of student examination marks, but these approaches have focused on large student populations and wide ranges of data attributes per student. Many university modules, by contrast, comprise relatively small student cohorts, with institutional protocols limiting the student attributes available for analysis, and very little research attention has been devoted to prediction in this setting. We describe an experiment conducted on a final-year university module cohort of 23 students, where the individual student data are limited to lecture/tutorial attendance, virtual learning environment (VLE) accesses and intermediate assessments. We found potential for predicting individual students' interim and final assessment marks in small cohorts with very limited attributes, and that these predictions could usefully support module leaders in identifying students potentially 'at risk'.
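The setting described above (a cohort of 23, with only attendance, VLE accesses and interim marks as predictors) can be sketched in code. The abstract does not name the prediction method used, so the following is a minimal illustration only, assuming a k-nearest-neighbours regressor evaluated with leave-one-out cross-validation, a common choice for very small samples; the feature names and the data-generating process are invented, synthetic stand-ins for the real cohort data.

```python
import numpy as np

# Synthetic stand-in for the cohort: 23 students, three limited attributes
# (attendance fraction, VLE access count, interim assessment mark).
rng = np.random.default_rng(0)
n = 23  # cohort size from the abstract
attendance = rng.uniform(0.3, 1.0, n)                  # fraction of sessions attended
vle = rng.poisson(40, n).astype(float)                 # VLE access counts
interim = 40 + 40 * attendance + rng.normal(0, 5, n)   # interim assessment mark
X = np.column_stack([attendance, vle, interim])
y = 35 + 45 * attendance + 0.1 * vle + rng.normal(0, 5, n)  # final mark (synthetic)

def loo_knn_mae(X, y, k=3):
    """Leave-one-out k-NN regression: hold out each student in turn,
    standardise features on the remaining n-1, predict the held-out
    final mark as the mean of the k nearest neighbours' marks, and
    return the mean absolute error over all folds."""
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        mu, sd = X[mask].mean(axis=0), X[mask].std(axis=0)
        Xt, xq = (X[mask] - mu) / sd, (X[i] - mu) / sd
        dist = np.linalg.norm(Xt - xq, axis=1)
        nn = np.argsort(dist)[:k]
        errs.append(abs(y[mask][nn].mean() - y[i]))
    return float(np.mean(errs))

print(f"Leave-one-out MAE on synthetic cohort: {loo_knn_mae(X, y):.1f} marks")
```

Leave-one-out cross-validation is attractive here precisely because of the small cohort: with n = 23, every student serves once as the held-out test case, so no data are wasted on a fixed train/test split.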
