A decision support prototype tool for predicting student performance in an ODL environment

Machine learning algorithms, fed with data sets that include attendance records, test scores and other student information, can provide tutors with powerful decision-making tools. Until now, much of the research has been limited to the relation between single variables and student performance; combining multiple variables as possible predictors of dropout has generally been overlooked. The aim of this work is to present a high-level architecture and a case study for a prototype machine learning tool that can automatically recognize dropout-prone students in university-level distance learning classes. Tracking student progress is a time-consuming job that such a tool can handle automatically. While tutors will still have an essential role in monitoring and evaluating student progress, the tool can compile the data required for reasonable and efficient monitoring. Moreover, the application of the tool is not restricted to identifying dropout-prone students: it can also be used to predict students' marks, to estimate how many students will submit a written assignment, and so on. It can also help tutors explore data and build models for prediction, forecasting and classification. Finally, the underlying architecture is independent of the data set and, as such, can be used to develop other similar tools.
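To make the classification idea concrete, the following is a minimal sketch, not the paper's actual implementation: a Naive Bayes classifier (one of the standard algorithm families for this kind of task) trained on hypothetical per-student records of assignment submissions and test outcomes, flagging students whose profile resembles past dropouts. All feature names and data values below are illustrative assumptions, not taken from the paper's data set.

```python
import math
from collections import defaultdict

def train_naive_bayes(records, labels):
    """Count class priors and per-class feature-value frequencies.

    records: list of dicts mapping categorical feature -> value
    labels:  parallel list of class labels, e.g. 'dropout' / 'complete'
    """
    priors = defaultdict(int)                       # label -> count
    counts = defaultdict(lambda: defaultdict(int))  # (label, feat) -> value -> count
    for rec, label in zip(records, labels):
        priors[label] += 1
        for feat, val in rec.items():
            counts[(label, feat)][val] += 1
    return priors, counts, len(labels)

def predict(model, rec):
    """Return the most probable label for one student record."""
    priors, counts, total = model
    best, best_score = None, float("-inf")
    for label, prior in priors.items():
        # Sum of log-probabilities with add-one (Laplace) smoothing.
        score = math.log(prior / total)
        for feat, val in rec.items():
            seen = counts[(label, feat)]
            score += math.log((seen.get(val, 0) + 1) / (prior + len(seen) + 1))
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical training data: two completers, two dropouts.
records = [
    {"assignment1": "submitted", "test1": "pass"},
    {"assignment1": "submitted", "test1": "fail"},
    {"assignment1": "missed",    "test1": "fail"},
    {"assignment1": "missed",    "test1": "fail"},
]
labels = ["complete", "complete", "dropout", "dropout"]

model = train_naive_bayes(records, labels)
print(predict(model, {"assignment1": "missed", "test1": "fail"}))  # -> dropout
```

A real tool of the kind the abstract describes would retrain such a model as each new batch of attendance and assignment data arrives, so the tutor sees an updated risk flag per student rather than raw logs.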
