Some Improvements of Deep Knowledge Tracing

Deep Knowledge Tracing (DKT), like other machine learning approaches, is biased toward the data used during training. When little training data is available, generalization suffers: models perform well on classes with many examples and poorly on those with few. These problems are frequent in educational data, where some skills are very difficult to master (floor effects) and others very easy (ceiling effects). There is little data on students who correctly answer questions tied to difficult knowledge, or who incorrectly answer questions tied to knowledge that is easy to master, so DKT cannot reliably predict students' answers to questions associated with those skills. To improve DKT, we penalize the model with a 'cost-sensitive' technique. To compensate for the scarcity of data, we propose a hybrid model that combines DKT with expert knowledge: DKT is coupled, through an attention mechanism, with a Bayesian Network built from domain expertise. The resulting model tracks students' knowledge in the Logic-Muse Intelligent Tutoring System (ITS) more accurately than BKT and the original DKT.
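The following is a minimal PyTorch sketch of the two ideas in the abstract, not the authors' released code: a cost-sensitive binary cross-entropy that up-weights under-represented (skill, outcome) pairs, and an attention layer that blends DKT predictions with mastery estimates from an expert-built Bayesian network. All names, shapes, and hyper-parameters (NUM_SKILLS, HIDDEN_SIZE, bn_mastery, the weight vectors) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_SKILLS = 20    # assumed number of reasoning skills tracked in Logic-Muse
HIDDEN_SIZE = 64   # assumed LSTM hidden size


class HybridDKT(nn.Module):
    """DKT core plus an attention blend with Bayesian-network mastery estimates."""

    def __init__(self, num_skills=NUM_SKILLS, hidden_size=HIDDEN_SIZE):
        super().__init__()
        # Standard DKT: one-hot (skill, correctness) input -> LSTM -> per-skill probability.
        self.lstm = nn.LSTM(input_size=2 * num_skills, hidden_size=hidden_size,
                            batch_first=True)
        self.out = nn.Linear(hidden_size, num_skills)
        # Attention over the two knowledge sources (DKT vs. expert Bayesian network).
        self.attn = nn.Linear(hidden_size + num_skills, 2)

    def forward(self, x, bn_mastery):
        # x:          (batch, time, 2 * num_skills) one-hot encoded interactions
        # bn_mastery: (batch, time, num_skills) mastery probabilities inferred by the
        #             expert-built Bayesian network (assumed to be precomputed)
        h, _ = self.lstm(x)
        p_dkt = torch.sigmoid(self.out(h))
        alpha = F.softmax(self.attn(torch.cat([h, bn_mastery], dim=-1)), dim=-1)
        # Convex combination of the two predictions, one weight pair per time step.
        return alpha[..., 0:1] * p_dkt + alpha[..., 1:2] * bn_mastery


def cost_sensitive_bce(pred, target, mask, pos_weight, neg_weight):
    # Cost-sensitive binary cross-entropy: rare (skill, outcome) pairs are penalized more.
    # target: float 0/1 tensor of shape (batch, time, num_skills); mask zeros out padding.
    # pos_weight / neg_weight: (num_skills,) tensors, e.g. inverse outcome frequencies,
    # so correct answers on hard skills and wrong answers on easy skills weigh more.
    weights = target * pos_weight + (1.0 - target) * neg_weight
    loss = F.binary_cross_entropy(pred, target, weight=weights, reduction="none")
    return (loss * mask).sum() / mask.sum()
```

A usage sketch under these assumptions: precompute bn_mastery per student sequence with the expert Bayesian network, run HybridDKT on the interaction sequences, and train with cost_sensitive_bce using per-skill weights derived from class frequencies.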
