Deep Knowledge Tracing with Transformers
[1] Alec Radford, et al. Language Models are Unsupervised Multitask Learners, 2019.
[2] Xiaolu Xiong, et al. Going Deeper with Deep Knowledge Tracing, 2016, EDM.
[3] Youngduck Choi, et al. Towards an Appropriate Query, Key, and Value Computation for Knowledge Tracing, 2020, L@S.
[4] Chris Piech, et al. Deep Knowledge Tracing, 2015, NIPS.
[5] Shalini Pandey and George Karypis. A Self-Attentive Model for Knowledge Tracing, 2019, EDM.
[6] Jacob Devlin, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[7] Koki Nagatani, et al. Augmenting Knowledge Tracing by Considering Forgetting Behavior, 2019, WWW.
[8] Ashish Vaswani, et al. Attention Is All You Need, 2017, NIPS.
[9] Youngduck Choi, et al. Assessment Modeling: Fundamental Pre-training Tasks for Interactive Educational Systems, 2020, arXiv.
[10] Mohammad Khajah, et al. How Deep is Knowledge Tracing?, 2016, EDM.
[11] Jiani Zhang, et al. Dynamic Key-Value Memory Networks for Knowledge Tracing, 2017, WWW.