Multi-factor Memory Attentive Model for Knowledge Tracing

Congjie Liu, Xiaoguang Li
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:856-869, 2021.

Abstract

Traditional neural-network-based knowledge tracing typically embeds the required information and predicts knowledge proficiency from these embeddings. However, such methods consider only limited information, such as the concepts associated with each exercise. In this paper, we propose a multi-factor memory attentive model for knowledge tracing (MMAKT). Following the Neural Cognitive Diagnosis (NeuralCD) framework, MMAKT constructs interaction vectors from the factors of knowledge concept relevancy, the difficulty of each concept, the discrimination among exercises, and the student's proficiency. Moreover, to achieve more accurate prediction, MMAKT introduces an attention mechanism to strengthen the modeling of historical relationships between interactions. Experiments on real-world datasets show that MMAKT outperforms state-of-the-art approaches in knowledge tracing and prediction.
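The page does not include code, but the abstract's description suggests the rough PyTorch sketch below: NeuralCD-style factors (concept relevancy, per-concept difficulty, exercise discrimination, and student proficiency) are combined into interaction vectors, and a causal self-attention layer relates each prediction to the student's earlier interactions. All module names, dimensions, and the exact way the factors are combined are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class MMAKTSketch(nn.Module):
    """Illustrative sketch (not the authors' code): NeuralCD-style
    interaction factors plus attention over the interaction history."""

    def __init__(self, n_exercises: int, n_concepts: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Exercise-to-concept relevancy (often given as a Q-matrix), concept
        # difficulty per exercise, and a scalar discrimination per exercise.
        self.concept_relevancy = nn.Embedding(n_exercises, n_concepts)
        self.concept_difficulty = nn.Embedding(n_exercises, n_concepts)
        self.discrimination = nn.Embedding(n_exercises, 1)
        # Crude global proficiency placeholder; the real model traces it over time.
        self.proficiency = nn.Parameter(torch.zeros(n_concepts))
        # Project past interactions (factors + response) and current exercise factors.
        self.interaction_proj = nn.Linear(n_concepts + 1, d_model)
        self.query_proj = nn.Linear(n_concepts, d_model)
        # Attention over the sequence of past interactions.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.out = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())

    def forward(self, exercise_ids: torch.Tensor, responses: torch.Tensor) -> torch.Tensor:
        # exercise_ids, responses: (batch, seq_len); responses are 0/1 correctness labels.
        q = torch.sigmoid(self.concept_relevancy(exercise_ids))      # relevancy in [0, 1]
        diff = torch.sigmoid(self.concept_difficulty(exercise_ids))  # difficulty in [0, 1]
        disc = torch.sigmoid(self.discrimination(exercise_ids))      # discrimination
        # NeuralCD-style factor: relevancy * (proficiency - difficulty) * discrimination.
        factor = q * (torch.sigmoid(self.proficiency) - diff) * disc
        # Encode each interaction (factors + observed response).
        interaction = self.interaction_proj(
            torch.cat([factor, responses.unsqueeze(-1).float()], dim=-1))
        # Shift right by one step so position t holds interaction t-1 (no label leakage).
        past = torch.cat([torch.zeros_like(interaction[:, :1]), interaction[:, :-1]], dim=1)
        # Query from the current exercise's factors (its response is unknown at prediction time).
        query = self.query_proj(factor)
        seq_len = exercise_ids.size(1)
        # Causal mask: step t attends only to shifted positions 0..t, i.e. earlier interactions.
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool,
                                     device=exercise_ids.device), diagonal=1)
        h, _ = self.attn(query, past, past, attn_mask=mask)
        return self.out(h).squeeze(-1)  # predicted correctness probability per step

Calling model(exercise_ids, responses) on integer tensors of shape (batch, seq_len) would return per-step correctness probabilities; the actual MMAKT architecture may differ substantially from this sketch.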

Cite this Paper


BibTeX
@InProceedings{pmlr-v157-liu21c,
  title     = {Multi-factor Memory Attentive Model for Knowledge Tracing},
  author    = {Liu, Congjie and Li, Xiaoguang},
  booktitle = {Proceedings of The 13th Asian Conference on Machine Learning},
  pages     = {856--869},
  year      = {2021},
  editor    = {Balasubramanian, Vineeth N. and Tsang, Ivor},
  volume    = {157},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v157/liu21c/liu21c.pdf},
  url       = {https://proceedings.mlr.press/v157/liu21c.html}
}
Endnote
%0 Conference Paper
%T Multi-factor Memory Attentive Model for Knowledge Tracing
%A Congjie Liu
%A Xiaoguang Li
%B Proceedings of The 13th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Vineeth N. Balasubramanian
%E Ivor Tsang
%F pmlr-v157-liu21c
%I PMLR
%P 856--869
%U https://proceedings.mlr.press/v157/liu21c.html
%V 157
APA
Liu, C. & Li, X. (2021). Multi-factor Memory Attentive Model for Knowledge Tracing. Proceedings of The 13th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 157:856-869. Available from https://proceedings.mlr.press/v157/liu21c.html.