Context-aware Nonlinear and Neural Attentive Knowledge-based Models for Grade Prediction
Abstract
Grade prediction can help students and their advisers select courses and design personalized degree programs based on predicted future course performance. One successful approach for accurately predicting a student's grades in future courses is Cumulative Knowledge-based Regression Models (CKRM). CKRM learns shallow linear models that predict a student's grade as the similarity between his/her knowledge state and the target course. However, there can be more complex interactions among the prior courses taken by a student, which the current linear CKRM model cannot capture. Moreover, CKRM and other grade prediction methods ignore the effect of concurrently-taken courses on a student's performance in a target course. In this paper, we propose context-aware nonlinear and neural attentive models that can potentially better estimate a student's knowledge state from his/her prior course information, as well as model the interactions between a target course and concurrent courses. Our experiments on a large real-world dataset consisting of more than 1.5 million grades show that the proposed models predict students' grades more accurately than competing methods. Moreover, the attention weights learned by the neural attentive model can help students and their advisers better design degree plans.
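To make the contrast concrete, the sketch below illustrates, under simplifying assumptions, the two prediction schemes the abstract describes: a CKRM-style linear model that scores a target course against a grade-weighted sum of prior-course vectors, and an attentive variant that reweights prior courses by target-conditioned attention. The embeddings, the grade weighting, and the softmax attention here are illustrative assumptions, not the paper's exact formulation.

```python
# A minimal sketch (NumPy) of a CKRM-style linear predictor and an
# attention-weighted variant. All names, dimensions, and weighting
# choices are illustrative assumptions, not the authors' exact model.
import numpy as np

rng = np.random.default_rng(0)
d = 8                                   # latent dimension (assumed)

# Toy embeddings: one "provided knowledge" vector per prior course and
# one "required knowledge" vector for the target course.
prior = rng.normal(size=(5, d))         # 5 previously taken courses
grades = np.array([4.0, 3.3, 3.7, 2.7, 4.0])  # grades earned in them
target = rng.normal(size=d)             # target-course vector

# CKRM-style linear model: the knowledge state is a grade-weighted sum
# of prior-course vectors, and the prediction is its inner product with
# the target course (bias terms omitted for brevity).
knowledge_state = grades @ prior
linear_pred = knowledge_state @ target

# Neural attentive variant: weight each prior course by an attention
# score conditioned on the target course, so influential prior courses
# receive larger, interpretable weights.
scores = prior @ target                 # relevance of each prior course
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                    # softmax attention weights
attentive_state = (alpha * grades) @ prior
attentive_pred = attentive_state @ target

print(f"linear: {linear_pred:.3f}  attentive: {attentive_pred:.3f}")
print("attention over prior courses:", np.round(alpha, 3))
```

In the attentive variant, the weights `alpha` play the role of the per-course importances that, as the abstract notes, can be inspected when designing degree plans.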
Keywords: grade prediction, neural attentive models, knowledge-based models, degree plans, nonlinear models, undergraduate education
References
BRAXTON, J. M., HIRSCHY, A. S., AND MCCLENDON, S. A. 2011. Understanding and Reducing College Student Departure: ASHE-ERIC Higher Education Report, Volume 30, Number 3. Vol. 16. John Wiley & Sons.
CHEN, J., ZHANG, H., HE, X., NIE, L., LIU, W., AND CHUA, T.-S. 2017. Attentive collaborative filtering: Multimedia recommendation with item- and component-level attention. In Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval. ACM, 335–344.
DUCHI, J., HAZAN, E., AND SINGER, Y. 2011. Adaptive subgradient methods for online learning and stochastic optimization. Journal of Machine Learning Research 12, Jul, 2121–2159.
ELBADRAWY, A. AND KARYPIS, G. 2016. Domain-aware grade prediction and top-n course recommendation. In Proceedings of the 10th ACM Conference on Recommender Systems. ACM, 183–190.
HE, X. AND CHUA, T.-S. 2017. Neural factorization machines for sparse predictive analytics. In Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval. ACM, 355–364.
HE, X., HE, Z., SONG, J., LIU, Z., JIANG, Y.-G., AND CHUA, T.-S. 2018. NAIS: Neural attentive item similarity model for recommendation. IEEE Transactions on Knowledge and Data Engineering 30, 12, 2354–2366.
HU, Q. AND RANGWALA, H. 2018. Course-specific Markovian models for grade prediction. In Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer, 29–41.
KENA, G., HUSSAR, W., MCFARLAND, J., DE BREY, C., MUSU-GILLETTE, L., WANG, X., ZHANG, J., RATHBUN, A., WILKINSON-FLICKER, S., DILIBERTI, M., ET AL. 2016. The condition of education 2016. NCES 2016-144. National Center for Education Statistics.
LAHA, A., CHEMMENGATH, S. A., AGRAWAL, P., KHAPRA, M., SANKARANARAYANAN, K., AND RAMASWAMY, H. G. 2018. On controllable sparse alternatives to softmax. In Advances in Neural Information Processing Systems. 6423–6433.
MARTINS, A. AND ASTUDILLO, R. 2016. From softmax to sparsemax: A sparse model of attention and multi-label classification. In International Conference on Machine Learning. 1614–1623.
MEI, L., REN, P., CHEN, Z., NIE, L., MA, J., AND NIE, J.-Y. 2018. An attentive interaction network for context-aware recommendations. In Proceedings of the 27th ACM International Conference on Information and Knowledge Management. ACM, 157–166.
MORSY, S. AND KARYPIS, G. 2017. Cumulative knowledge-based regression models for next-term grade prediction. In Proceedings of the 2017 SIAM International Conference on Data Mining. SIAM, 552–560.
PARIKH, A., TÄCKSTRÖM, O., DAS, D., AND USZKOREIT, J. 2016. A decomposable attention model for natural language inference. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2249–2255.
POLYZOU, A. AND KARYPIS, G. 2016. Grade prediction with course and student specific models. In Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer, 89–101.
REN, Z., NING, X., AND RANGWALA, H. 2017. Grade prediction with temporal course-wise influence. In Proceedings of the 10th International Conference on Educational Data Mining. 48–55.
REN, Z., NING, X., AND RANGWALA, H. 2018. ALE: Additive latent effect models for grade prediction. In Proceedings of the 2018 SIAM International Conference on Data Mining. SIAM, 477–485.
SWEENEY, M., LESTER, J., RANGWALA, H., AND JOHRI, A. 2016. Next-term student performance prediction: A recommender systems approach. Journal of Educational Data Mining 8, 1, 22–51.
VASWANI, A., SHAZEER, N., PARMAR, N., USZKOREIT, J., JONES, L., GOMEZ, A. N., KAISER, Ł., AND POLOSUKHIN, I. 2017. Attention is all you need. In Advances in Neural Information Processing Systems. 5998–6008.
XIAO, J., YE, H., HE, X., ZHANG, H., WU, F., AND CHUA, T.-S. 2017. Attentional factorization machines: learning the weight of feature interactions via attention networks. In Proceedings of the 26th International Joint Conference on Artificial Intelligence. AAAI Press, 3119–3125.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Authors who publish with this journal agree to the following terms:
- The Author retains copyright in the Work, where the term “Work” shall include all digital objects that may result in subsequent electronic publication or distribution.
- Upon acceptance of the Work, the author shall grant to the Publisher the right of first publication of the Work.
- The Author shall grant to the Publisher and its agents the nonexclusive perpetual right and license to publish, archive, and make accessible the Work in whole or in part in all forms of media now or hereafter known under a Creative Commons 4.0 License (Attribution-Noncommercial-No Derivatives 4.0 International), or its equivalent, which, for the avoidance of doubt, allows others to copy, distribute, and transmit the Work under the following conditions:
- Attribution—other users must attribute the Work in the manner specified by the author as indicated on the journal Web site;
- Noncommercial—other users (including Publisher) may not use this Work for commercial purposes;
- No Derivative Works—other users (including Publisher) may not alter, transform, or build upon this Work, with the understanding that any of the above conditions can be waived with permission from the Author and that where the Work or any of its elements is in the public domain under applicable law, that status is in no way affected by the license.
- The Author is able to enter into separate, additional contractual arrangements for the nonexclusive distribution of the journal's published version of the Work (e.g., post it to an institutional repository or publish it in a book), as long as there is provided in the document an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post online a pre-publication manuscript (but not the Publisher’s final formatted PDF version of the Work) in institutional repositories or on their Websites prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (see The Effect of Open Access). Any such posting made before acceptance and publication of the Work shall be updated upon publication to include a reference to the Publisher-assigned DOI (Digital Object Identifier) and a link to the online abstract for the final published Work in the Journal.
- Upon Publisher’s request, the Author agrees to furnish promptly to Publisher, at the Author’s own expense, written evidence of the permissions, licenses, and consents for use of third-party material included within the Work, except as determined by Publisher to be covered by the principles of Fair Use.
- The Author represents and warrants that:
- the Work is the Author’s original work;
- the Author has not transferred, and will not transfer, exclusive rights in the Work to any third party;
- the Work is not pending review or under consideration by another publisher;
- the Work has not previously been published;
- the Work contains no misrepresentation or infringement of the Work or property of other authors or third parties; and
- the Work contains no libel, invasion of privacy, or other unlawful matter.
- The Author agrees to indemnify and hold Publisher harmless from Author’s breach of the representations and warranties contained in Paragraph 6 above, as well as any claim or proceeding relating to Publisher’s use and publication of any content contained in the Work, including third-party content.