Towards Interpretable Automated Machine Learning for STEM Career Prediction



Published Aug 22, 2020
Ruitao Liu, Aixin Tan


In this paper, we describe our solution for predicting student STEM career choices in the 2017 ASSISTments Data Mining Competition. We built a machine learning system that automatically reformats the data set, generates new features and prunes redundant ones, and performs model and feature selection. We designed the system to automatically find a model that optimizes prediction performance, yet the final model is a simple logistic regression that allows researchers to identify important features and study their effects on STEM career choices. A comparison with other methods revealed that the key to good prediction is proper feature enrichment in the early stage of the analysis, while feature selection in a later stage permits a simpler final model.
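The combination of techniques named in the abstract and keywords, a logistic regression wrapped in a forward-backward (stepwise) search over features, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual pipeline: the feature matrix, labels, tolerance, and scoring choice (cross-validated AUC) are all assumptions made for the example.

```python
# Hedged sketch: greedy forward-backward feature selection around a
# logistic regression, scored by 5-fold cross-validated AUC.
# The data below are synthetic stand-ins, not the ASSISTments features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 6))
# Only features 0 and 2 actually drive the (synthetic) STEM label.
logit = 1.5 * X[:, 0] - 2.0 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

def cv_auc(cols):
    """Cross-validated AUC of a logistic regression on the given columns."""
    if not cols:
        return 0.5  # no features: chance-level AUC
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, sorted(cols)], y,
                           cv=5, scoring="roc_auc").mean()

selected, best = set(), 0.5
improved = True
while improved:
    improved = False
    # Forward step: try adding each unused feature.
    for j in set(range(X.shape[1])) - selected:
        score = cv_auc(selected | {j})
        if score > best + 1e-4:
            selected, best, improved = selected | {j}, score, True
    # Backward step: try dropping each selected feature.
    for j in list(selected):
        score = cv_auc(selected - {j})
        if score > best + 1e-4:
            selected, best, improved = selected - {j}, score, True

print(sorted(selected), round(best, 3))
```

On this synthetic setup the search recovers the two informative features while leaving the final model an ordinary, interpretable logistic regression, which is the trade-off the paper emphasizes.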

How to Cite

Liu, R., & Tan, A. (2020). Towards Interpretable Automated Machine Learning for STEM Career Prediction. JEDM | Journal of Educational Data Mining, 12(2), 19-32.



Keywords: STEM careers, automated prediction, penalized logistic regression, forward-backward search algorithm, interpretable machine learning

References

BAKER, R., BERNING, A.W., GOWDA, S. M., ZHANG, S. and HAWN, A. 2019. Predicting K-12 Dropout. Journal of Education for Students Placed at Risk (JESPAR), 25 (1), 28-54, DOI: 10.1080/10824669.2019.1670065

BREHENY, P. and HUANG, J. 2011. Coordinate Descent Algorithms for Nonconvex Penalized Regression, with Applications to Biological Feature Selection. Annals of Applied Statistics, 5, 232-253.

CORBETT, A. T. and ANDERSON, J. R. 1995. Knowledge Tracing: Modeling the Acquisition of Procedural Knowledge. User Modeling and User-Adapted Interaction, 4 (4), 253-278.

FAN, J. and LI, R. 2001. Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties. Journal of the American Statistical Association, 96, 1348-1360.

FENG, M., HEFFERNAN, N. and KOEDINGER, K. 2009. Addressing the Assessment Challenge with an Online System That Tutors as it Assesses. User Modeling and User-Adapted Interaction: The Journal of Personalization Research, 19 (3), 243-266.

FRIEDMAN, J. 2001. Greedy Function Approximation: A Gradient Boosting Machine. The Annals of Statistics, 29 (5), 1189-1232.

HOERL, A. E. and KENNARD, R. W. 1970. Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 12, 55-67.

KNOWLES, J. E. 2014. EWStools: Tools for Automating the Testing and Evaluation of Education Early Warning System Models. R package version 0.1.

KNOWLES, J. E. 2015. Of Needles and Haystacks: Building an Accurate Statewide Dropout Early Warning System in Wisconsin. Journal of Educational Data Mining, 7 (3), 18-67.

PARDOS, Z. A., BAKER, R. S., SAN PEDRO, M. O. C. Z., GOWDA, Sujith M. and GOWDA, Supreeth M. 2014. Affective States and State Tests: Investigating How Affect and Engagement during the School Year Predict End-of-Year Learning Outcomes. Journal of Learning Analytics, 1 (1), 107-128.

RAZZAQ, L., HEFFERNAN, N. T., FENG, M. and PARDOS, Z. A. 2007. Developing Fine-Grained Transfer Models in the ASSISTment System. Journal of Technology, Instruction, Cognition, and Learning, 5 (3), 289-304.

R Core Team. 2017. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. URL:

SAN PEDRO, M.O.C.Z., BAKER, R. S., BOWERS, A., and HEFFERNAN, N. 2013. Predicting College Enrollment from Student Interaction with an Intelligent Tutoring System in Middle School. In Proceedings of the 6th International Conference on Educational Data Mining, 177–184.

SAN PEDRO, M.O.C.Z., BAKER, R. S., and RODRIGO, M. M. T. 2014. Carelessness and Affect in an Intelligent Tutoring System for Mathematics. International Journal of Artificial Intelligence in Education, 24(2), 189-210.

SAN PEDRO, M. O. C. Z., OCUMPAUGH, J. L., BAKER, R. S. and HEFFERNAN, N. 2014. Predicting STEM and Non-STEM College Major Enrollment from Middle School Interaction with Mathematics Educational Software. In Proceedings of the 7th International Conference on Educational Data Mining, 276-279.

SUGIYAMA, M., KRAULEDAT, M. and MÜLLER, K.-R. 2007. Covariate Shift Adaptation by Importance Weighted Cross Validation. Journal of Machine Learning Research, 8, 985-1005.

TIBSHIRANI, R. 1996. Regression Shrinkage and Selection via the Lasso. Journal of the Royal Statistical Society. Series B (Methodological) 58 (1), 267-288.

ZHANG, C. 2010. Nearly Unbiased Variable Selection under Minimax Concave Penalty. The Annals of Statistics, 38 (2), 894-942.

ZOU, H. and HASTIE, T. 2005. Regularization and Variable Selection via the Elastic Net. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67 (2), 301-320.
Special Issue on ASSISTments Longitudinal Data