Degree Planning with PLAN-BERT: Multi-Semester Recommendation Using Future Courses of Interest


  • Erzhuo Shao Tsinghua University
  • Shiyuan Guo University of California, Berkeley
  • Zachary A. Pardos University of California, Berkeley

Planning scenarios involving user pre-specified items arise frequently in recommender system domains. Although next-item and next-basket recommendation have been the focus of prior research, planning requires predicting multiple consecutive items or baskets. No prior work has leveraged pre-specified future reference items at inference time to improve this challenging consecutive prediction task. PLAN-BERT is the first model to accommodate this general planning scenario. It does so through novel modifications inspired by the masked training and contextual embeddings of self-attention models. We test the model in the domain of student academic degree planning, in which students’ past course histories and future pre-specified courses of interest are used to fill in the remainder of their curriculum. Our offline analyses draw on 15 million historic course enrollments at 20 institutions, and we conduct an online evaluation at one of those institutions. Our results show that PLAN-BERT outperforms existing models, including BERT, BiLSTM, and a UserKNN baseline, with even a small number of future reference items substantially improving accuracy. Significant results from our online evaluation show PLAN-BERT to be strongest in students' perceptions of personalization.
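To make the inference setting concrete, the sketch below (not the authors' code) shows how a degree-planning input might be assembled in the style the abstract describes: past semesters and a handful of pre-specified future "reference" courses remain visible, while the remaining future slots are masked for a BERT-style model to fill in. The course IDs and the `build_plan_input` helper are hypothetical.

```python
# Hypothetical sketch of a PLAN-BERT-style masked inference input:
# visible past baskets + visible future reference items + masked slots.

MASK = "[MASK]"

def build_plan_input(past_semesters, reference_courses,
                     num_future_semesters, basket_size):
    """Assemble a per-semester sequence of course baskets.

    past_semesters      : list of lists of completed course IDs
    reference_courses   : dict mapping future semester index -> pre-specified courses
    num_future_semesters: number of semesters left to plan
    basket_size         : fixed number of course slots per semester
    """
    sequence = [list(basket) for basket in past_semesters]
    for t in range(num_future_semesters):
        fixed = list(reference_courses.get(t, []))
        # Pre-specified reference items stay visible; mask the rest.
        basket = fixed + [MASK] * (basket_size - len(fixed))
        sequence.append(basket)
    return sequence

# Example: two completed semesters, one future course of interest
# pre-specified in the second of two remaining semesters.
plan = build_plan_input(
    past_semesters=[["CS61A", "MATH1A", "ENG10"],
                    ["CS61B", "MATH1B", "HIST7"]],
    reference_courses={1: ["CS189"]},
    num_future_semesters=2,
    basket_size=3,
)
```

A model trained with masked-item objectives would then predict course IDs for each `[MASK]` slot, conditioning jointly on the history and the visible reference items.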

How to Cite

Shao, E., Guo, S., & Pardos, Z. A. (2021). Degree Planning with PLAN-BERT: Multi-Semester Recommendation Using Future Courses of Interest. Proceedings of the AAAI Conference on Artificial Intelligence, 35(17), 14920-14929.



AAAI Special Track on AI for Social Impact