Learning Concept Prerequisite Relation via Global Knowledge Relation Optimization

Authors

  • Miao Zhang — School of Computer Science and Information Engineering, Hubei University, China; Hubei Key Laboratory of Big Data Intelligent Analysis and Application, China
  • Jiawei Wang — School of Computer Science and Information Engineering, Hubei University, China; Key Laboratory of Intelligent Sensing System and Security, Ministry of Education, China
  • Kui Xiao — School of Computer Science and Information Engineering, Hubei University, China; Key Laboratory of Intelligent Sensing System and Security, Ministry of Education, China
  • Shihui Wang — School of Computer Science and Information Engineering, Hubei University, China; Key Laboratory of Intelligent Sensing System and Security, Ministry of Education, China
  • Yan Zhang — School of Computer Science and Information Engineering, Hubei University, China; Key Laboratory of Intelligent Sensing System and Security, Ministry of Education, China
  • Hao Chen — School of Computer Science and Information Engineering, Hubei University, China; Key Laboratory of Intelligent Sensing System and Security, Ministry of Education, China
  • Zhifei Li — School of Computer Science and Information Engineering, Hubei University, China; Hubei Key Laboratory of Big Data Intelligent Analysis and Application, China

DOI:

https://doi.org/10.1609/aaai.v39i2.32156

Abstract

Learning concept prerequisite relations helps learners master concepts in a logical order and build a coherent knowledge structure. Many studies use graph neural networks over heterogeneous knowledge networks to enhance concept representations. However, the different types of relations in these networks can influence one another, and existing research often focuses solely on concept relations while neglecting other types of knowledge connections. To address this issue, this paper proposes a novel concept prerequisite relation learning model, named the Global Knowledge Relation Optimization Model (GKROM). Specifically, we capture the impact of different knowledge relation types on document and concept semantic representations separately, and then integrate the two representations. We further introduce multi-objective learning to optimize the knowledge relation network from a global perspective. Through this optimization, GKROM learns richer semantic representations for concepts and documents, improving the accuracy of concept prerequisite relation learning. Extensive experiments on public datasets demonstrate the effectiveness of GKROM, which achieves state-of-the-art performance in concept prerequisite relation learning.
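The abstract's multi-objective learning idea — jointly optimizing a main prerequisite-prediction objective together with auxiliary objectives for other relation types in the knowledge network — can be sketched generically. The sketch below is illustrative only: the function name, the specific auxiliary relation types, and the fixed weights are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of multi-objective loss aggregation: a main
# prerequisite-relation loss is combined with weighted auxiliary losses
# for other knowledge-relation types. All names and weights are
# illustrative assumptions, not taken from the GKROM paper.

def combine_losses(main_loss, aux_losses, aux_weights):
    """Weighted sum of objectives: L = L_main + sum_i w_i * L_aux_i."""
    if len(aux_losses) != len(aux_weights):
        raise ValueError("each auxiliary loss needs a weight")
    return main_loss + sum(w * l for w, l in zip(aux_weights, aux_losses))

# Example: a prerequisite-prediction loss plus two hypothetical auxiliary
# relation losses (e.g. document-concept and document-document relations).
total = combine_losses(
    main_loss=0.9,
    aux_losses=[0.4, 0.2],
    aux_weights=[0.5, 0.25],
)
```

In practice such a combined loss would be backpropagated through the shared concept and document encoders, so that gradients from every relation type shape the same representations — the "global" optimization the abstract refers to.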

Published

2025-04-11

How to Cite

Zhang, M., Wang, J., Xiao, K., Wang, S., Zhang, Y., Chen, H., & Li, Z. (2025). Learning Concept Prerequisite Relation via Global Knowledge Relation Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 39(2), 1638–1646. https://doi.org/10.1609/aaai.v39i2.32156

Section

AAAI Technical Track on Cognitive Modeling & Cognitive Systems