Cross-Lingual Adversarial Domain Adaptation for Novice Programming
DOI:
https://doi.org/10.1609/aaai.v36i7.20735
Keywords:
Machine Learning (ML), Data Mining & Knowledge Management (DMKM), Knowledge Representation And Reasoning (KRR), Cognitive Modeling & Cognitive Systems (CMS)
Abstract
Student modeling sits at the epicenter of adaptive learning technology. In contrast to the voluminous work on student modeling for well-defined domains such as algebra, there has been little research on student modeling in programming (SMP) due to data scarcity caused by the unbounded solution spaces of open-ended programming exercises. In this work, we focus on two essential SMP tasks: program classification and early prediction of student success. We propose a Cross-Lingual Adversarial Domain Adaptation (CrossLing) framework that can leverage a large programming dataset to learn features that improve SMP models built using a much smaller dataset in a different programming language. Our framework maintains one globally invariant latent representation across both datasets via an adversarial learning process, while allocating domain-specific models for each dataset to extract local latent representations that cannot and should not be unified. By separating globally shared representations from domain-specific representations, our framework outperforms existing state-of-the-art methods for both SMP tasks.
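The abstract describes an architecture that combines a globally shared, adversarially trained representation with per-language domain-specific representations. The following is a minimal sketch of that general idea in PyTorch, not the authors' implementation: the module names, dimensions, and gradient-reversal-based adversarial setup are illustrative assumptions.

```python
# Minimal sketch of adversarial domain adaptation with a shared ("global")
# encoder and per-domain ("local") encoders, in the spirit of the framework
# described above. All names and sizes here are assumptions for illustration.
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity on the forward pass; reverses and scales gradients on backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class CrossLingSketch(nn.Module):
    def __init__(self, input_dim=128, hidden_dim=64, num_classes=2, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        # Globally shared encoder: trained adversarially so its output is
        # indistinguishable across the two programming-language datasets.
        self.shared_encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.ReLU()
        )
        # Domain-specific encoders: capture language-specific features that
        # should not be forced into the shared space.
        self.private_encoders = nn.ModuleDict({
            "source": nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU()),
            "target": nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU()),
        })
        # Domain discriminator: predicts which dataset a shared representation
        # came from; the gradient-reversal layer makes the shared encoder
        # learn features that fool it.
        self.domain_discriminator = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 2)
        )
        # Task head (e.g., program classification or early success prediction)
        # consumes both the shared and the domain-specific representation.
        self.task_classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x, domain):
        shared = self.shared_encoder(x)
        private = self.private_encoders[domain](x)
        task_logits = self.task_classifier(torch.cat([shared, private], dim=-1))
        domain_logits = self.domain_discriminator(
            GradientReversal.apply(shared, self.lambd)
        )
        return task_logits, domain_logits


if __name__ == "__main__":
    model = CrossLingSketch()
    x = torch.randn(8, 128)  # toy batch of program feature vectors
    task_logits, domain_logits = model(x, "target")
    print(task_logits.shape, domain_logits.shape)
```

In training, the task loss would be applied to labeled examples from both datasets while the domain loss pushes the shared encoder toward language-invariant features; the private encoders absorb what remains language-specific.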
Published
2022-06-28
How to Cite
Mao, Y., Khoshnevisan, F., Price, T., Barnes, T., & Chi, M. (2022). Cross-Lingual Adversarial Domain Adaptation for Novice Programming. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7682-7690. https://doi.org/10.1609/aaai.v36i7.20735
Section
AAAI Technical Track on Machine Learning II