Counterfactual Question Generation Uncovering Learner Contradictions
DOI:
https://doi.org/10.1609/aaai.v40i23.39023
Abstract
Conventional feedback, even when accompanied by brief explanations, rarely uncovers the hidden contradictions that trigger a learner's mistake. We bridge this gap with counterfactual question generation (CFQG): given a learner's answer, generate a follow-up question that deliberately contradicts it, compelling the learner to confront the underlying conflict. CFQG thus transforms assessment from passive scoring into an interactive, contradiction-centered dialogue that supports knowledge repair. To automate CFQG, we propose GapProbe, which probes the knowledge gap between a learner's belief and curated facts through a knowledge graph (KG), then designs counterfactual questions (CFQs) that negate the belief. Identifying contradiction-aware triples, and, more importantly, selecting those most likely to confuse the learner, are highly challenging in large-scale KGs. GapProbe tackles these challenges with an iterative ProConB cycle coupled with a schema-aware KGMap. By caching one- and multi-hop schema patterns of the KG, KGMap provides a "roadmap" that guides LLMs to jump to deep, contradiction-aware triples, beyond traditional step-wise graph traversal. We present the CFQG benchmark and corresponding metrics for evaluating how generated CFQs trigger, focus, and deepen learner reflection through explicit contradictions. Experiments on multiple datasets and LLMs show that GapProbe boosts LLM reasoning over KGs and generates follow-up questions that consistently promote deeper and more focused learner reflection.
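The abstract's "schema-aware KGMap" idea, caching one- and multi-hop schema patterns so a retriever can jump directly to multi-hop candidates rather than traversing the graph step by step, can be illustrated with a minimal sketch. Everything below (the toy triples, type map, and `build_schema_map` function) is a hypothetical reconstruction from the abstract's one-sentence description, not the paper's actual implementation.

```python
# Hypothetical sketch: cache relation paths between entity *types*
# (schema patterns) up to two hops, so multi-hop candidate paths can be
# looked up directly instead of discovered by step-wise traversal.
from collections import defaultdict
from itertools import product

# Toy KG triples (head, relation, tail) and an entity-type map.
triples = [
    ("Mercury", "orbits", "Sun"),
    ("Earth", "orbits", "Sun"),
    ("Moon", "orbits", "Earth"),
    ("Sun", "member_of", "MilkyWay"),
]
entity_type = {
    "Mercury": "Planet", "Earth": "Planet", "Moon": "Satellite",
    "Sun": "Star", "MilkyWay": "Galaxy",
}

def build_schema_map(triples, entity_type, max_hops=2):
    """Map (head_type, tail_type) -> set of relation paths up to max_hops."""
    one_hop = defaultdict(set)
    for h, r, t in triples:
        one_hop[(entity_type[h], entity_type[t])].add((r,))
    schema = {key: set(paths) for key, paths in one_hop.items()}
    # Compose two-hop patterns by joining one-hop patterns on a shared middle type.
    if max_hops >= 2:
        for (a, b), ps in one_hop.items():
            for (b2, c), qs in one_hop.items():
                if b == b2:
                    for p, q in product(ps, qs):
                        schema.setdefault((a, c), set()).add(p + q)
    return schema

schema_map = build_schema_map(triples, entity_type)
# Direct lookup of a two-hop pattern, no traversal needed:
print(schema_map[("Planet", "Galaxy")])  # {('orbits', 'member_of')}
```

With such a cache, a retriever asked for facts linking a planet to a galaxy can immediately try the `orbits → member_of` path instead of expanding neighbors hop by hop.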
Published
2026-03-14
How to Cite
Zhang, B., Yu, H., Dong, W., Yang, Y., Miao, D., Song, F., … Zhou, J. (2026). Counterfactual Question Generation Uncovering Learner Contradictions. Proceedings of the AAAI Conference on Artificial Intelligence, 40(23), 19450–19457. https://doi.org/10.1609/aaai.v40i23.39023
Section
AAAI Technical Track on Knowledge Representation and Reasoning