Amplifying Diversity and Quality in Commonsense Knowledge Graph Completion (Student Abstract)

Authors

  • Liu Yu, University of Electronic Science and Technology of China
  • Fenghui Tian, University of Electronic Science and Technology of China
  • Ping Kuang, University of Electronic Science and Technology of China
  • Fan Zhou, University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v38i21.30531

Keywords:

Natural Language Processing, Commonsense Knowledge, Knowledge Completion

Abstract

Conventional commonsense knowledge graph completion (CKGC) methods provide inadequate input sequences during the fine-tuning and generation stages and rely on full fine-tuning, which fails to align with the autoregressive model's pre-training patterns and is parameter-inefficient. Moreover, decoding with beam or greedy search yields generated tail entities of low diversity and high mutual similarity. We therefore resort to prefix-tuning and propose a lightweight, effective pipeline that enhances the quality and diversity of extracted commonsense knowledge. Specifically, we measure head-entity similarity to retrieve the top-k tuples and concatenate them before each target tuple when prefix-tuning the source LM, improving the efficiency and speed of pretrained models; we then design a penalty-tailored diverse beam search (p-DBS) for decoding tail entities, producing a greater quantity and diversity of generated commonsense tuples; finally, a filtering strategy removes invalid commonsense knowledge. Extensive automatic evaluations, including ChatGPT scoring, show that our method extracts diverse, novel, and accurate commonsense knowledge (CK).
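The input-construction step described above (retrieve the top-k tuples with the most similar head entity and concatenate them before the target tuple) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the similarity measure, so token-overlap (Jaccard) similarity is used here purely as a stand-in, and the tuple serialization format is assumed.

```python
# Sketch of the retrieval-and-concatenation step from the abstract:
# rank candidate tuples by head-entity similarity, take the top-k,
# and prepend them to the target tuple to form the prefix-tuning input.
# Jaccard token overlap stands in for the paper's (unspecified) similarity.

def head_similarity(h1: str, h2: str) -> float:
    """Jaccard overlap between the token sets of two head entities."""
    a, b = set(h1.lower().split()), set(h2.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def build_prefix_input(target, tuples, k=2):
    """Concatenate the k tuples whose heads are most similar to the
    target's head, followed by the target tuple itself."""
    head, _rel, _tail = target
    ranked = sorted(
        (t for t in tuples if t != target),
        key=lambda t: head_similarity(head, t[0]),
        reverse=True,
    )
    context = ranked[:k] + [target]
    return " ".join(f"{h} {r} {t}" for h, r, t in context)

# Toy knowledge base of (head, relation, tail) tuples.
kb = [
    ("go to a concert", "xWant", "applaud the band"),
    ("go to a movie", "xWant", "buy popcorn"),
    ("bake a cake", "xNeed", "buy flour"),
]
target = ("go to a show", "xWant", "have fun")
print(build_prefix_input(target, kb, k=2))
```

For the subsequent decoding stage, standard diverse beam search (e.g., Hugging Face `generate` with `num_beams`, `num_beam_groups`, and `diversity_penalty`) is the closest off-the-shelf baseline; the paper's penalty-tailored variant (p-DBS) modifies this scheme in ways the abstract does not detail.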

Published

2024-03-24

How to Cite

Yu, L., Tian, F., Kuang, P., & Zhou, F. (2024). Amplifying Diversity and Quality in Commonsense Knowledge Graph Completion (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23699-23700. https://doi.org/10.1609/aaai.v38i21.30531