Improving Faithfulness in Abstractive Text Summarization with EDUs Using BART (Student Abstract)
DOI: https://doi.org/10.1609/aaai.v38i21.30433
Keywords: Faithfulness in LM, EDUs, Text Summarization, NLP: Generation
Abstract
Abstractive text summarization captures the main information of a source document in a summary written in the summarizer's own words. While it is more challenging to automate than extractive text summarization, recent advances in deep learning and pre-trained language models have improved its performance. However, abstractive text summarization still suffers from issues such as unfaithfulness. To address this problem, we propose a new approach that uses important Elementary Discourse Units (EDUs) to guide BART-based text summarization. Our approach showed improvements in truthfulness and source document coverage compared to previous studies.
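The abstract describes guiding a BART summarizer with important EDUs but does not detail the segmentation or selection method. The following is a minimal, hypothetical sketch of that idea: it approximates EDU segmentation with naive clause splitting, ranks units by content-word frequency, and prepends the top units to the source text as guidance for an encoder-decoder model. The segmentation heuristic, scoring function, and input format are all assumptions for illustration, not the authors' actual pipeline.

```python
import re
from collections import Counter


def split_into_units(text):
    """Naively approximate EDU segmentation by splitting on sentence
    boundaries and commas before common discourse connectives.
    (Real EDU segmentation uses a trained discourse parser.)"""
    parts = re.split(r"(?<=[.!?])\s+|,\s+(?=(?:and|but|while|because)\b)", text)
    return [p.strip() for p in parts if p and p.strip()]


def score_unit(unit, doc_counts):
    """Score a unit by the average document frequency of its words
    (a stand-in for a learned importance model)."""
    words = re.findall(r"\w+", unit.lower())
    if not words:
        return 0.0
    return sum(doc_counts[w] for w in words) / len(words)


def select_important_units(text, k=2):
    """Pick the top-k units as the 'important EDUs' to guide summarization."""
    units = split_into_units(text)
    doc_counts = Counter(re.findall(r"\w+", text.lower()))
    ranked = sorted(units, key=lambda u: score_unit(u, doc_counts), reverse=True)
    return ranked[:k]


def build_guided_input(text, k=2, sep=" </s> "):
    """Prepend the selected units to the source document, a common way to
    condition encoder-decoder models such as BART on guidance signals."""
    guidance = sep.join(select_important_units(text, k))
    return guidance + sep + text
```

The guided string produced by `build_guided_input` would then be tokenized and fed to a BART model in place of the raw document, encouraging the decoder to stay faithful to the highlighted discourse units.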
Published
2024-03-24
How to Cite
Delpisheh, N., & Chali, Y. (2024). Improving Faithfulness in Abstractive Text Summarization with EDUs Using BART (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23471–23472. https://doi.org/10.1609/aaai.v38i21.30433
Section
AAAI Student Abstract and Poster Program