Flexible Non-Autoregressive Extractive Summarization with Threshold: How to Extract a Non-Fixed Number of Summary Sentences

Authors

  • Ruipeng Jia, Institute of Information Engineering, Chinese Academy of Sciences; School of Cyber Security, University of Chinese Academy of Sciences
  • Yanan Cao, Institute of Information Engineering, Chinese Academy of Sciences
  • Haichao Shi, Institute of Information Engineering, Chinese Academy of Sciences
  • Fang Fang, Institute of Information Engineering, Chinese Academy of Sciences
  • Pengfei Yin, Institute of Information Engineering, Chinese Academy of Sciences
  • Shi Wang, Institute of Computing Technology, Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v35i14.17552

Keywords:

Summarization

Abstract

Sentence-level extractive summarization is a fundamental yet challenging task, and recent powerful approaches prefer to pick sentences sorted by the predicted probabilities until the length limit is reached, a.k.a. the "Top-K Strategy". This length limit is fixed based on the validation set, resulting in a lack of flexibility. In this work, we propose a more flexible and accurate non-autoregressive method for single-document extractive summarization that extracts a non-fixed number of summary sentences without the sorting step. We call our approach ThresSum because it picks sentences simultaneously and individually from the source document whenever their predicted probabilities exceed a threshold. During training, the model enhances sentence representations through iterative refinement, and the intermediate latent variables receive weak supervision from soft labels, which are generated progressively by adjusting the temperature of a knowledge distillation algorithm. Specifically, the temperature is initialized with a high value and decreases over the iterations until it reaches 1. Experimental results on the CNN/DM and NYT datasets demonstrate the effectiveness of ThresSum, which significantly outperforms BERTSUMEXT with a substantial improvement of 0.74 ROUGE-1 on CNN/DM. Our source code will be available on GitHub.
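
The following is a minimal sketch, not the authors' implementation, of the two mechanisms the abstract describes: threshold-based selection of a non-fixed number of sentences (versus the fixed Top-K baseline) and a temperature schedule that anneals toward 1 for the soft-label supervision. The threshold of 0.5, the initial temperature of 5.0, and the linear annealing schedule are illustrative assumptions; the paper itself should be consulted for the actual values and schedule.

```python
import torch

def topk_select(probs: torch.Tensor, k: int) -> torch.Tensor:
    """Baseline "Top-K Strategy": always extract the k highest-scoring
    sentences, so every document yields exactly k summary sentences."""
    return torch.topk(probs, k).indices.sort().values

def threshold_select(probs: torch.Tensor, threshold: float = 0.5) -> torch.Tensor:
    """Threshold-style selection in the spirit of ThresSum: every sentence
    whose predicted probability exceeds the threshold is extracted, so the
    number of summary sentences can vary per document. (threshold=0.5 is
    an illustrative assumption.)"""
    return (probs > threshold).nonzero(as_tuple=True)[0]

def temperature_schedule(step: int, total_steps: int,
                         t_init: float = 5.0, t_final: float = 1.0) -> float:
    """Anneal the softmax temperature from a high initial value down to 1,
    as the abstract describes for generating progressively sharper soft
    labels. (The linear schedule and t_init=5.0 are assumptions.)"""
    frac = min(step / max(total_steps, 1), 1.0)
    return t_init + frac * (t_final - t_init)

# Example: per-sentence extraction probabilities for a 6-sentence document.
probs = torch.tensor([0.91, 0.12, 0.78, 0.45, 0.88, 0.05])
print(topk_select(probs, k=3))        # tensor([0, 2, 4]) -- always 3 sentences
print(threshold_select(probs))        # tensor([0, 2, 4]) -- 3 here, but varies
print(temperature_schedule(0, 100))   # 5.0 at the start of training
print(temperature_schedule(100, 100)) # 1.0 at the end
```

Note the behavioral difference: with Top-K, a document with only one genuinely summary-worthy sentence still yields k sentences, while the threshold variant returns however many sentences clear the bar, which is the flexibility the paper argues for.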

Published

2021-05-18

How to Cite

Jia, R., Cao, Y., Shi, H., Fang, F., Yin, P., & Wang, S. (2021). Flexible Non-Autoregressive Extractive Summarization with Threshold: How to Extract a Non-Fixed Number of Summary Sentences. Proceedings of the AAAI Conference on Artificial Intelligence, 35(14), 13134-13142. https://doi.org/10.1609/aaai.v35i14.17552

Section

AAAI Technical Track on Speech and Natural Language Processing I