A Theoretical Analysis of the Repetition Problem in Text Generation

Authors

  • Zihao Fu, The Chinese University of Hong Kong
  • Wai Lam, The Chinese University of Hong Kong
  • Anthony Man-Cho So, The Chinese University of Hong Kong
  • Bei Shi, Tencent AI Lab

DOI:

https://doi.org/10.1609/aaai.v35i14.17520

Keywords:

Generation, Language Models

Abstract

Text generation tasks, including translation, summarization, and language modeling, have seen rapid growth in recent years. Despite these remarkable achievements, the repetition problem has been observed in nearly all text generation models and substantially undermines generation quality. Many methods have been proposed to solve the repetition problem, but no existing theoretical analysis explains why it happens or how it is resolved. In this paper, we propose a new framework for the theoretical analysis of the repetition problem. We first define the Average Repetition Probability (ARP) to characterize the repetition problem quantitatively. Then, we conduct an extensive analysis of the Markov generation model and derive several upper bounds on the average repetition probability, each with an intuitive interpretation. We show that most existing methods essentially minimize these upper bounds, explicitly or implicitly. Grounded on our theory, we show that the repetition problem is, unfortunately, caused by traits of our language itself. One major cause is that too many words predict the same word as their subsequent word with high probability; it is therefore easy to return to that word and form repetitions. We dub this the high inflow problem. Furthermore, we extend our analysis to broader generation models by deriving a concentration bound on the average repetition probability for a general generation model. Finally, based on the theoretical upper bounds, we propose a novel rebalanced encoding approach that alleviates the high inflow problem and thus reduces the upper bound. Experimental results show that our theoretical framework applies to general generation models and that the proposed rebalanced encoding approach significantly alleviates repetition in both the machine translation task and the language modeling task. The source code of this paper can be obtained from https://github.com/fuzihaofzh/repetition-problem-nlg.
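To make the ARP idea concrete, below is a minimal sketch, assuming a toy first-order Markov model and a Monte Carlo proxy for the repetition probability. The paper defines ARP analytically over the model; the repeated-bigram criterion, vocabulary size, sequence length, and sample counts here are all illustrative assumptions, not quantities from the paper. The sketch contrasts a balanced transition matrix with one in which most states funnel probability mass into a single token, i.e., the high inflow situation described in the abstract.

    import numpy as np

    def sample_sentence(P, length, rng):
        # Sample a token sequence from a first-order Markov model with
        # row-stochastic transition matrix P.
        n = P.shape[0]
        tokens = [rng.integers(n)]
        for _ in range(length - 1):
            tokens.append(rng.choice(n, p=P[tokens[-1]]))
        return tokens

    def has_repetition(tokens, span=2):
        # Proxy criterion (an assumption, not the paper's definition):
        # the sequence "repeats" if any length-`span` n-gram occurs twice.
        grams = list(zip(*(tokens[i:] for i in range(span))))
        return len(grams) != len(set(grams))

    def estimate_rp(P, length=20, trials=5000, seed=0):
        # Monte Carlo estimate of the probability that a sampled
        # sequence contains a repetition.
        rng = np.random.default_rng(seed)
        hits = sum(has_repetition(sample_sentence(P, length, rng))
                   for _ in range(trials))
        return hits / trials

    n = 50
    rng = np.random.default_rng(1)

    # Balanced chain: outgoing probability spread roughly evenly.
    P_balanced = rng.dirichlet(np.ones(n), size=n)

    # High-inflow chain: every state routes most of its mass into token 0,
    # so generated paths keep returning to it and loops form easily.
    P_inflow = rng.dirichlet(np.ones(n), size=n)
    P_inflow[:, 0] += 5.0
    P_inflow /= P_inflow.sum(axis=1, keepdims=True)

    print("balanced   :", estimate_rp(P_balanced))
    print("high inflow:", estimate_rp(P_inflow))

Running the sketch shows the high-inflow chain repeating far more often than the balanced one, which is the qualitative effect the paper's upper bounds capture.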
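The abstract attributes repetition to words that many other words predict with high probability. One rough way to surface such words in a fitted bigram model is to count, for each token, how many predecessors assign it large probability. This is an illustrative reading of the abstract's phrase, not the paper's formal inflow definition, and the threshold tau below is a hypothetical cutoff.

    import numpy as np

    def inflow(P, tau=0.2):
        # For each token w, count predecessors v with P(w | v) >= tau.
        # tau is a hypothetical "high probability" cutoff.
        return (P >= tau).sum(axis=0)

    n = 50
    rng = np.random.default_rng(1)
    P = rng.dirichlet(np.ones(n), size=n)
    P[:, 0] += 5.0                      # funnel mass into token 0
    P /= P.sum(axis=1, keepdims=True)

    counts = inflow(P)
    print("inflow of token 0:", int(counts[0]))        # near n: almost every predecessor favors it
    print("median inflow    :", int(np.median(counts)))  # near 0 for the remaining tokens

Tokens whose inflow sits far above the median are the ones a rebalancing scheme would target; the abstract states that the proposed rebalanced encoding alleviates this concentration, with the exact mechanism given in the paper and the linked source code.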

Published

2021-05-18

How to Cite

Fu, Z., Lam, W., So, A. M.-C., & Shi, B. (2021). A Theoretical Analysis of the Repetition Problem in Text Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(14), 12848-12856. https://doi.org/10.1609/aaai.v35i14.17520

Section

AAAI Technical Track on Speech and Natural Language Processing I