TY - JOUR
AU - Noraset, Thanapon
AU - Demeter, David
AU - Downey, Doug
PY - 2018/04/27
Y2 - 2024/03/29
TI - Controlling Global Statistics in Recurrent Neural Network Text Generation
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 32
IS - 1
SE - Main Track: NLP and Machine Learning
DO - 10.1609/aaai.v32i1.11993
UR - https://ojs.aaai.org/index.php/AAAI/article/view/11993
SP -
AB - Recurrent neural network language models (RNNLMs) are an essential component for many language generation tasks such as machine translation, summarization, and automated conversation. Often, we would like to subject the text generated by the RNNLM to constraints, in order to overcome systemic errors (e.g. word repetition) or achieve application-specific goals (e.g. more positive sentiment). In this paper, we present a method for training RNNLMs to simultaneously optimize likelihood and follow a given set of statistical constraints on text generation. The problem is challenging because the statistical constraints are defined over aggregate model behavior, rather than model parameters, meaning that a straightforward parameter regularization approach is insufficient. We solve this problem using a dynamic regularizer that updates as training proceeds, based on the generative behavior of the RNNLMs. Our experiments show that the dynamic regularizer outperforms both generic training and a static regularization baseline. The approach is successful at improving word-level repetition statistics by a factor of four in RNNLMs on a definition modeling task. It also improves model perplexity when the statistical constraints are $n$-gram statistics taken from a large corpus.
ER -