TY  - JOUR
AU  - Parthiban, Dwarak Govind
AU  - Mao, Yongyi
AU  - Inkpen, Diana
PY  - 2021/05/18
Y2  - 2024/03/29
TI  - On the Softmax Bottleneck of Recurrent Language Models
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 35
IS  - 15
SE  - AAAI Technical Track on Speech and Natural Language Processing II
DO  - 10.1609/aaai.v35i15.17608
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/17608
SP  - 13640-13647
AB  - Recent research has pointed to a limitation of word-level neural language models with softmax outputs. This limitation, known as the softmax bottleneck, refers to the inability of these models to produce high-rank log probability (log P) matrices. Various solutions have been proposed to break this bottleneck, including Mixture of Softmaxes, SigSoftmax, and Linear Monotonic Softmax with Piecewise Linear Increasing Functions. They were reported to offer better performance in terms of perplexity on test data. A natural inference from these results is that there is a strong positive correlation between the rank of the log P matrix and the model's performance. In this work, we show via an extensive empirical study that such a correlation is fairly weak and that a high rank of the log P matrix is neither necessary nor sufficient for better test perplexity. Although our results are empirical, they are established in part via the construction of a rich family of models, which we call Generalized SigSoftmax, that can produce diverse ranks for the log P matrices. We also present an investigation into why the proposed solutions achieve better performance.
ER  - 