Lexical Simplification with Pretrained Encoders

Authors

  • Jipeng Qiang, Yangzhou University
  • Yun Li, Yangzhou University
  • Yi Zhu, Yangzhou University
  • Yunhao Yuan, Yangzhou University
  • Xindong Wu, Hefei University of Technology

DOI:

https://doi.org/10.1609/aaai.v34i05.6389

Abstract

Lexical simplification (LS) aims to replace complex words in a given sentence with simpler alternatives of equivalent meaning. Recent unsupervised lexical simplification approaches rely only on the complex word itself, regardless of the given sentence, to generate candidate substitutions, which inevitably produces a large number of spurious candidates. We present a simple LS approach based on Bidirectional Encoder Representations from Transformers (BERT) that considers both the given sentence and the complex word when generating candidate substitutions. Specifically, we mask the complex word in the original sentence and feed the sentence into BERT to predict the masked token; the predictions are used as candidate substitutions. Despite being entirely unsupervised, experimental results show that our approach obtains clear improvements over baselines that leverage linguistic databases and parallel corpora, outperforming the state of the art by more than 12 Accuracy points on three well-known benchmarks.
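The candidate-generation step described above can be sketched with an off-the-shelf masked language model. The snippet below is a minimal illustration, not the authors' implementation: it assumes the Hugging Face transformers library, the bert-base-uncased checkpoint, and a top_k cutoff chosen for demonstration.

```python
# Sketch of masked-LM candidate generation: mask the complex word in context,
# let BERT predict the masked token, and treat the top predictions as
# candidate substitutions. Model name and top_k are illustrative assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def candidate_substitutions(sentence: str, complex_word: str, top_k: int = 10):
    """Return BERT's top predictions for the complex word masked out in its sentence."""
    masked = sentence.replace(complex_word, fill_mask.tokenizer.mask_token, 1)
    predictions = fill_mask(masked, top_k=top_k)
    # Drop the complex word itself if BERT simply predicts it back.
    return [p["token_str"] for p in predictions
            if p["token_str"].lower() != complex_word.lower()]

print(candidate_substitutions("The cat perched on the mat.", "perched"))
```

Because the whole sentence is given to the encoder, the predictions are conditioned on the surrounding context rather than on the complex word alone, which is the property the abstract highlights over earlier unsupervised approaches.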

Published

2020-04-03

How to Cite

Qiang, J., Li, Y., Zhu, Y., Yuan, Y., & Wu, X. (2020). Lexical Simplification with Pretrained Encoders. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8649-8656. https://doi.org/10.1609/aaai.v34i05.6389

Section

AAAI Technical Track: Natural Language Processing