Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)
DOI: https://doi.org/10.1609/aaai.v34i10.7202

Abstract
Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotations and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
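To illustrate the general idea of using a pre-trained masked language model for correction with no task-specific training data, below is a minimal sketch using the Hugging Face transformers fill-mask pipeline: each token is masked in turn and BERT proposes in-context replacements. The checkpoint choice, masking strategy, and confidence threshold are illustrative assumptions, not the exact pipeline from the paper.

```python
from transformers import pipeline

# A fill-mask pipeline over vanilla BERT; the checkpoint is an
# illustrative choice, not necessarily the one used in the paper.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

# A sentence with a subject-verb agreement error ("go" -> "goes").
tokens = ["She", "go", "to", "school", "every", "day", "."]

# Mask each position in turn and let BERT propose replacements in context.
for i, tok in enumerate(tokens):
    masked = " ".join(tokens[:i] + [fill_mask.tokenizer.mask_token] + tokens[i + 1:])
    for pred in fill_mask(masked, top_k=3):
        candidate = pred["token_str"].strip()
        # Skip subword pieces and trivial matches; the 0.3 confidence
        # threshold is an arbitrary illustrative value.
        if candidate.startswith("##") or candidate.lower() == tok.lower():
            continue
        if pred["score"] > 0.3:
            print(f"position {i}: '{tok}' -> '{candidate}' (p={pred['score']:.2f})")
        break
```

A full system would also need to decide which positions actually contain errors and handle insertions and deletions; this sketch only shows substitution driven by BERT's contextual predictions.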
Published: 2020-04-03
How to Cite
Li, Y., Anastasopoulos, A., & Black, A. W. (2020). Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 34(10), 13859-13860. https://doi.org/10.1609/aaai.v34i10.7202
Section: Student Abstract Track