| Sequence tagging + token-level transformations + two-stage fine-tuning (+RoBERTa, XLNet) | 73.7 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite | |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+XLNet) | 72.4 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite | |
| Transformer + Pre-train with Pseudo Data | 70.2 | An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction | |
| Transformer + Pre-train with Pseudo Data (+BERT) | 69.8 | Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction | |