Grammatical Error Correction on BEA-2019 Test (SOTA leaderboard, HyperAI)
Metric: F0.5

Performance results of various models on this benchmark:
| Model Name | F0.5 | Paper Title |
|---|---|---|
| Majority-voting ensemble on best 7 models | 81.4 | Pillars of Grammatical Error Correction: Comprehensive Inspection of Contemporary Approaches in the Era of Large Language Models |
| GRECO (voting+ESC) | 80.84 | System Combination via Quality Estimation for Grammatical Error Correction |
| ESC | 79.90 | Frustratingly Easy System Combination for Grammatical Error Correction |
| RedPenNet | 77.60 | RedPenNet for Grammatical Error Correction: Outputs to Tokens, Attentions to Spans |
| clang_large_ft2-gector | 77.1 | Improved grammatical error correction by ranking elementary edits |
| Unsupervised GEC + cLang8 | 76.5 | Unsupervised Grammatical Error Correction Rivaling Supervised Methods |
| DeBERTa + RoBERTa + XLNet | 76.05 | Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction |
| MoECE | 74.07 | Efficient and Interpretable Grammatical Error Correction with Mixture of Experts |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+RoBERTa, XLNet) | 73.7 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| BEA Combination | 73.2 | Learning to combine Grammatical Error Corrections |
| GEC-DI (LM+GED) | 73.1 | Improving Seq2Seq Grammatical Error Correction via Decoding Interventions |
| LM-Critic | 72.9 | LM-Critic: Language Models for Unsupervised Grammatical Error Correction |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+XLNet) | 72.4 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| Transformer + Pre-train with Pseudo Data | 70.2 | An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction |
| Transformer + Pre-train with Pseudo Data (+BERT) | 69.8 | Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction |
| Transformer | 69.5 | Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data |
| Transformer | 69.0 | A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning |
| VERNet | 68.9 | Neural Quality Estimation with Multiple Hypotheses for Grammatical Error Correction |
| Ensemble of models | 66.78 | The LAIX Systems in the BEA-2019 GEC Shared Task |
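The F0.5 scores above are edit-based: on the BEA-2019 test set, a system's proposed edits are matched against reference edits (via the ERRANT scorer) to obtain precision and recall, which are then combined with the F-beta formula at beta = 0.5, weighting precision twice as heavily as recall. A minimal sketch of that final combination step, assuming precision and recall have already been computed (the `f_beta` helper and the example numbers are illustrative, not from any listed system):

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """F-beta score; beta < 1 favors precision, beta > 1 favors recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical system with precision 0.85 and recall 0.65:
print(round(f_beta(0.85, 0.65), 4))  # → 0.8007
```

Note that because beta = 0.5, a precision-heavy system scores higher than a recall-heavy one with the same arithmetic mean, which is why conservative, high-precision GEC systems dominate this leaderboard.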