HyperAI
Grammatical Error Correction
Grammatical Error Correction on the BEA-2019 Test Set
Metric
F0.5
Results
Performance of different models on this benchmark
Model Name | F0.5 | Paper Title
Majority-voting ensemble on best 7 models | 81.4 | Pillars of Grammatical Error Correction: Comprehensive Inspection Of Contemporary Approaches In The Era of Large Language Models
GRECO (voting+ESC) | 80.84 | System Combination via Quality Estimation for Grammatical Error Correction
ESC | 79.90 | Frustratingly Easy System Combination for Grammatical Error Correction
RedPenNet | 77.60 | RedPenNet for Grammatical Error Correction: Outputs to Tokens, Attentions to Spans
clang_large_ft2-gector | 77.1 | Improved grammatical error correction by ranking elementary edits
Unsupervised GEC + cLang8 | 76.5 | Unsupervised Grammatical Error Correction Rivaling Supervised Methods
DeBERTa + RoBERTa + XLNet | 76.05 | Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction
MoECE | 74.07 | Efficient and Interpretable Grammatical Error Correction with Mixture of Experts
Sequence tagging + token-level transformations + two-stage fine-tuning (+RoBERTa, XLNet) | 73.7 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite
BEA Combination | 73.2 | Learning to combine Grammatical Error Corrections
GEC-DI (LM+GED) | 73.1 | Improving Seq2Seq Grammatical Error Correction via Decoding Interventions
LM-Critic | 72.9 | LM-Critic: Language Models for Unsupervised Grammatical Error Correction
Sequence tagging + token-level transformations + two-stage fine-tuning (+XLNet) | 72.4 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite
Transformer + Pre-train with Pseudo Data | 70.2 | An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction
Transformer + Pre-train with Pseudo Data (+BERT) | 69.8 | Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction
Transformer | 69.5 | Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data
Transformer | 69.0 | A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning
VERNet | 68.9 | Neural Quality Estimation with Multiple Hypotheses for Grammatical Error Correction
Ensemble of models | 66.78 | The LAIX Systems in the BEA-2019 GEC Shared Task
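The F0.5 metric used on this leaderboard is the F-beta score with beta = 0.5, which weights precision twice as heavily as recall; this is the standard choice for grammatical error correction, where a spurious "correction" is considered worse than a missed error. A minimal sketch of the computation (the formula is standard; the function name and the example precision/recall values are illustrative, not taken from the leaderboard):

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """F-beta score: beta < 1 favors precision, beta > 1 favors recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative values: a GEC system with high precision but modest recall.
print(round(f_beta(0.8, 0.5), 4))  # → 0.7143
```

Note that with these inputs, F0.5 (0.7143) sits well above the plain F1 score (0.6154), reflecting the metric's bias toward precise systems.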