Grammatical Error Correction on CoNLL-2014
Metrics: F0.5, Precision, Recall
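F0.5 is the standard precision-weighted F-measure used for GEC evaluation: with β = 0.5 it weights precision twice as much as recall, reflecting that an incorrect correction is considered more harmful than a missed one. On CoNLL-2014 the scores are conventionally computed with the MaxMatch (M2) scorer over edit-level counts, but the relationship between the three reported columns is simply:

$$
F_{\beta} = \frac{(1+\beta^{2})\,P\,R}{\beta^{2}P + R},
\qquad
F_{0.5} = \frac{1.25\,P\,R}{0.25\,P + R}.
$$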
Results
Performance of different models on this benchmark (scores in %; "-" means not reported).
| Model Name | F0.5 | Precision | Recall | Paper Title |
|---|---|---|---|---|
| Ensembles of best 7 models + GRECO + GPT-rerank | 72.8 | 83.9 | 47.5 | Pillars of Grammatical Error Correction: Comprehensive Inspection Of Contemporary Approaches In The Era of Large Language Models |
| Majority-voting ensemble on best 7 models | 71.8 | 83.7 | 45.7 | Pillars of Grammatical Error Correction: Comprehensive Inspection Of Contemporary Approaches In The Era of Large Language Models |
| GRECO (voting+ESC) | 71.12 | 79.6 | 49.86 | System Combination via Quality Estimation for Grammatical Error Correction |
| GEC-DI (LM+GED) | 69.6 | 79.2 | 46.8 | Improving Seq2Seq Grammatical Error Correction via Decoding Interventions |
| Unsupervised GEC + cLang8 | 69.6 | 75.0 | 53.8 | Unsupervised Grammatical Error Correction Rivaling Supervised Methods |
| ESC | 69.51 | 81.48 | 43.78 | Frustratingly Easy System Combination for Grammatical Error Correction |
| T5 | 68.87 | - | - | A Simple Recipe for Multilingual Grammatical Error Correction |
| MoECE | 67.79 | 74.29 | 50.21 | Efficient and Interpretable Grammatical Error Correction with Mixture of Experts |
| SynGEC | 67.6 | 74.7 | 49.0 | SynGEC: Syntax-Enhanced Grammatical Error Correction with a Tailored GEC-Oriented Parser |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+BERT, RoBERTa, XLNet) | 66.5 | 78.2 | 41.5 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| LM-Critic | 65.8 | - | - | LM-Critic: Language Models for Unsupervised Grammatical Error Correction |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+XLNet) | 65.3 | 77.5 | 40.1 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| Transformer + Pre-train with Pseudo Data (+BERT) | 65.2 | - | - | Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction |
| Transformer + Pre-train with Pseudo Data | 65.0 | - | - | An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction |
| VERNet | 63.7 | - | - | Neural Quality Estimation with Multiple Hypotheses for Grammatical Error Correction |
| BART | 63.0 | 69.9 | 45.1 | Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model |
| Sequence Labeling with edits using BERT, Faster inference | 61.2 | - | - | Parallel Iterative Edit Models for Local Sequence Transduction |
| Copy-augmented Model (4 Ensemble + Denoising Autoencoder) | 61.15 | 71.57 | 38.65 | Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data |
| Sequence Labeling with edits using BERT, Faster inference (Single Model) | 59.7 | - | - | Parallel Iterative Edit Models for Local Sequence Transduction |
| CNN Seq2Seq + Quality Estimation | 56.52 | - | - | Neural Quality Estimation of Grammatical Error Correction |
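As a quick sanity check, the F0.5 column can be recomputed from the Precision and Recall columns using the formula above. The sketch below is not the official M2 scorer (which works on edit-level counts); it only re-derives F0.5 from the reported percentages for a few rows of the table.

```python
# Minimal sketch: recompute F0.5 from the reported Precision/Recall percentages.
# This is NOT the official MaxMatch (M2) scorer; it only checks the arithmetic
# relationship between the three score columns in the table above.

def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """Weighted F-measure; beta=0.5 weights precision twice as much as recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

# (model, precision, recall, reported F0.5) copied from the table above.
rows = [
    ("GRECO (voting+ESC)", 79.6, 49.86, 71.12),
    ("SynGEC", 74.7, 49.0, 67.6),
    ("MoECE", 74.29, 50.21, 67.79),
]

for name, p, r, reported in rows:
    print(f"{name}: recomputed F0.5 = {f_beta(p, r):.2f} (reported {reported})")
```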