HyperAI
Grammatical Error Correction

Grammatical Error Correction on BEA-2019 Test
Evaluation metric: F0.5

Evaluation results: the performance of each model on this benchmark.
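F0.5 is the F-beta measure with beta = 0.5, which weights precision twice as heavily as recall; in GEC evaluation it is computed over edit-level true positives, false positives, and false negatives. Below is a minimal sketch of that formula; the specific counts in the example are illustrative, not taken from any system on this leaderboard.

```python
def f_beta(tp: int, fp: int, fn: int, beta: float = 0.5) -> float:
    """F-beta score from edit-level counts.

    beta < 1 favours precision, beta > 1 favours recall;
    beta = 0.5 is the standard GEC setting (F0.5).
    """
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative counts: 60 correct edits, 20 spurious, 40 missed
# -> precision = 0.75, recall = 0.60, F0.5 = 5/7 ~ 0.714
score = f_beta(60, 20, 40)
```

Because precision dominates, a conservative system that proposes fewer but more accurate corrections can outscore a more aggressive one with higher recall.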
| Model | F0.5 | Paper |
|---|---|---|
| Majority-voting ensemble on best 7 models | 81.4 | Pillars of Grammatical Error Correction: Comprehensive Inspection Of Contemporary Approaches In The Era of Large Language Models |
| GRECO (voting+ESC) | 80.84 | System Combination via Quality Estimation for Grammatical Error Correction |
| ESC | 79.90 | Frustratingly Easy System Combination for Grammatical Error Correction |
| RedPenNet | 77.60 | RedPenNet for Grammatical Error Correction: Outputs to Tokens, Attentions to Spans |
| clang_large_ft2-gector | 77.1 | Improved grammatical error correction by ranking elementary edits |
| Unsupervised GEC + cLang8 | 76.5 | Unsupervised Grammatical Error Correction Rivaling Supervised Methods |
| DeBERTa + RoBERTa + XLNet | 76.05 | Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction |
| MoECE | 74.07 | Efficient and Interpretable Grammatical Error Correction with Mixture of Experts |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+RoBERTa, XLNet) | 73.7 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| BEA Combination | 73.2 | Learning to combine Grammatical Error Corrections |
| GEC-DI (LM+GED) | 73.1 | Improving Seq2Seq Grammatical Error Correction via Decoding Interventions |
| LM-Critic | 72.9 | LM-Critic: Language Models for Unsupervised Grammatical Error Correction |
| Sequence tagging + token-level transformations + two-stage fine-tuning (+XLNet) | 72.4 | GECToR -- Grammatical Error Correction: Tag, Not Rewrite |
| Transformer + Pre-train with Pseudo Data | 70.2 | An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction |
| Transformer + Pre-train with Pseudo Data (+BERT) | 69.8 | Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction |
| Transformer | 69.5 | Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data |
| Transformer | 69.0 | A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning |
| VERNet | 68.9 | Neural Quality Estimation with Multiple Hypotheses for Grammatical Error Correction |
| Ensemble of models | 66.78 | The LAIX Systems in the BEA-2019 GEC Shared Task |