Unsupervised Machine Translation on WMT2016
Evaluation Metric
BLEU
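The scores in the table below are corpus-level BLEU values. As a minimal sketch (not taken from any of the listed papers), the following shows how such a score is typically computed with the sacrebleu library; the hypothesis and reference sentences are purely illustrative.

```python
# Minimal BLEU computation sketch using sacrebleu.
# The sentences below are hypothetical examples, not data from the benchmark.
import sacrebleu

# System outputs: one translated sentence per line.
hypotheses = [
    "the cat sat on the mat",
    "there is a book on the table",
]

# Reference translations: sacrebleu expects a list of reference streams,
# each aligned line-by-line with the hypotheses.
references = [[
    "the cat is sitting on the mat",
    "there is a book on the table",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")  # corpus-level BLEU on a 0-100 scale
```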
Evaluation Results
Performance of each model on this benchmark:
Model | BLEU | Paper Title | Repository
---|---|---|---
GPT-3 175B (Few-Shot) | 29.7 | Language Models are Few-Shot Learners |
MASS (6-layer Transformer) | 28.3 | MASS: Masked Sequence to Sequence Pre-training for Language Generation |
SMT + NMT (tuning and joint refinement) | 26.9 | An Effective Approach to Unsupervised Machine Translation |
MLM pretraining for encoder and decoder | 26.4 | Cross-lingual Language Model Pretraining |
SMT as posterior regularization | 21.7 | Unsupervised Neural Machine Translation with SMT as Posterior Regularization |
PBSMT + NMT | 20.2 | Phrase-Based & Neural Unsupervised Machine Translation |
Synthetic bilingual data init | 20.0 | Unsupervised Neural Machine Translation Initialized by Unsupervised Statistical Machine Translation |