Unsupervised Machine Translation on WMT2016
Evaluation Metric
BLEU
Evaluation Results
Performance of each model on this benchmark
| Model | BLEU | Paper Title |
|---|---|---|
| GPT-3 175B (Few-Shot) | 40.6 | Language Models are Few-Shot Learners |
| MASS (6-layer Transformer) | 35.2 | MASS: Masked Sequence to Sequence Pre-training for Language Generation |
| SMT + NMT (tuning and joint refinement) | 34.4 | An Effective Approach to Unsupervised Machine Translation |
| MLM pretraining for encoder and decoder | 34.3 | Cross-lingual Language Model Pretraining |
| Synthetic bilingual data init | 26.7 | Unsupervised Neural Machine Translation Initialized by Unsupervised Statistical Machine Translation |
| SMT as posterior regularization | 26.3 | Unsupervised Neural Machine Translation with SMT as Posterior Regularization |
| PBSMT | 25.2 | Phrase-Based & Neural Unsupervised Machine Translation |
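The scores above are corpus-level BLEU on the WMT2016 test set. As a minimal, illustrative sketch (not tied to any of the listed systems), corpus BLEU is commonly computed with the sacreBLEU library; the hypothesis and reference sentences below are hypothetical placeholders.

```python
# Minimal sketch: corpus-level BLEU with sacreBLEU.
# The example sentences are hypothetical placeholders, not WMT2016 data.
import sacrebleu

# One system output (hypothesis) and one reference per source sentence.
hypotheses = [
    "The cat sat on the mat .",
    "He went to the market yesterday .",
]
references = [
    "The cat sat on the mat .",
    "Yesterday he went to the market .",
]

# sacreBLEU expects a list of reference streams (one list per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.1f}")
```

Because sacreBLEU fixes its own tokenization and smoothing, scores computed this way are comparable across systems, which is why leaderboards such as this one report it as the shared metric.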