Machine Translation on WMT2016 German-English
Metrics
BLEU score
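All results on this benchmark are reported as corpus-level BLEU. As a quick illustration of how such a score can be computed, here is a minimal sketch using the sacrebleu Python package; the hypothesis and reference sentences are made-up placeholders, not benchmark data.

```python
# Minimal corpus-level BLEU computation with sacrebleu.
# The sentences below are illustrative placeholders, not WMT2016 data.
import sacrebleu

hypotheses = [
    "the cat sat on the mat",
    "there is a book on the table",
]
references = [
    "the cat sat on the mat",
    "a book is on the table",
]

# sacrebleu takes one list of hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```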
Results
Performance results of various models on this benchmark
Model | BLEU score | Paper Title | Repository |
---|---|---|---|
Linguistic Input Features | 32.9 | Linguistic Input Features Improve Neural Machine Translation | |
FLAN 137B (zero-shot) | 38.9 | Finetuned Language Models Are Zero-Shot Learners | |
Exploiting Mono at Scale (single) | - | Exploiting Monolingual Data at Scale for Neural Machine Translation | - |
Attentional encoder-decoder + BPE | 38.6 | Edinburgh Neural Machine Translation Systems for WMT 16 | |
Unsupervised NMT + weight-sharing | 14.62 | Unsupervised Neural Machine Translation with Weight Sharing | |
Unsupervised S2S with attention | 13.33 | Unsupervised Machine Translation Using Monolingual Corpora Only | |
SMT + iterative backtranslation (unsupervised) | 23.05 | Unsupervised Statistical Machine Translation | |
FLAN 137B (few-shot, k=11) | 40.7 | Finetuned Language Models Are Zero-Shot Learners | |