Machine Translation on WMT2016 English
Metrics
BLEU score
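
BLEU measures n-gram overlap between a system's translations and one or more reference translations, combined with a brevity penalty. As a point of reference, the sketch below shows one common way to compute a corpus-level BLEU score using the sacrebleu package; the sentences are purely illustrative, and the papers listed on this leaderboard may use different tokenization, casing, or evaluation scripts.

```python
# Minimal sketch of corpus-level BLEU with sacrebleu (assumed setup;
# the exact evaluation pipeline behind these reported scores may differ).
import sacrebleu

# Hypothetical system outputs and reference translations, one sentence per entry.
hypotheses = [
    "the cat sat on the mat",
    "there is a book on the table",
]
references = [
    "the cat sat on the mat",
    "a book is on the table",
]

# corpus_bleu takes the hypotheses and a list of reference streams
# (multiple reference sets are supported; here we pass a single one).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```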
Results
Performance results of various models on this benchmark
| Model Name | BLEU score | Paper Title | Repository |
|---|---|---|---|
| Unsupervised PBSMT | 13.37 | Phrase-Based & Neural Unsupervised Machine Translation | |
| PBSMT + NMT | 13.76 | Phrase-Based & Neural Unsupervised Machine Translation | |
| Unsupervised NMT + Transformer | 7.98 | Phrase-Based & Neural Unsupervised Machine Translation | |
| Attentional encoder-decoder + BPE | 26.0 | Edinburgh Neural Machine Translation Systems for WMT 16 | |