Machine Translation on WMT2015 English-German
Metrics
BLEU score
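Scores on this benchmark are corpus-level BLEU. As a minimal sketch of how such a score can be computed, the snippet below uses the sacrebleu library; the hypothesis and reference sentences are placeholders, not actual WMT newstest2015 data:

```python
import sacrebleu

# Placeholder system outputs and references; a real evaluation would use
# the official newstest2015 English-German test set.
hypotheses = ["the cat sat on the mat .", "he read the book ."]
references = ["the cat sat on the mat .", "he reads the book ."]

# sacrebleu expects a list of reference streams (one list per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```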
Results
Performance results of various models on this benchmark
Comparison table
| Model name | BLEU score |
|---|---|
| unsupervised-neural-machine-translation | 6.89 |
| neural-machine-translation-in-linear-time | 26.3 |
| Model 3 | 24.1 |
| a-character-level-decoder-without-explicit | 23.5 |
| neural-machine-translation-of-rare-words-with | 22.8 |
| a-character-level-decoder-without-explicit | 21.7 |