HyperAI
Machine Translation on IWSLT2015 English-Vietnamese
Metrics: BLEU

Results
Performance results of various models on this benchmark
| Model | BLEU | Paper |
| --- | --- | --- |
| EnViT5 + MTet | 40.2 | MTet: Multi-domain Translation for English and Vietnamese |
| Self-Adaptive Control of Temperature | 29.12 | Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation |
| Tall Transformer with Style-Augmented Training | 37.8 | Better Translation for Vietnamese |
| SAWR | 29.09 | Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations |
| Transformer+BPE-dropout | 33.27 | BPE-Dropout: Simple and Effective Subword Regularization |
| LSTM+Attention+Ensemble | 26.4 | Stanford Neural Machine Translation Systems for Spoken Language Domains |
| Transformer+BPE+FixNorm+ScaleNorm | 32.8 | Transformers without Tears: Improving the Normalization of Self-Attention |
| NLLB-200 | - | No Language Left Behind: Scaling Human-Centered Machine Translation |
| CVT | 29.6 | Semi-Supervised Sequence Modeling with Cross-View Training |
| Transformer+LayerNorm-simple | 31.4 | Understanding and Improving Layer Normalization |
| DeconvDec | 28.47 | Deconvolution-Based Global Decoding for Neural Machine Translation |
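The scores above are BLEU, which compares the n-grams of a system translation against a reference. The exact tooling and tokenization behind each reported number vary by paper (many use sacreBLEU or multi-bleu), so the sketch below is only an illustration: a minimal single-reference corpus BLEU with uniform 1–4-gram weights, clipped counts, and a brevity penalty, written against the standard library.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All n-grams of a token list, as a multiset (Counter)."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU (0-100), one reference per hypothesis.

    Simplified sketch: whitespace tokenization, uniform n-gram weights,
    no smoothing -- real evaluations should use a tool like sacreBLEU.
    """
    clipped = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # total hypothesis n-grams, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ng, r_ng = ngrams(h, n), ngrams(r, n)
            # Counter intersection clips each n-gram count at the reference count.
            clipped[n - 1] += sum((h_ng & r_ng).values())
            totals[n - 1] += sum(h_ng.values())
    if min(clipped) == 0:  # some order has zero matches: BLEU is 0 without smoothing
        return 0.0
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100, and the score drops as n-gram overlap with the reference shrinks or as the hypothesis becomes shorter than the reference.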