Machine Translation on WMT2014 English-French

Metric: BLEU score

Results
Performance results of various models on this benchmark.
| Model name | BLEU score | Paper Title | Repository |
|---|---|---|---|
| CSLM + RNN + WP | 34.54 | Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation | |
| LightConv | 43.1 | Pay Less Attention with Lightweight and Dynamic Convolutions | |
| GRU+Attention | 26.4 | Can Active Memory Replace Attention? | |
| Transformer Big | 41.0 | Attention Is All You Need | |
| RNMT+ | 41.0 | The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation | |
| Deep-Att | 35.9 | Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation | |
| Transformer Base | 38.1 | Attention Is All You Need | |
| MoE | 40.56 | Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer | |
| LSTM | 34.8 | Sequence to Sequence Learning with Neural Networks | |
| RNN-search50* | 36.2 | Neural Machine Translation by Jointly Learning to Align and Translate | |
| Transformer+BT (ADMIN init) | 46.4 | Very Deep Transformers for Neural Machine Translation | |
| ResMLP-12 | 40.6 | ResMLP: Feedforward networks for image classification with data-efficient training | |
| Noisy back-translation | 45.6 | Understanding Back-Translation at Scale | |
| Rfa-Gate-arccos | 39.2 | Random Feature Attention | - |
| Unsupervised PBSMT | 28.11 | Phrase-Based & Neural Unsupervised Machine Translation | |
| TransformerBase + AutoDropout | 40 | AutoDropout: Learning Dropout Patterns to Regularize Deep Networks | |
| ConvS2S (ensemble) | 41.3 | Convolutional Sequence to Sequence Learning | |
| PBMT | 37 | - | - |
| Transformer (big) + Relative Position Representations | 41.5 | Self-Attention with Relative Position Representations | |
| Unsupervised attentional encoder-decoder + BPE | 14.36 | Unsupervised Neural Machine Translation | |
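Scores on this benchmark are corpus-level BLEU on the newstest2014 English-French test set. Note that individual papers differ in tokenization and casing conventions, so the values above are not always strictly comparable. The snippet below is a minimal sketch of how such a score can be computed with the sacrebleu Python package; the file names `system_output.fr` and `newstest2014.fr` are hypothetical placeholders for detokenized system translations and the reference translations, and this is not the leaderboard's own evaluation harness.

```python
# Minimal sketch: corpus-level BLEU with sacrebleu.
# File names below are hypothetical placeholders.
import sacrebleu

with open("system_output.fr", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]  # detokenized system translations

with open("newstest2014.fr", encoding="utf-8") as f:
    references = [line.strip() for line in f]  # reference translations

# corpus_bleu takes a list of hypothesis strings and a list of reference
# streams (one list per reference set); newstest2014 has a single reference.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```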