HyperAI
SOTA / Machine Translation

Machine Translation on IWSLT2015 English-Vietnamese
Metrics
BLEU

Results
Performance of different models on this benchmark.
| Model Name | BLEU | Paper Title | Repository |
|---|---|---|---|
| EnViT5 + MTet | 40.2 | MTet: Multi-domain Translation for English and Vietnamese | |
| Self-Adaptive Control of Temperature | 29.12 | Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation | |
| Tall Transformer with Style-Augmented Training | 37.8 | Better Translation for Vietnamese | |
| SAWR | 29.09 | Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations | - |
| Transformer+BPE-dropout | 33.27 | BPE-Dropout: Simple and Effective Subword Regularization | |
| LSTM+Attention+Ensemble | 26.4 | Stanford Neural Machine Translation Systems for Spoken Language Domains | |
| Transformer+BPE+FixNorm+ScaleNorm | 32.8 | Transformers without Tears: Improving the Normalization of Self-Attention | |
| NLLB-200 | - | No Language Left Behind: Scaling Human-Centered Machine Translation | |
| CVT | 29.6 | Semi-Supervised Sequence Modeling with Cross-View Training | |
| Transformer+LayerNorm-simple | 31.4 | Understanding and Improving Layer Normalization | - |
| DeconvDec | 28.47 | Deconvolution-Based Global Decoding for Neural Machine Translation | |
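The scores above are corpus-level BLEU values. As a minimal sketch of what that metric measures, the snippet below implements plain sentence-level BLEU (modified n-gram precision combined with a brevity penalty, no smoothing); production leaderboards typically use a standardized toolkit such as sacrebleu rather than a hand-rolled version like this, and the `bleu` helper and its tokenization-by-whitespace are illustrative assumptions, not the benchmark's exact scoring code.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    # Multiset of n-grams of length n in the token list.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def bleu(hypothesis, reference, max_n=4):
    """Unsmoothed sentence-level BLEU in the 0-100 range (illustrative only).

    Geometric mean of modified 1..max_n-gram precisions, scaled by a
    brevity penalty that punishes hypotheses shorter than the reference.
    """
    hyp, ref = hypothesis.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        h, r = ngrams(hyp, n), ngrams(ref, n)
        # Clip each hypothesis n-gram count by its count in the reference.
        overlap = sum(min(count, r[gram]) for gram, count in h.items())
        total = max(sum(h.values()), 1)
        if overlap == 0:
            return 0.0  # without smoothing, any zero precision zeroes BLEU
        log_prec += math.log(overlap / total) / max_n
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return 100 * bp * math.exp(log_prec)


# A perfect match scores 100; partial overlap scores somewhere in between.
print(round(bleu("the cat sat on the mat", "the cat sat on the mat"), 1))  # 100.0
```

Real evaluations aggregate clipped counts over the whole test corpus before taking the geometric mean, which is why corpus BLEU is not simply an average of sentence BLEU scores.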