HyperAI
Machine Translation on IWSLT2015 German
Metrics
BLEU score
Results
Performance of the various models on this benchmark
| Model name | BLEU score | Paper Title |
|---|---|---|
| Word-level CNN w/attn, input feeding | 24.0 | Sequence-to-Sequence Learning as Beam-Search Optimization |
| Conv-LSTM (deep+pos) | 30.4 | A Convolutional Encoder Model for Neural Machine Translation |
| QRNN | 19.41 | Quasi-Recurrent Neural Networks |
| FlowSeq-base | 24.75 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow |
| Denoising autoencoders (non-autoregressive) | 32.43 | Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement |
| Word-level LSTM w/attn | 20.2 | Sequence Level Training with Recurrent Neural Networks |
| ConvS2S | 32.31 | Convolutional Sequence to Sequence Learning |
| Transformer with FRAGE | 33.97 | FRAGE: Frequency-Agnostic Word Representation |
| ConvS2S+Risk | 32.93 | Classical Structured Prediction Losses for Sequence to Sequence Learning |
| RNNsearch | 29.98 | An Actor-Critic Algorithm for Sequence Prediction |
| Bi-GRU (MLE+SLE) | 28.53 | Neural Machine Translation by Jointly Learning to Align and Translate |
| Pervasive Attention | 34.18 | Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction |
| NPMT + language model | 30.08 | Towards Neural Phrase-based Machine Translation |
| PS-KD | 36.20 | Self-Knowledge Distillation with Progressive Refinement of Targets |
| DCCL | 29.56 | Compressing Word Embeddings via Deep Compositional Code Learning |
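For reference, the scores above are BLEU: a geometric mean of clipped n-gram precisions (usually up to 4-grams) multiplied by a brevity penalty. The following is a minimal single-reference, sentence-level sketch of that formula, not the evaluation code used by these papers (published results typically use corpus-level BLEU with standard tokenization and smoothing, e.g. via sacrebleu):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Counts of all contiguous n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU against a single reference.

    Illustrative sketch only: no smoothing, one reference, naive
    whitespace tokenization assumed by the caller.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand = ngrams(candidate, n)
        ref = ngrams(reference, n)
        overlap = sum((cand & ref).values())   # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        if overlap == 0:
            return 0.0                          # unsmoothed: any zero precision -> 0
        precisions.append(overlap / total)
    # Geometric mean of the n-gram precisions
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty for candidates shorter than the reference
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / c)
    return 100 * bp * geo_mean

ref = "the cat sat on the mat".split()
hyp = "the cat sat on the mat".split()
print(round(bleu(hyp, ref), 2))  # identical sentences score 100.0
```

A score such as 36.20 (PS-KD above) therefore means the system's output shares roughly that fraction of n-gram overlap with the references, on a 0-100 scale.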