Machine Translation on IWSLT2015 German
Evaluation Metric: BLEU score

Evaluation Results: performance of each model on this benchmark
| Model Name | BLEU score | Paper Title |
|---|---|---|
| PS-KD | 36.20 | Self-Knowledge Distillation with Progressive Refinement of Targets |
| Pervasive Attention | 34.18 | Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction |
| Transformer with FRAGE | 33.97 | FRAGE: Frequency-Agnostic Word Representation |
| ConvS2S+Risk | 32.93 | Classical Structured Prediction Losses for Sequence to Sequence Learning |
| Denoising autoencoders (non-autoregressive) | 32.43 | Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement |
| ConvS2S | 32.31 | Convolutional Sequence to Sequence Learning |
| Conv-LSTM (deep+pos) | 30.4 | A Convolutional Encoder Model for Neural Machine Translation |
| NPMT + language model | 30.08 | Towards Neural Phrase-based Machine Translation |
| RNNsearch | 29.98 | An Actor-Critic Algorithm for Sequence Prediction |
| DCCL | 29.56 | Compressing Word Embeddings via Deep Compositional Code Learning |
| Bi-GRU (MLE+SLE) | 28.53 | Neural Machine Translation by Jointly Learning to Align and Translate |
| FlowSeq-base | 24.75 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow |
| Word-level CNN w/attn, input feeding | 24.0 | Sequence-to-Sequence Learning as Beam-Search Optimization |
| Word-level LSTM w/attn | 20.2 | Sequence Level Training with Recurrent Neural Networks |
| QRNN | 19.41 | Quasi-Recurrent Neural Networks |