Machine Translation on IWSLT2015 English-Vietnamese
Evaluation Metric
BLEU

Evaluation Results
Performance of each model on this benchmark
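The BLEU scores below can be illustrated with a minimal sentence-level sketch: modified n-gram precision (n = 1..4, uniform weights) combined with a brevity penalty. This is a simplified, single-reference version for intuition only; leaderboard numbers are normally corpus-level BLEU computed with standard tooling, and the function name here is our own.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of n-grams in a token list
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU sketch: uniform n-gram weights, one reference."""
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        ref_counts = ngrams(ref, n)
        # Clipped (modified) precision: each n-gram counts at most as
        # often as it appears in the reference
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if overlap == 0:
            return 0.0  # real BLEU smooths this; we keep the sketch simple
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty punishes candidates shorter than the reference
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)
```

A perfect match yields 1.0 (reported on this page scaled to 0-100), and any missing 4-gram overlap drives the unsmoothed score to zero, which is why sentence-level BLEU is usually smoothed in practice.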
| Model | BLEU | Paper Title | Repository |
|---|---|---|---|
| EnViT5 + MTet | 40.2 | MTet: Multi-domain Translation for English and Vietnamese | |
| Tall Transformer with Style-Augmented Training | 37.8 | Better Translation for Vietnamese | - |
| Transformer+BPE-dropout | 33.27 | BPE-Dropout: Simple and Effective Subword Regularization | |
| Transformer+BPE+FixNorm+ScaleNorm | 32.8 | Transformers without Tears: Improving the Normalization of Self-Attention | |
| Transformer+LayerNorm-simple | 31.4 | Understanding and Improving Layer Normalization | |
| CVT | 29.6 | Semi-Supervised Sequence Modeling with Cross-View Training | |
| Self-Adaptive Control of Temperature | 29.12 | Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation | |
| SAWR | 29.09 | Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations | - |
| DeconvDec | 28.47 | Deconvolution-Based Global Decoding for Neural Machine Translation | |
| LSTM+Attention+Ensemble | 26.4 | Stanford Neural Machine Translation Systems for Spoken Language Domains | - |
| NLLB-200 | - | No Language Left Behind: Scaling Human-Centered Machine Translation | |