Machine Translation on WMT2014 English-French
Evaluation Metric: BLEU score
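BLEU measures n-gram overlap between system translations and human references at the corpus level; higher is better. As a minimal, hedged sketch of how such a score is typically computed (the papers listed below may use different tokenization or evaluation scripts, so their numbers are not directly reproducible with this snippet), assuming the sacrebleu Python package and invented example sentences:

```python
# Minimal sketch: corpus-level BLEU with the sacrebleu package.
# The sentences below are invented placeholders, not WMT2014 data.
import sacrebleu

# One detokenized system translation per line.
hypotheses = [
    "The cat sits on the mat.",
    "There is a book on the table.",
]

# sacrebleu expects a list of reference streams; each stream holds
# one reference translation per hypothesis.
references = [[
    "The cat is sitting on the mat.",
    "There is a book on the table.",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")  # score on the 0-100 scale used in the table below
```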
Evaluation Results

Performance of each model on this benchmark:
| Model | BLEU score | Paper |
| --- | --- | --- |
| Transformer+BT (ADMIN init) | 46.4 | Very Deep Transformers for Neural Machine Translation |
| Noisy back-translation | 45.6 | Understanding Back-Translation at Scale |
| mRASP+Fine-Tune | 44.3 | Pre-training Multilingual Neural Machine Translation by Leveraging Alignment Information |
| Transformer + R-Drop | 43.95 | R-Drop: Regularized Dropout for Neural Networks |
| Admin | 43.8 | Understanding the Difficulty of Training Transformers |
| Transformer (ADMIN init) | 43.8 | Very Deep Transformers for Neural Machine Translation |
| BERT-fused NMT | 43.78 | Incorporating BERT into Neural Machine Translation |
| MUSE (Parallel Multi-Scale Attention) | 43.5 | MUSE: Parallel Multi-Scale Attention for Sequence to Sequence Learning |
| T5 | 43.4 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| Local Joint Self-attention | 43.3 | Joint Source-Target Self Attention with Locality Constraints |
| Depth Growing | 43.27 | Depth Growing for Neural Machine Translation |
| DynamicConv | 43.2 | Pay Less Attention with Lightweight and Dynamic Convolutions |
| Transformer Big | 43.2 | Scaling Neural Machine Translation |
| TaLK Convolutions | 43.2 | Time-aware Large Kernel Convolutions |
| LightConv | 43.1 | Pay Less Attention with Lightweight and Dynamic Convolutions |
| FLOATER-large | 42.7 | Learning to Encode Position for Transformer with Continuous Dynamical Model |
| OmniNetP | 42.6 | OmniNet: Omnidirectional Representations from Transformers |
| T2R + Pretrain | 42.1 | Finetuning Pretrained Transformers into RNNs |
| Transformer Big + MoS | 42.1 | Fast and Simple Mixture of Softmaxes with BPE and Hybrid-LightRNN for Language Generation |
| Synthesizer (Random + Vanilla) | 41.85 | Synthesizer: Rethinking Self-Attention in Transformer Models |
Showing 20 of 57 leaderboard entries.