Sign Language Translation on RWTH-PHOENIX
Metrics: BLEU-4
Results
Performance results of various models on this benchmark
| Model Name | BLEU-4 | Paper Title | Repository |
|---|---|---|---|
| SignBERT+ | 25.7 | SignBERT+: Hand-model-aware Self-supervised Pre-training for Sign Language Understanding | - |
| MSKA-SLT | 29.03 | Multi-Stream Keypoint Attention Network for Sign Language Recognition and Translation | |
| BERT2BERT | 21.26 | Frozen Pretrained Transformers for Neural Sign Language Translation | |
| BN-TIN-Transf.+SignBT | 24.32 | Improving Sign Language Translation with Monolingual Data by Sign Back-Translation | - |
| S2T Stochastic Transformer (Ens) | 25.59 | Stochastic Transformer Networks with Linear Competing Units: Application to end-to-end SL Translation | |
| TwoStream-SLT | 28.95 | Two-Stream Network for Sign Language Recognition and Translation | |
| BERT2RND | 22.47 | Frozen Pretrained Transformers for Neural Sign Language Translation | |
| Signformer | 23.43 | Signformer is all you need: Towards Edge AI for Sign Language | |
| BN-TIN-Transf. | 21.68 | Improving Sign Language Translation with Monolingual Data by Sign Back-Translation | - |
| Sign2Gloss2Text | 19.26 | Neural Sign Language Translation | |
| STMC+Transformer (Ens) | 25.40 | Better Sign Language Translation with STMC-Transformer | |
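All entries above are ranked by BLEU-4 on the translated spoken-language text. The leaderboard does not state which evaluation script produced these numbers; the sketch below shows one common way to compute corpus-level BLEU-4 with sacrebleu (which uses up to 4-grams by default). The German hypothesis and reference sentences are hypothetical examples, not outputs from the RWTH-PHOENIX test set.

```python
# Minimal sketch: corpus-level BLEU-4 with sacrebleu.
# The sentences below are hypothetical placeholders, not benchmark data.
import sacrebleu

# Model outputs (spoken-language translations of signed sentences).
hypotheses = [
    "am morgen regnet es im norden",
    "heute ist es sonnig und warm",
]

# One reference translation per hypothesis.
references = [
    "am morgen regnet es im norden",
    "heute wird es sonnig und warm",
]

# corpus_bleu takes the hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU-4: {bleu.score:.2f}")
```

Note that published results on this benchmark may use different tokenization or evaluation scripts, so scores computed this way are not guaranteed to be directly comparable to the table.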