Sign Language Translation on RWTH-PHOENIX
Metric: BLEU-4

Results
Performance results of various models on this benchmark.
| Model name | BLEU-4 | Paper Title | Repository |
|---|---|---|---|
| SignBERT+ | 25.7 | SignBERT+: Hand-model-aware Self-supervised Pre-training for Sign Language Understanding | - |
| MSKA-SLT | 29.03 | Multi-Stream Keypoint Attention Network for Sign Language Recognition and Translation | |
| BERT2BERT | 21.26 | Frozen Pretrained Transformers for Neural Sign Language Translation | |
| BN-TIN-Transf.+SignBT | 24.32 | Improving Sign Language Translation with Monolingual Data by Sign Back-Translation | - |
| S2T Stochastic Transformer (Ens) | 25.59 | Stochastic Transformer Networks with Linear Competing Units: Application to end-to-end SL Translation | |
| TwoStream-SLT | 28.95 | Two-Stream Network for Sign Language Recognition and Translation | |
| BERT2RND | 22.47 | Frozen Pretrained Transformers for Neural Sign Language Translation | |
| Signformer | 23.43 | Signformer is all you need: Towards Edge AI for Sign Language | |
| BN-TIN-Transf. | 21.68 | Improving Sign Language Translation with Monolingual Data by Sign Back-Translation | - |
| Sign2Gloss2Text | 19.26 | Neural Sign Language Translation | |
| STMC+Transformer (Ens) | 25.40 | Better Sign Language Translation with STMC-Transformer | |
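The BLEU-4 figures above are corpus-level 4-gram BLEU scores computed between each model's decoded spoken-language output and the reference translations of the test set. As a minimal sketch of how such a score can be reproduced, the snippet below uses the sacrebleu library; the example sentences, file handling, and tokenization choices are illustrative assumptions, not the exact evaluation protocol of any paper listed here.

```python
# Minimal sketch: corpus-level BLEU-4 with the sacrebleu library.
# The hypothesis/reference strings below are placeholders for
# illustration; papers on this benchmark may differ in tokenization
# and lowercasing conventions.
import sacrebleu

# Hypotheses: one decoded spoken-language sentence per test video.
hypotheses = [
    "am tag viel sonne",
    "morgen regen im norden",
]

# References: one gold translation per test video (single-reference setting).
references = [
    "am tag scheint viel sonne",
    "morgen regnet es im norden",
]

# corpus_bleu uses n-grams up to order 4 by default, i.e. BLEU-4.
# Scores are reported on a 0-100 scale, matching the table above.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU-4: {bleu.score:.2f}")
```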