Relation Extraction on ChemProt
Metrics: Micro F1

Results
Performance results of various models on this benchmark.
| Model name | Micro F1 | Paper Title | Repository |
|---|---|---|---|
| CharacterBERT (base, medical) | 73.44 | CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | - |
| BioM-BERT | - | BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | - |
| SciBERT (Finetune) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| SciBERT (Base Vocab) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| ELECTRAMed | - | ELECTRAMed: a new pre-trained language representation model for biomedical NLP | - |
| PubMedBERT uncased | 77.24 | Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | - |
| SciFive Large | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioLinkBERT (large) | 79.98 | LinkBERT: Pretraining Language Models with Document Links | - |
| KeBioLM | - | Improving Biomedical Pretrained Language Models with Knowledge | - |
| BioMegatron | - | BioMegatron: Larger Biomedical Domain Language Model | - |
| BioT5X (base) | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioBERT | - | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | - |
| NCBI_BERT(large) (P) | - | Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets | - |
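For context, the Micro F1 reported above is the micro-averaged F1 over the ChemProt relation classes. The sketch below shows how such a score can be computed with scikit-learn; the label arrays are hypothetical toy data, and the restriction to the CPR:3, CPR:4, CPR:5, CPR:6, and CPR:9 groups follows the standard ChemProt evaluation, in which the no-relation class is excluded from scoring.

```python
# Minimal sketch: micro-averaged F1 for multi-class relation extraction.
# The labels and predictions below are hypothetical; a real evaluation would
# use a model's predictions on the ChemProt test set.
from sklearn.metrics import f1_score

# Evaluated ChemProt relation groups. The "false" (no-relation) class is
# excluded from scoring, which is why micro F1 is not simply accuracy here.
POSITIVE_CLASSES = ["CPR:3", "CPR:4", "CPR:5", "CPR:6", "CPR:9"]

y_true = ["CPR:3", "CPR:4", "false", "CPR:9", "CPR:4", "false"]
y_pred = ["CPR:3", "CPR:4", "CPR:4", "CPR:9", "false", "false"]

# Micro-averaging pools true positives, false positives, and false negatives
# across the listed classes before computing a single F1 value.
micro_f1 = f1_score(y_true, y_pred, labels=POSITIVE_CLASSES, average="micro")
print(f"Micro F1: {micro_f1:.4f}")
```

In this toy example the pooled counts are TP = 3, FP = 1, FN = 1, so micro precision and recall are both 0.75 and Micro F1 = 0.75.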