HyperAI超神経
Relation Extraction On Chemprot
Evaluation metric: Micro F1

Evaluation results: performance of each model on this benchmark.
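For reference, micro-averaged F1 pools true positives, false positives, and false negatives across all relation classes before computing precision and recall. A minimal sketch, assuming single-label relation classification where a "no relation" label is excluded from scoring (the usual ChemProt convention; the label names below are illustrative):

```python
def micro_f1(gold, pred, ignore=None):
    """Micro-averaged F1 over pooled TP/FP/FN.

    Labels equal to `ignore` (e.g. a "no relation" class) are not
    counted as predictions or as gold relations.
    """
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        if p != ignore:          # model asserted a relation
            if p == g:
                tp += 1
            else:
                fp += 1
        if g != ignore and p != g:  # a gold relation was missed or mislabeled
            fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


# Hypothetical example with ChemProt-style labels:
gold = ["CPR:3", "CPR:4", "none", "CPR:3"]
pred = ["CPR:3", "none", "CPR:4", "CPR:3"]
print(micro_f1(gold, pred, ignore="none"))  # 2 TP, 1 FP, 1 FN
```

This matches `sklearn.metrics.f1_score(..., average="micro")` when the ignored class is excluded from the label set.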
| Model Name | Micro F1 | Paper Title | Repository |
|---|---|---|---|
| CharacterBERT (base, medical) | 73.44 | CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | - |
| BioM-BERT | - | BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | - |
| SciBert (Finetune) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| SciBERT (Base Vocab) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| ELECTRAMed | - | ELECTRAMed: a new pre-trained language representation model for biomedical NLP | - |
| PubMedBERT uncased | 77.24 | Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | - |
| SciFive Large | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioLinkBERT (large) | 79.98 | LinkBERT: Pretraining Language Models with Document Links | - |
| KeBioLM | - | Improving Biomedical Pretrained Language Models with Knowledge | - |
| BioMegatron | - | BioMegatron: Larger Biomedical Domain Language Model | - |
| BioT5X (base) | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioBERT | - | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | - |
| NCBI_BERT(large) (P) | - | - | - |