Relation Extraction On Chemprot
Evaluation Metric: Micro F1
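Micro F1 pools true positives, false positives, and false negatives across all relation classes before computing precision and recall, so frequent classes weigh more than rare ones. A minimal sketch of the computation (the `micro_f1` helper, the `"NONE"` no-relation convention, and the sample labels are illustrative assumptions, not part of the official ChemProt scorer):

```python
def micro_f1(gold, pred, none_label="NONE"):
    """Micro-averaged F1: pool TP/FP/FN over all classes, then score once."""
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        if p == g and p != none_label:
            tp += 1  # correct relation prediction
        else:
            if p != none_label:
                fp += 1  # predicted a relation that is wrong or spurious
            if g != none_label:
                fn += 1  # missed (or mislabeled) a gold relation
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Hypothetical predictions over ChemProt-style CPR relation labels:
gold = ["CPR:3", "CPR:4", "NONE", "CPR:9"]
pred = ["CPR:3", "NONE", "CPR:4", "CPR:9"]
print(round(micro_f1(gold, pred), 3))  # → 0.667
```

Note that a mislabeled relation counts as both a false positive and a false negative under this pooling, which is why micro F1 can be lower than per-example accuracy.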
Evaluation Results

Performance of each model on this benchmark:
| Model | Micro F1 | Paper Title | Repository |
|---|---|---|---|
| CharacterBERT (base, medical) | 73.44 | CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | - |
| BioM-BERT | - | BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | - |
| SciBERT (Finetune) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| SciBERT (Base Vocab) | - | SciBERT: A Pretrained Language Model for Scientific Text | - |
| ELECTRAMed | - | ELECTRAMed: a new pre-trained language representation model for biomedical NLP | - |
| PubMedBERT uncased | 77.24 | Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | - |
| SciFive Large | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioLinkBERT (large) | 79.98 | LinkBERT: Pretraining Language Models with Document Links | - |
| KeBioLM | - | Improving Biomedical Pretrained Language Models with Knowledge | - |
| BioMegatron | - | BioMegatron: Larger Biomedical Domain Language Model | - |
| BioT5X (base) | - | SciFive: a text-to-text transformer model for biomedical literature | - |
| BioBERT | - | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | - |
| NCBI_BERT(large) (P) | - | - | - |