Clinical Concept Extraction on 2010 i2b2/VA
Evaluation Metric
Exact Span F1
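Exact-span F1 credits a prediction only when its boundaries and label both match a gold entity exactly; partial overlaps score zero. A minimal sketch of this scoring, assuming entities are given as `(start, end, label)` tuples (function and variable names here are illustrative, not from any specific evaluation toolkit):

```python
def exact_span_f1(gold_spans, pred_spans):
    """Compute exact-span F1: a predicted entity counts as a true positive
    only if start, end, and label all match a gold entity exactly."""
    gold = set(gold_spans)
    pred = set(pred_spans)
    tp = len(gold & pred)                       # exact matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example: two gold entities, two predictions, one exact match;
# the second prediction has the right label but a wrong boundary.
gold = [(0, 2, "problem"), (5, 7, "treatment")]
pred = [(0, 2, "problem"), (5, 6, "treatment")]
print(round(exact_span_f1(gold, pred), 2))  # → 0.5
```

Because boundary errors are penalized as hard as label errors, exact-span F1 is stricter than token-level or partial-match F1 on the same output.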
Evaluation Results
Performance of each model on this benchmark:
| Model | Exact Span F1 | Paper Title | Repository |
|---|---|---|---|
| BERTlarge (MIMIC) | 90.25 | Enhancing Clinical Concept Extraction with Contextual Embeddings | - |
| CharacterBERT (base, medical) | 89.24 | CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | - |
| ClinicalBERT | 87.4 | Cost-effective Selection of Pretraining Data: A Case Study of Pretraining BERT on Social Media | - |
| ELMo (finetuned on i2b2) + word2vec (i2b2) | 86.23 | Embedding Strategies for Specialized Domains: Application to Clinical Entity Recognition | - |
| deBruijn et al. (System 1.1) | 85.23 | Machine-learned solutions for three stages of clinical information extraction: the state of the art at i2b2 2010 | - |