Zero Shot Learning on MedConceptsQA
Metrics: Accuracy

Results

Performance results of various models on this benchmark.
| Model name | Accuracy | Paper Title | Repository |
|---|---|---|---|
| gpt-4-0125-preview | 52.489 | GPT-4 Technical Report | |
| yikuan8/Clinical-Longformer | 25.040 | Clinical-Longformer and Clinical-BigBird: Transformers for long clinical sequences | |
| PharMolix/BioMedGPT-LM-7B | 24.747 | BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine | |
| UFNLP/gatortron-medium | 24.862 | GatorTron: A Large Clinical Language Model to Unlock Patient Information from Unstructured Electronic Health Records | - |
| HuggingFaceH4/zephyr-7b-beta | 25.538 | Zephyr: Direct Distillation of LM Alignment | |
| dmis-lab/meerkat-7b-v1.0 | 25.680 | Small Language Models Learn Enhanced Reasoning Skills from Medical Textbooks | - |
| johnsnowlabs/JSL-MedMNX-7B | 24.427 | - | - |
| meta-llama/Meta-Llama-3-8B-Instruct | 25.840 | LLaMA: Open and Efficient Foundation Language Models | |
| BioMistral/BioMistral-7B-DARE | 24.569 | BioMistral: A Collection of Open-Source Pretrained Large Language Models for Medical Domains | |
| epfl-llm/meditron-70b | 25.360 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models | |
| gpt-3.5-turbo | 37.058 | Language Models are Few-Shot Learners | |
| epfl-llm/meditron-7b | 25.751 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models | |
| dmis-lab/biobert-v1.1 | 26.151 | BioBERT: a pre-trained biomedical language representation model for biomedical text mining | |
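The Accuracy metric above is the percentage of multiple-choice questions a model answers correctly under a zero-shot prompt (no in-context examples). The snippet below is a minimal, self-contained sketch of that scoring step; the example questions, gold letters, and helper names are illustrative assumptions, not the benchmark's official evaluation code.

```python
# Minimal sketch of zero-shot accuracy scoring for a multiple-choice benchmark
# such as MedConceptsQA. The records below are hypothetical; the real benchmark
# supplies its own questions, options, and gold answers.

def extract_choice(model_output: str, letters: list[str]) -> str | None:
    """Return the first option letter (e.g. 'A') found as a token in the output."""
    for letter in letters:
        if letter in model_output.upper().split():
            return letter
    return None

def accuracy(predictions: list[str | None], gold: list[str]) -> float:
    """Percentage of questions where the predicted letter matches the gold letter."""
    correct = sum(p == g for p, g in zip(predictions, gold))
    return 100.0 * correct / len(gold)

if __name__ == "__main__":
    # Hypothetical gold answers and raw model outputs for four questions.
    gold = ["A", "C", "B", "D"]
    raw_outputs = ["A", "The answer is C", "D", "B"]
    preds = [extract_choice(o, ["A", "B", "C", "D"]) for o in raw_outputs]
    print(f"Zero-shot accuracy: {accuracy(preds, gold):.3f}%")  # 50.000%
```

In practice the fragile part is answer extraction: free-form model outputs have to be mapped back to an option letter before accuracy can be computed, which is what `extract_choice` sketches here.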