HyperAI
Few-Shot Learning on MedConceptsQA
Metric: Accuracy

Results

Performance results of various models on this benchmark.
| Model name | Accuracy | Paper title |
| --- | --- | --- |
| gpt-4-0125-preview | 61.911 | GPT-4 Technical Report |
| epfl-llm/meditron-70b | 25.262 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models |
| dmis-lab/biobert-v1.1 | 25.458 | BioBERT: a pre-trained biomedical language representation model for biomedical text mining |
| johnsnowlabs/JSL-MedMNX-7B | 25.627 | MedConceptsQA: Open Source Medical Concepts QA Benchmark |
| dmis-lab/meerkat-7b-v1.0 | 24.942 | Small Language Models Learn Enhanced Reasoning Skills from Medical Textbooks |
| HuggingFaceH4/zephyr-7b-beta | 25.058 | Zephyr: Direct Distillation of LM Alignment |
| yikuan8/Clinical-Longformer | 25.547 | Clinical-Longformer and Clinical-BigBird: Transformers for long clinical sequences |
| gpt-3.5-turbo | 41.476 | Language Models are Few-Shot Learners |
| meta-llama/Meta-Llama-3-8B-Instruct | 25.653 | LLaMA: Open and Efficient Foundation Language Models |
| BioMistral/BioMistral-7B-DARE | 25.058 | BioMistral: A Collection of Open-Source Pretrained Large Language Models for Medical Domains |
| epfl-llm/meditron-7b | 23.787 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models |
| PharMolix/BioMedGPT-LM-7B | 24.924 | BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine |
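Accuracy on a multiple-choice benchmark like MedConceptsQA is simply the percentage of questions whose predicted answer choice matches the reference; with four answer options, random guessing sits near 25%, which is why most open models in the table cluster around that value. A minimal sketch of this scoring, with an illustrative `accuracy` helper (not the benchmark's official evaluation code):

```python
def accuracy(predictions, references):
    """Percentage of predicted answer choices that match the references."""
    if not references:
        raise ValueError("empty reference list")
    correct = sum(p == r for p, r in zip(predictions, references))
    return 100.0 * correct / len(references)

# Example with 4-option questions ("A".."D"): 3 of 4 correct -> 75.0
preds = ["A", "C", "B", "D"]
refs = ["A", "B", "B", "D"]
print(round(accuracy(preds, refs), 3))
```

A score meaningfully above 25% on four-option questions (e.g. gpt-3.5-turbo's 41.476 or gpt-4-0125-preview's 61.911) indicates the model is doing better than chance.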