Few-Shot Learning on MedConceptsQA
Metrics: Accuracy

Results
Performance of the different models on this benchmark.
| Model Name | Accuracy (%) | Paper Title |
|---|---|---|
| gpt-4-0125-preview | 61.911 | GPT-4 Technical Report |
| epfl-llm/meditron-70b | 25.262 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models |
| dmis-lab/biobert-v1.1 | 25.458 | BioBERT: a pre-trained biomedical language representation model for biomedical text mining |
| johnsnowlabs/JSL-MedMNX-7B | 25.627 | MedConceptsQA: Open Source Medical Concepts QA Benchmark |
| dmis-lab/meerkat-7b-v1.0 | 24.942 | Small Language Models Learn Enhanced Reasoning Skills from Medical Textbooks |
| HuggingFaceH4/zephyr-7b-beta | 25.058 | Zephyr: Direct Distillation of LM Alignment |
| yikuan8/Clinical-Longformer | 25.547 | Clinical-Longformer and Clinical-BigBird: Transformers for long clinical sequences |
| gpt-3.5-turbo | 41.476 | Language Models are Few-Shot Learners |
| meta-llama/Meta-Llama-3-8B-Instruct | 25.653 | LLaMA: Open and Efficient Foundation Language Models |
| BioMistral/BioMistral-7B-DARE | 25.058 | BioMistral: A Collection of Open-Source Pretrained Large Language Models for Medical Domains |
| epfl-llm/meditron-7b | 23.787 | MEDITRON-70B: Scaling Medical Pretraining for Large Language Models |
| PharMolix/BioMedGPT-LM-7B | 24.924 | BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine |
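The Accuracy figures above are the percentage of benchmark questions answered correctly. Below is a minimal sketch of how such a score can be computed; it is not the benchmark's official evaluation harness, and the letter-option answer format is assumed purely for illustration.

```python
from typing import Sequence


def accuracy(predictions: Sequence[str], references: Sequence[str]) -> float:
    """Percentage of questions whose predicted option matches the gold option."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must have the same length")
    correct = sum(
        p.strip().upper() == r.strip().upper()
        for p, r in zip(predictions, references)
    )
    return 100.0 * correct / len(references)


# Toy example: 3 of 4 answers correct -> 75.0.
# The scores in the table above are reported on the same 0-100 scale.
print(accuracy(["A", "C", "B", "D"], ["A", "C", "B", "A"]))
```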