# OlympiadBench
## Metrics

- `average` — overall score across all subsets
- `llm_model` — model name
- `maths_avg.` — average over the maths subsets
- `maths_en_comp` — maths, English-language competition problems
- `maths_zh_cee` — maths, Chinese college entrance exam problems
- `maths_zh_comp` — maths, Chinese-language competition problems
- `model_url` — link to the model's repository or page
- `organization` — releasing organization
- `parameters` — model parameter count
- `physics_avg.` — average over the physics subsets
- `physics_en_comp` — physics, English-language competition problems
- `physics_zh_cee` — physics, Chinese college entrance exam problems
- `release_date` — model release date
- `updated_time` — last leaderboard update
## Results

Performance results of various models on this benchmark.

### Comparison Table
| Model Name | average | llm_model | maths_avg. | maths_en_comp | maths_zh_cee | maths_zh_comp | model_url | organization | parameters | physics_avg. | physics_en_comp | physics_zh_cee | release_date | updated_time |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Model 1 | 3.65 | LLaVA-NeXT-34B | 4.30 | 3.98 | 4.64 | 2.60 | https://github.com/LLaVA-VL/LLaVA-NeXT | | 34B | 2.08 | 1.36 | 2.32 | 2024-01-30 | 2024-06-06 |
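Note that the subset averages above are not simple means of the per-subset scores: the unweighted mean of the three maths subsets (3.98, 4.64, 2.6) is about 3.74, not 4.3, which suggests the leaderboard weights each subset by its number of problems. A minimal sketch of such a weighted average — the problem counts used below are hypothetical, not taken from this page:

```python
def weighted_avg(scores, counts):
    """Mean of per-subset scores, weighted by per-subset problem counts."""
    total = sum(counts)
    return sum(s * c for s, c in zip(scores, counts)) / total

# Maths subset scores from the table row above
# (maths_en_comp, maths_zh_cee, maths_zh_comp).
maths_scores = [3.98, 4.64, 2.60]

# Hypothetical problem counts, for illustration only.
counts = [150, 300, 100]

print(round(weighted_avg(maths_scores, counts), 2))
```

With equal counts this reduces to the plain mean; reproducing the table's 4.30 exactly would require the actual per-subset problem counts, which are not listed here.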