Text Summarization on MTEB
Metrics: Spearman Correlation
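In MTEB's summarization task, machine-generated summaries are embedded and compared to human-written reference summaries by cosine similarity, and the reported score is the Spearman rank correlation between those similarities and human quality judgments; the values on this leaderboard appear to be correlations scaled by 100. Below is a minimal sketch of how such a correlation is computed with SciPy; the similarity scores and human ratings are illustrative placeholders, not benchmark data.

```python
# Minimal sketch: Spearman rank correlation between model-derived
# similarity scores and human judgments. All numbers below are
# illustrative placeholders, not values from the benchmark.
import numpy as np
from scipy.stats import spearmanr

# Hypothetical cosine similarities between machine summaries and
# reference (human) summaries, as produced by an embedding model.
model_similarities = np.array([0.82, 0.45, 0.67, 0.31, 0.74])

# Hypothetical human quality ratings for the same machine summaries.
human_ratings = np.array([4.5, 2.0, 3.8, 1.5, 4.0])

correlation, p_value = spearmanr(model_similarities, human_ratings)
# Scaled by 100 to match how scores appear on this leaderboard.
print(f"Spearman correlation: {correlation * 100:.2f}")
```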
Results
Performance results of various models on this benchmark
| Model Name | Spearman Correlation | Paper Title |
| --- | --- | --- |
| MPNet-multilingual | 31.57 | MTEB: Massive Text Embedding Benchmark |
| ST5-Base | 31.39 | MTEB: Massive Text Embedding Benchmark |
| SimCSE-BERT-unsup | 31.15 | MTEB: Massive Text Embedding Benchmark |
| MiniLM-L6 | 30.81 | MTEB: Massive Text Embedding Benchmark |
| GTR-XXL | 30.64 | MTEB: Massive Text Embedding Benchmark |
| Komninos | 30.49 | MTEB: Massive Text Embedding Benchmark |
| Contriever | 30.36 | MTEB: Massive Text Embedding Benchmark |
| SGPT-125M-nli | 30.26 | MTEB: Massive Text Embedding Benchmark |
| GTR-XL | 30.21 | MTEB: Massive Text Embedding Benchmark |
| ST5-XL | 29.91 | MTEB: Massive Text Embedding Benchmark |
| GTR-Base | 29.67 | MTEB: Massive Text Embedding Benchmark |
| ST5-Large | 29.64 | MTEB: Massive Text Embedding Benchmark |
| coCondenser-msmarco | 29.50 | MTEB: Massive Text Embedding Benchmark |
| Glove | 28.87 | MTEB: Massive Text Embedding Benchmark |
| MiniLM-L12 | 27.90 | MTEB: Massive Text Embedding Benchmark |
| MPNet | 27.49 | MTEB: Massive Text Embedding Benchmark |
| LASER2 | 26.80 | MTEB: Massive Text Embedding Benchmark |
| SGPT-BLOOM-7.1B-msmarco | 24.99 | MTEB: Massive Text Embedding Benchmark |
| SGPT-5.8B-msmarco | 24.75 | MTEB: Massive Text Embedding Benchmark |
| SimCSE-BERT-sup | 23.31 | MTEB: Massive Text Embedding Benchmark |
(The leaderboard lists 26 entries in total; 20 are shown here.)
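All entries above come from the MTEB paper, and scores for this task can in principle be reproduced with the mteb Python package. The snippet below is a minimal sketch, assuming `pip install mteb sentence-transformers`; the task name "SummEval", the exact API, and the mapping of the "MiniLM-L6" leaderboard row to the `all-MiniLM-L6-v2` checkpoint are assumptions that may differ between mteb versions.

```python
# Minimal sketch of running the MTEB summarization task (SummEval) with a
# sentence-transformers model. Task names and API details may vary between
# mteb versions.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Assumed to correspond to the MiniLM-L6 row on this leaderboard.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

evaluation = MTEB(tasks=["SummEval"])
results = evaluation.run(model, output_folder="results/minilm-l6")

# The reported metric is the Spearman correlation between embedding cosine
# similarities and human summary ratings (shown x100 on this page).
print(results)
```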