HyperAI

Semantic Textual Similarity on STS13

Metrics

Spearman Correlation
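Models on this benchmark are ranked by the Spearman correlation between their predicted sentence-similarity scores and the human-annotated gold scores. As a minimal sketch of what this metric measures, the pure-Python implementation below uses the rank-difference formula (it assumes no tied scores; practical evaluations typically rely on a library routine such as `scipy.stats.spearmanr`, which also handles ties):

```python
def spearman_correlation(xs, ys):
    """Spearman's rho via the rank-difference formula (assumes no ties)."""
    def ranks(vals):
        # rank 1 = smallest value
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    # rho = 1 - 6 * sum(d_i^2) / (n (n^2 - 1))
    return 1 - 6 * d2 / (n * (n * n - 1))


# Perfectly monotone predictions give rho = 1.0,
# perfectly reversed predictions give rho = -1.0.
print(spearman_correlation([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
print(spearman_correlation([1, 2, 3], [3, 2, 1]))            # -1.0
```

Because the metric depends only on ranks, it rewards a model for ordering sentence pairs by similarity correctly, regardless of the scale of its raw scores.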

Results

Performance results of various models on this benchmark

| Model Name | Spearman Correlation | Paper Title |
| --- | --- | --- |
| SimCSE-RoBERTa-base | 0.8136 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| DiffCSE-BERT-base | 0.8443 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.8559 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromptEOL+CSE+LLaMA-30B | 0.9025 | Scaling Sentence Embeddings with Large Language Models |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.8831 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SBERT-NLI-large | 0.7846 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| PromptEOL+CSE+OPT-13B | 0.9024 | Scaling Sentence Embeddings with Large Language Models |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.8851 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromCSE-RoBERTa-large (0.355B) | 0.8897 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| AnglE-LLaMA-7B-v2 | 0.9056 | AnglE-optimized Text Embeddings |
| SimCSE-RoBERTa-large | 0.8727 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| BERT-large-flow (target) | 0.7339 | On the Sentence Embeddings from Pre-trained Language Models |
| Mirror-RoBERTa-base (unsup.) | 0.819 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Dino (STSb) | 0.8126 | Generating Datasets with Pretrained Language Models |
| DiffCSE-RoBERTa-base | 0.8343 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| AnglE-LLaMA-7B | 0.9058 | AnglE-optimized Text Embeddings |
| Trans-Encoder-BERT-large-cross (unsup.) | 0.8831 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Mirror-BERT-base (unsup.) | 0.796 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| SimCSE-BERT-base | 0.8241 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.851 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |