
Semantic Textual Similarity on STS14

Evaluation Metric

Spearman Correlation
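STS14 results are reported as the Spearman rank correlation between the similarity scores a model assigns to sentence pairs and the human-annotated gold scores. As a minimal sketch of how the metric is computed, assuming SciPy is available and using illustrative toy values:

```python
# Minimal sketch: Spearman correlation between model-predicted similarities
# and human-annotated gold scores. The values below are illustrative only.
from scipy.stats import spearmanr

gold_scores = [4.8, 1.2, 3.5, 0.4, 2.9]       # human similarity ratings (0-5 scale)
predictions = [0.91, 0.15, 0.62, 0.08, 0.55]  # model cosine similarities

correlation, p_value = spearmanr(gold_scores, predictions)
print(f"Spearman correlation: {correlation:.4f}")
```

Because Spearman correlation only compares rankings, the predicted similarities do not need to be on the same scale as the gold annotations.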

Evaluation Results

Performance of each model on this benchmark.

| Model Name | Spearman Correlation | Paper Title |
| --- | --- | --- |
| DiffCSE-BERT-base | 0.7647 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| AnglE-LLaMA-7B-v2 | 0.8579 | AnglE-optimized Text Embeddings |
| SimCSE-RoBERTa-large | 0.8236 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| BERT-large-flow (target) | 0.6942 | On the Sentence Embeddings from Pre-trained Language Models |
| PromptEOL+CSE+OPT-13B | 0.8534 | Scaling Sentence Embeddings with Large Language Models |
| IS-BERT-NLI | 0.6121 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.8137 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| AnglE-LLaMA-13B | 0.8689 | AnglE-optimized Text Embeddings |
| Dino (STSb) | 0.7125 | Generating Datasets with Pretrained Language Models |
| Mirror-BERT-base (unsup.) | 0.7130 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Mirror-RoBERTa-base (unsup.) | 0.7320 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Trans-Encoder-RoBERTa-large-bi (unsup.) | 0.8176 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| DiffCSE-RoBERTa-base | 0.7549 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.8194 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SBERT-NLI-large | 0.7490 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-RoBERTa-base-cross (unsup.) | 0.7903 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromCSE-RoBERTa-large (0.355B) | 0.8381 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7790 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromptEOL+CSE+OPT-2.7B | 0.8480 | Scaling Sentence Embeddings with Large Language Models |
| AnglE-LLaMA-7B | 0.8549 | AnglE-optimized Text Embeddings |
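Most models on this leaderboard follow the same evaluation recipe: encode each sentence of a pair into a fixed-size vector, score the pair with cosine similarity, and report the Spearman correlation against the gold annotations. The sketch below illustrates that pipeline; `encode_sentences` is a hypothetical placeholder for whichever embedding model is being evaluated (e.g. a SimCSE or AnglE checkpoint), not an API from any of the listed repositories.

```python
# Illustrative STS evaluation loop: cosine similarity of sentence embeddings
# compared against gold scores via Spearman correlation.
import numpy as np
from scipy.stats import spearmanr

def encode_sentences(sentences):
    # Hypothetical placeholder: a real run would call the embedding model here
    # (SimCSE, AnglE, Trans-Encoder, ...). Random vectors keep the sketch runnable.
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(sentences), 768))

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def evaluate_sts(pairs, gold_scores):
    """pairs: list of (sentence1, sentence2); gold_scores: human ratings."""
    emb1 = encode_sentences([s1 for s1, _ in pairs])
    emb2 = encode_sentences([s2 for _, s2 in pairs])
    predictions = [cosine_similarity(e1, e2) for e1, e2 in zip(emb1, emb2)]
    return spearmanr(gold_scores, predictions).correlation

# Toy usage (scores are meaningless with random embeddings, but the flow is the same):
pairs = [
    ("A man is playing a guitar.", "A person plays guitar."),
    ("A woman is slicing an onion.", "Someone is cutting an onion."),
    ("A dog runs in the park.", "The stock market fell sharply."),
]
print(evaluate_sts(pairs, gold_scores=[4.6, 4.2, 0.2]))
```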