Semantic Textual Similarity on STS16

Evaluation Metric

Spearman Correlation
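
Systems on STS16 are ranked by the Spearman rank correlation between their predicted sentence-pair similarity scores and the human gold annotations. The sketch below illustrates the scoring procedure with scipy.stats.spearmanr; the score pairs are made-up numbers for illustration, not actual STS16 data.

```python
# A minimal sketch of how STS16 submissions are scored: Spearman's rank
# correlation between model similarity scores and human gold ratings.
# The values below are illustrative, not actual STS16 data.
from scipy.stats import spearmanr

# Human annotations for sentence pairs, on the STS 0-5 similarity scale.
gold_scores = [4.8, 1.2, 3.5, 0.4, 2.9]

# Model predictions, e.g. cosine similarities between sentence embeddings.
model_scores = [0.91, 0.30, 0.74, 0.12, 0.55]

# Spearman correlation is rank-based, so the two scales need not match.
correlation, p_value = spearmanr(gold_scores, model_scores)
print(f"Spearman correlation: {correlation:.4f}")  # 1.0 here: the rankings agree perfectly
```

Because the correlation is computed over ranks rather than raw values, a model only needs to order sentence pairs the same way the annotators did; its similarity scores do not need to lie on the 0-5 annotation scale.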

Evaluation Results

Performance of each model on this benchmark:

| Model | Spearman Correlation | Paper |
|---|---|---|
| AnglE-LLaMA-13B | 0.8700 | AnglE-optimized Text Embeddings |
| AnglE-LLaMA-7B-v2 | 0.8700 | AnglE-optimized Text Embeddings |
| AnglE-LLaMA-7B | 0.8691 | AnglE-optimized Text Embeddings |
| PromptEOL+CSE+LLaMA-30B | 0.8627 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-2.7B | 0.8591 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-13B | 0.8590 | Scaling Sentence Embeddings with Large Language Models |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.8503 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromCSE-RoBERTa-large (0.355B) | 0.8496 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.8481 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-large | 0.8393 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| Trans-Encoder-RoBERTa-base-cross (unsup.) | 0.8377 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.8305 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| DiffCSE-RoBERTa-base | 0.8212 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| DiffCSE-BERT-base | 0.8054 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| Mirror-RoBERTa-base (unsup.) | 0.78 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| BERT-large-flow (target) | 0.7763 | On the Sentence Embeddings from Pre-trained Language Models |
| Dino (STSb/x̄) | 0.7718 | Generating Datasets with Pretrained Language Models |
| SRoBERTa-NLI-large | 0.7682 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Mirror-BERT-base (unsup.) | 0.743 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| IS-BERT-NLI | 0.7016 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
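
To see where these scores come from, here is a minimal sketch that embeds a sentence pair with one of the models in the table and scores it by cosine similarity. It assumes the unsupervised SimCSE checkpoint the authors published on the Hugging Face Hub (princeton-nlp/unsup-simcse-roberta-large); the other entries use their own pooling strategies and prompts, so this is not a universal recipe.

```python
# A minimal sketch of scoring one sentence pair with a SimCSE checkpoint.
# Assumes the authors' published Hub model "princeton-nlp/unsup-simcse-roberta-large";
# other models in the table require different code paths.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/unsup-simcse-roberta-large")
model = AutoModel.from_pretrained("princeton-nlp/unsup-simcse-roberta-large")

sentences = ["A man is playing a guitar.", "Someone is strumming a guitar."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # SimCSE uses the first ([CLS]) token representation as the sentence embedding.
    embeddings = model(**inputs).last_hidden_state[:, 0]

# Cosine similarity between the two sentence embeddings, in [-1, 1].
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.4f}")
```

On STS16, cosine similarities like this one are computed for every sentence pair in the test set and then compared against the gold ratings with the Spearman procedure shown above.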