Semantic Textual Similarity on STS12

Metrics

Spearman Correlation
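
A model is scored on STS12 by how well the ranking of its similarity scores matches the ranking of the human-annotated gold scores, measured with Spearman's rank correlation. A minimal sketch of the computation using scipy; the scores below are illustrative placeholders, not actual STS12 data:

```python
from scipy.stats import spearmanr

# Human-annotated similarity ratings (STS gold labels use a 0-5 scale).
# Placeholder values for illustration only.
gold_scores = [4.8, 0.5, 3.2, 1.9]

# Similarity scores a model produced for the same sentence pairs.
model_scores = [0.91, 0.12, 0.55, 0.47]

# Spearman correlation compares the rankings of the two lists, so the
# model's scores need not share the 0-5 scale of the gold labels.
correlation, p_value = spearmanr(gold_scores, model_scores)
print(f"Spearman correlation: {correlation:.4f}")  # 1.0000 for this toy data
```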

Results

Performance results of various models on this benchmark.

| Model Name | Spearman Correlation | Paper Title |
|---|---|---|
| PromptEOL+CSE+LLaMA-30B | 0.7972 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-13B | 0.8020 | Scaling Sentence Embeddings with Large Language Models |
| Dino (STSb/x̄) | 0.7027 | Generating Datasets with Pretrained Language Models |
| SRoBERTa-NLI-large | 0.7453 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7819 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7828 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-base | 0.7016 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| AnglE-LLaMA-7B | 0.7868 | AnglE-optimized Text Embeddings |
| Mirror-BERT-base (unsup.) | 0.674 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| DiffCSE-RoBERTa-base | 0.7005 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| PromptEOL+CSE+OPT-2.7B | 0.7949 | Scaling Sentence Embeddings with Large Language Models |
| PromCSE-RoBERTa-large (0.355B) | 0.7956 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| DiffCSE-BERT-base | 0.7228 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| AnglE-LLaMA-13B | 0.7868 | AnglE-optimized Text Embeddings |
| Trans-Encoder-RoBERTa-base-cross (unsup.) | 0.7637 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7509 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-large | 0.7746 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| Mirror-RoBERTa-base (unsup.) | 0.648 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| IS-BERT-NLI | 0.5677 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| BERT-large-flow (target) | 0.6520 | On the Sentence Embeddings from Pre-trained Language Models |
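
Most of the embedding models above follow the standard STS evaluation protocol: embed the two sentences of each pair, score the pair by the cosine similarity of the embeddings, and report the Spearman correlation against the gold annotations (cross-encoder variants score the pair jointly instead). A minimal sketch of that protocol; `encode` is a hypothetical stand-in for any sentence encoder, e.g. a SimCSE or SBERT checkpoint:

```python
import numpy as np
from scipy.stats import spearmanr

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def evaluate_sts(encode, sentence_pairs, gold_scores):
    """Spearman correlation of model similarities against gold ratings.

    encode: callable mapping a sentence (str) to a 1-D embedding array
            (hypothetical stand-in for the models in the table above).
    sentence_pairs: list of (sentence_a, sentence_b) tuples.
    gold_scores: human similarity ratings, one per pair.
    """
    model_scores = [
        cosine_similarity(encode(a), encode(b)) for a, b in sentence_pairs
    ]
    correlation, _ = spearmanr(gold_scores, model_scores)
    return correlation
```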