HyperAI

Semantic Textual Similarity on SICK

Evaluation Metric

Spearman Correlation
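STS-style benchmarks such as SICK score a model by ranking agreement rather than absolute values: the Spearman correlation between the model's similarity scores and the human-annotated gold ratings. A minimal sketch of that computation, using `scipy.stats.spearmanr` with made-up illustrative numbers (not taken from this leaderboard):

```python
# Sketch of the evaluation metric: Spearman rank correlation between
# gold human similarity ratings and model-predicted similarities.
# The score lists below are hypothetical examples for illustration.
from scipy.stats import spearmanr

gold = [4.5, 1.2, 3.8, 2.0, 5.0]       # human ratings (e.g. 1-5 scale)
pred = [0.91, 0.12, 0.75, 0.33, 0.91]  # model cosine similarities

# spearmanr ranks both lists (averaging tied ranks) and returns the
# Pearson correlation of the ranks plus a p-value.
rho, p_value = spearmanr(gold, pred)
print(round(rho, 4))
```

Because only the ranks matter, a model whose scores are a monotone transformation of the gold ratings gets a perfect 1.0, which is why cosine similarities on an arbitrary scale can be compared directly against 1-5 human ratings.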

Evaluation Results

Performance of each model on this benchmark

| Model Name | Spearman Correlation | Paper Title |
| --- | --- | --- |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7133 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Mirror-RoBERTa-base (unsup.) | 0.706 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| SRoBERTa-NLI-large | 0.7429 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7163 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-large | 0.8195 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| SRoBERTa-NLI-base | 0.7446 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Dino (STSb) | 0.6809 | Generating Datasets with Pretrained Language Models |
| PromCSE-RoBERTa-large (0.355B) | 0.8243 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.6952 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Rematch | 0.6772 | Rematch: Robust and Efficient Matching of Local Knowledge Graphs to Improve Structural and Semantic Similarity |
| SBERT-NLI-base | 0.7291 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| SBERT-NLI-large | 0.7375 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| IS-BERT-NLI | 0.6425 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| Dino (STS) | 0.7426 | Generating Datasets with Pretrained Language Models |
| PromptEOL+CSE+OPT-13B | 0.8206 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-2.7B | 0.8129 | Scaling Sentence Embeddings with Large Language Models |
| Mirror-BERT-base (unsup.) | 0.703 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| BERTbase-flow (NLI) | 0.6544 | On the Sentence Embeddings from Pre-trained Language Models |
| SentenceBERT | 0.7462 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7276 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |