HyperAI
SOTA › Semantic Textual Similarity

Semantic Textual Similarity on SICK
Evaluation metric: Spearman Correlation

Evaluation results
Performance of each model on this benchmark.
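The leaderboard metric is Spearman correlation: predicted sentence-pair similarity scores are compared to the gold relatedness labels by rank order rather than raw value. A minimal sketch of how it is computed, using made-up scores rather than actual SICK data (in practice one would typically call `scipy.stats.spearmanr`):

```python
def ranks(values):
    """Average 1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical model similarity scores vs. gold relatedness labels.
model_scores = [0.9, 0.1, 0.5, 0.7, 0.3]
gold_labels = [4.8, 1.2, 3.1, 4.0, 2.0]
print(round(spearman(model_scores, gold_labels), 4))  # ranks agree exactly -> 1.0
```

Because only ranks matter, a model's score scale (cosine similarity in [-1, 1] vs. SICK labels in [1, 5]) does not affect the metric.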
| Model | Spearman Correlation | Paper |
|---|---|---|
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7133 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Mirror-RoBERTa-base (unsup.) | 0.706 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| SRoBERTa-NLI-large | 0.7429 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7163 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-large | 0.8195 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| SRoBERTa-NLI-base | 0.7446 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Dino (STSb) | 0.6809 | Generating Datasets with Pretrained Language Models |
| PromCSE-RoBERTa-large (0.355B) | 0.8243 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.6952 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Rematch | 0.6772 | Rematch: Robust and Efficient Matching of Local Knowledge Graphs to Improve Structural and Semantic Similarity |
| SBERT-NLI-base | 0.7291 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| SBERT-NLI-large | 0.7375 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| IS-BERT-NLI | 0.6425 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| Dino (STS) | 0.7426 | Generating Datasets with Pretrained Language Models |
| PromptEOL+CSE+OPT-13B | 0.8206 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-2.7B | 0.8129 | Scaling Sentence Embeddings with Large Language Models |
| Mirror-BERT-base (unsup.) | 0.703 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| BERT-base-flow (NLI) | 0.6544 | On the Sentence Embeddings from Pre-trained Language Models |
| SentenceBERT | 0.7462 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7276 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |