HyperAI
Semantic Textual Similarity
Semantic Textual Similarity on SICK
Metric: Spearman Correlation

Results
Performance results of various models on this benchmark.
| Model | Spearman Correlation | Paper | Repository |
|---|---|---|---|
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7133 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | - |
| Mirror-RoBERTa-base (unsup.) | 0.706 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders | - |
| SRoBERTa-NLI-large | 0.7429 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | - |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7163 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | - |
| SimCSE-RoBERTa-large | 0.8195 | SimCSE: Simple Contrastive Learning of Sentence Embeddings | - |
| SRoBERTa-NLI-base | 0.7446 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | - |
| Dino (STSb) | 0.6809 | Generating Datasets with Pretrained Language Models | - |
| PromCSE-RoBERTa-large (0.355B) | 0.8243 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning | - |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.6952 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | - |
| Rematch | 0.6772 | Rematch: Robust and Efficient Matching of Local Knowledge Graphs to Improve Structural and Semantic Similarity | - |
| SBERT-NLI-base | 0.7291 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | - |
| SBERT-NLI-large | 0.7375 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | - |
| IS-BERT-NLI | 0.6425 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization | - |
| Dino (STS) | 0.7426 | Generating Datasets with Pretrained Language Models | - |
| PromptEOL+CSE+OPT-13B | 0.8206 | Scaling Sentence Embeddings with Large Language Models | - |
| PromptEOL+CSE+OPT-2.7B | 0.8129 | Scaling Sentence Embeddings with Large Language Models | - |
| Mirror-BERT-base (unsup.) | 0.703 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders | - |
| BERT-base-flow (NLI) | 0.6544 | On the Sentence Embeddings from Pre-trained Language Models | - |
| SentenceBERT | 0.7462 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | - |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7276 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | - |
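The ranking metric above is the Spearman correlation between a model's predicted sentence-pair similarities and the human relatedness ratings in SICK. A minimal self-contained sketch (illustrative scores only, not taken from the leaderboard; assumes no tied values):

```python
def spearman(x, y):
    """Spearman rank correlation for lists without ties."""
    def ranks(v):
        # rank 1 = smallest value
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, 1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    # classic no-ties formula: 1 - 6 * sum(d^2) / (n * (n^2 - 1))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical example: gold SICK relatedness ratings (1-5 scale)
# vs. a model's cosine similarities for the same sentence pairs.
human_ratings = [4.8, 1.2, 3.5, 2.0, 4.1]
model_scores = [0.92, 0.15, 0.35, 0.60, 0.81]
print(f"Spearman correlation: {spearman(human_ratings, model_scores):.4f}")
# → 0.9000
```

Because only the rank order matters, the raw similarity scale of a model is irrelevant; this is why cosine similarities (0-1) can be compared directly against 1-5 human ratings.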