Semantic Textual Similarity on STS12
Metric: Spearman Correlation
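Spearman correlation measures how well the ranking of a model's predicted similarity scores matches the ranking of the human-annotated gold scores. For the sentence-embedding models on this leaderboard, it is typically computed between the cosine similarities of embedded sentence pairs and the gold ratings. The sketch below illustrates that computation on toy data; the random "embeddings" stand in for the output of any encoder in the table and are not part of this page.

```python
import numpy as np
from scipy.stats import spearmanr


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Row-wise cosine similarity between two batches of vectors."""
    a_norm = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_norm = b / np.linalg.norm(b, axis=1, keepdims=True)
    return np.sum(a_norm * b_norm, axis=1)


def sts_spearman(emb1: np.ndarray, emb2: np.ndarray, gold: np.ndarray) -> float:
    """Spearman correlation between predicted similarities and gold scores.

    emb1, emb2: (n, d) embeddings of the first/second sentence of each pair,
    produced by whichever sentence encoder is being evaluated.
    gold: (n,) human similarity ratings (0-5 in the STS annotation scheme).
    """
    predicted = cosine_similarity(emb1, emb2)
    correlation, _p_value = spearmanr(predicted, gold)
    return correlation


# Toy example: random 8-dimensional "embeddings" for 5 sentence pairs.
rng = np.random.default_rng(0)
emb1 = rng.normal(size=(5, 8))
emb2 = rng.normal(size=(5, 8))
gold = np.array([0.4, 2.1, 3.3, 4.8, 1.0])
print(f"Spearman correlation: {sts_spearman(emb1, emb2, gold):.4f}")
```

Because Spearman correlation depends only on ranks, it is insensitive to the absolute scale of the similarity scores, which is why cosine similarities (in [-1, 1]) can be compared directly against 0-5 human ratings.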
Results

Performance results of various models on this benchmark:

| Model name | Spearman Correlation | Paper Title |
|---|---|---|
| PromptEOL+CSE+LLaMA-30B | 0.7972 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-13B | 0.8020 | Scaling Sentence Embeddings with Large Language Models |
| Dino (STSb/x̄) | 0.7027 | Generating Datasets with Pretrained Language Models |
| SRoBERTa-NLI-large | 0.7453 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7819 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7828 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-base | 0.7016 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| AnglE-LLaMA-7B | 0.7868 | AnglE-optimized Text Embeddings |
| Mirror-BERT-base (unsup.) | 0.674 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| DiffCSE-RoBERTa-base | 0.7005 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| PromptEOL+CSE+OPT-2.7B | 0.7949 | Scaling Sentence Embeddings with Large Language Models |
| PromCSE-RoBERTa-large (0.355B) | 0.7956 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| DiffCSE-BERT-base | 0.7228 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| AnglE-LLaMA-13B | 0.7868 | AnglE-optimized Text Embeddings |
| Trans-Encoder-RoBERTa-base-cross (unsup.) | 0.7637 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7509 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| SimCSE-RoBERTa-large | 0.7746 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| Mirror-RoBERTa-base (unsup.) | 0.648 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| IS-BERT-NLI | 0.5677 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| BERTlarge-flow (target) | 0.6520 | On the Sentence Embeddings from Pre-trained Language Models |