HyperAI
تشابه النصوص الدلالي
Semantic Textual Similarity On Sick
Semantic Textual Similarity On Sick
Metric
Spearman Correlation

Results
Performance of different models on this benchmark
| Model Name | Spearman Correlation | Paper Title |
| --- | --- | --- |
| PromCSE-RoBERTa-large (0.355B) | 0.8243 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| PromptEOL+CSE+LLaMA-30B | 0.8238 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-13B | 0.8206 | Scaling Sentence Embeddings with Large Language Models |
| SimCSE-RoBERTa-large | 0.8195 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| PromptEOL+CSE+OPT-2.7B | 0.8129 | Scaling Sentence Embeddings with Large Language Models |
| SentenceBERT | 0.7462 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| SRoBERTa-NLI-base | 0.7446 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| SRoBERTa-NLI-large | 0.7429 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Dino (STS) | 0.7426 | Generating Datasets with Pretrained Language Models |
| SBERT-NLI-large | 0.7375 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| SBERT-NLI-base | 0.7291 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.7276 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-large-cross (unsup.) | 0.7192 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.7163 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.7133 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Mirror-RoBERTa-base (unsup.) | 0.7060 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Mirror-BERT-base (unsup.) | 0.7030 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.6952 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Dino (STSb) | 0.6809 | Generating Datasets with Pretrained Language Models |
| Rematch | 0.6772 | Rematch: Robust and Efficient Matching of Local Knowledge Graphs to Improve Structural and Semantic Similarity |
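The benchmark's single metric, Spearman correlation, measures how well a model's similarity scores preserve the ranking of the gold relatedness judgments: both score lists are converted to ranks (with ties averaged), and the Pearson correlation of the ranks is computed. A minimal pure-Python sketch; the `gold` and `pred` values below are illustrative, not taken from the SICK data:

```python
def ranks(xs):
    # Fractional ranking: ties receive the average of their positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(a, b):
    # Spearman correlation = Pearson correlation of the rank vectors.
    ra, rb = ranks(a), ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# Hypothetical gold relatedness scores vs. model similarity scores:
gold = [4.5, 3.6, 1.2, 2.8, 5.0]
pred = [0.91, 0.70, 0.12, 0.45, 0.88]
print(round(spearman(gold, pred), 4))  # → 0.9
```

Because only ranks matter, the metric is insensitive to the scale of the model's scores, which is why cosine similarities in [-1, 1] can be compared directly against human ratings on a 1-5 scale.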