Link Prediction On Wikidata5M
Metrics: Hits@1, Hits@10, Hits@3, MRR
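These are the standard ranking metrics for link prediction: for each test triple the model ranks all candidate entities, Hits@k is the fraction of queries whose correct entity appears in the top k, and MRR averages the reciprocal of the correct entity's rank. The following minimal Python sketch shows how such figures are computed from a list of ranks; the example ranks are purely hypothetical and not taken from any entry in the table below.

```python
from typing import Iterable


def hits_at_k(ranks: Iterable[int], k: int) -> float:
    """Fraction of queries whose correct entity is ranked within the top k."""
    ranks = list(ranks)
    return sum(1 for r in ranks if r <= k) / len(ranks)


def mean_reciprocal_rank(ranks: Iterable[int]) -> float:
    """Average of 1/rank over all queries (higher is better, maximum 1.0)."""
    ranks = list(ranks)
    return sum(1.0 / r for r in ranks) / len(ranks)


if __name__ == "__main__":
    # Hypothetical ranks of the correct entity for five test queries.
    example_ranks = [1, 3, 2, 10, 50]
    print("Hits@1 :", hits_at_k(example_ranks, 1))
    print("Hits@3 :", hits_at_k(example_ranks, 3))
    print("Hits@10:", hits_at_k(example_ranks, 10))
    print("MRR    :", mean_reciprocal_rank(example_ranks))
```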
Results
Performance results of different models on this benchmark
| Model | Hits@1 | Hits@10 | Hits@3 | MRR | Paper Title | Repository |
|---|---|---|---|---|---|---|
| DistMult | 0.208 | 0.334 | 0.278 | 0.253 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| SimplE | 0.252 | 0.377 | 0.317 | 0.296 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| KGT5-context + Description | 0.406 | 0.46 | 0.44 | 0.426 | Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | |
| MoCoKGC | 0.435 | 0.591 | 0.517 | 0.490 | MoCoKGC: Momentum Contrast Entity Encoding for Knowledge Graph Completion | - |
| KGT5 ComplEx Ensemble | 0.282 | 0.426 | 0.362 | 0.336 | Sequence-to-Sequence Knowledge Graph Completion and Question Answering | |
| ComplEx | 0.228 | 0.373 | 0.310 | 0.281 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| KGT5 + Description | 0.357 | 0.422 | 0.397 | 0.381 | Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | |
| TransE | 0.17 | 0.392 | 0.311 | 0.253 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| KGT5-context | 0.35 | 0.427 | 0.396 | 0.378 | Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | |
| KGT5 | 0.267 | 0.365 | 0.318 | 0.300 | Sequence-to-Sequence Knowledge Graph Completion and Question Answering | |
| ComplEx | 0.255 | 0.398 | - | 0.308 | Parallel Training of Knowledge Graph Embedding Models: A Comparison of Techniques | |
| RotatE | 0.234 | 0.39 | 0.322 | 0.29 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| KEPLER-Wiki-rel | 0.173 | 0.277 | 0.224 | 0.210 | KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | |
| SimKGC + Description | 0.313 | 0.441 | 0.376 | 0.358 | SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models | |