HyperAI
الإجابة على الأسئلة
Question Answering On Babi
Metrics
Mean Error Rate
Results
Performance results of different models on this benchmark
| Model Name | Mean Error Rate | Paper Title | Repository |
|---|---|---|---|
| ReMO | 1.2% | Finding ReMO (Related Memory Object): A Simple Neural Architecture for Text based Reasoning | - |
| DMN+ | - | Dynamic Neural Turing Machine with Soft and Hard Addressing Schemes | - |
| EntNet | 9.7% | Tracking the World State with Recurrent Entity Networks | - |
| RR | 0.46% | Recurrent Relational Networks | - |
| SDNC | 6.4% | Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes | - |
| STM | 0.39% | Self-Attentive Associative Memory | - |
| NUTM | 5.6% | Neural Stored-program Memory | - |
| RUM | - | Rotational Unit of Memory | - |
| GORU | - | Gated Orthogonal Recurrent Units: On Learning to Forget | - |
| End-To-End Memory Networks | 7.5% | End-To-End Memory Networks | - |
| H-Mem | - | H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks | - |
| ours | - | Memory-enriched computation and learning in spiking neural networks through Hebbian plasticity | - |
| LSTM | 28.7% | Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes | - |
| QRN | 0.3% | Query-Reduction Networks for Question Answering | - |
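The Mean Error Rate above averages the per-task error rates over bAbI's 20 tasks. A minimal sketch of that computation (the function name and the sample per-task numbers are illustrative, not taken from the leaderboard):

```python
def mean_error_rate(per_task_error_rates):
    """Average per-task error rates (in percent) across bAbI tasks."""
    if not per_task_error_rates:
        raise ValueError("need at least one task error rate")
    return sum(per_task_error_rates) / len(per_task_error_rates)

# Hypothetical model: 18 tasks solved perfectly, 2 tasks at 3% error each.
errors = [0.0] * 18 + [3.0, 3.0]
print(mean_error_rate(errors))  # 0.3
```

Note that a low mean can hide a few unsolved tasks, which is why bAbI results are often also reported as the number of tasks failed.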