HyperAI超神経
Question Answering on bAbI
Evaluation Metric: Mean Error Rate

Results: performance of each model on this benchmark.
| Model | Mean Error Rate | Paper Title | Repository |
| --- | --- | --- | --- |
| LSTM | 28.7% | Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes | - |
| EntNet | 9.7% | Tracking the World State with Recurrent Entity Networks | |
| End-To-End Memory Networks | 7.5% | End-To-End Memory Networks | |
| SDNC | 6.4% | Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes | - |
| NUTM | 5.6% | Neural Stored-program Memory | |
| ReMO | 1.2% | Finding ReMO (Related Memory Object): A Simple Neural Architecture for Text based Reasoning | - |
| RR | 0.46% | Recurrent Relational Networks | |
| STM | 0.39% | Self-Attentive Associative Memory | |
| QRN | 0.3% | Query-Reduction Networks for Question Answering | |
| DMN+ | - | Dynamic Neural Turing Machine with Soft and Hard Addressing Schemes | - |
| RUM | - | Rotational Unit of Memory | |
| GORU | - | Gated Orthogonal Recurrent Units: On Learning to Forget | |
| H-Mem | - | H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks | - |
| ours | - | Memory-enriched computation and learning in spiking neural networks through Hebbian plasticity | |
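The Mean Error Rate reported for each model is conventionally the arithmetic mean of the per-task error rates over the 20 bAbI tasks. A minimal sketch of that aggregation (the per-task error values below are illustrative, not taken from the leaderboard):

```python
def mean_error_rate(task_error_rates):
    """Arithmetic mean of per-task error rates (%), as used on bAbI leaderboards."""
    if not task_error_rates:
        raise ValueError("need at least one per-task error rate")
    return sum(task_error_rates) / len(task_error_rates)

# Hypothetical per-task error rates (%) for the 20 bAbI tasks:
# a model that solves 18 tasks perfectly and misses on two.
errors = [0.0] * 18 + [4.0, 2.0]
print(mean_error_rate(errors))  # 0.3
```

Note that a low mean can hide a single hard task: bAbI results are therefore often also reported as the number of tasks "failed" (per-task error above 5%).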