HyperAI超神経
Open Domain Question Answering on SearchQA
Evaluation Metric
F1
Evaluation Results
Performance results of each model on this benchmark.
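The F1 scores below are answer-level scores: for extractive open-domain QA benchmarks such as SearchQA, F1 is commonly computed as token-overlap F1 between the predicted answer string and the gold answer, SQuAD-style. A minimal sketch of that metric (function name and the lowercase/whitespace normalization are illustrative, not the official evaluation script):

```python
from collections import Counter

def token_f1(prediction: str, ground_truth: str) -> float:
    """Token-overlap F1 between a predicted and a gold answer string."""
    pred_tokens = prediction.lower().split()
    gold_tokens = ground_truth.lower().split()
    # Multiset intersection: each shared token counts at most as often
    # as it appears in both strings.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

# Example: two of the four gold tokens are matched exactly.
print(token_f1("walt disney", "the walt disney company"))  # ≈ 0.667
```

Leaderboard numbers are usually this score averaged over all questions, times 100.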
| Model | F1 | Paper Title | Repository |
|---|---|---|---|
| SpanBERT | 84.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans | - |
| Denoising QA | 64.5 | Denoising Distantly Supervised Open-Domain Question Answering | - |
| DecaProp | 63.6 | Densely Connected Attention Propagation for Reading Comprehension | - |
| DECAPROP | - | Densely Connected Attention Propagation for Reading Comprehension | - |
| Locality-Sensitive Hashing | - | Reformer: The Efficient Transformer | - |
| Sparse Attention | - | Generating Long Sequences with Sparse Transformers | - |
| Multi-passage BERT | - | Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering | - |
| Cluster-Former (#C=512) | - | Cluster-Former: Clustering-based Sparse Transformer for Long-Range Dependency Encoding | - |
| Bi-Attention + DCU-LSTM | - | Multi-Granular Sequence Encoding via Dilated Compositional Units for Reading Comprehension | - |
| Focused Hierarchical RNN | - | Focused Hierarchical RNNs for Conditional Sequence Processing | - |
| AMANDA | - | A Question-Focused Multi-Factor Attention Network for Question Answering | - |
| ASR | - | Text Understanding with the Attention Sum Reader Network | - |
| R^3 | 55.3 | R$^3$: Reinforced Reader-Ranker for Open-Domain Question Answering | - |
| DrQA | - | Reading Wikipedia to Answer Open-Domain Questions | - |