Question Answering on CNN / Daily Mail
Evaluation metrics: CNN, Daily Mail

Evaluation results: the performance of each model on this benchmark.
| Model | CNN | Daily Mail | Paper Title | Repository |
|---|---|---|---|---|
| GA+MAGE (32) | 78.6 | - | Linguistic Knowledge as Memory for Recurrent Neural Networks | - |
| GA Reader | 77.9 | 80.9 | Gated-Attention Readers for Text Comprehension | - |
| Attentive + relabeling + ensemble | 77.6 | 79.2 | A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | - |
| BiDAF | 76.9 | 79.6 | Bidirectional Attention Flow for Machine Comprehension | - |
| AIA | 76.1 | - | Iterative Alternating Neural Attention for Machine Reading | - |
| AS Reader (ensemble model) | 75.4 | 77.7 | Text Understanding with the Attention Sum Reader Network | - |
| ReasoNet | 74.7 | 76.6 | ReasoNet: Learning to Stop Reading in Machine Comprehension | - |
| AoA Reader | 74.4 | - | Attention-over-Attention Neural Networks for Reading Comprehension | - |
| EpiReader | 74.0 | - | Natural Language Comprehension with the EpiReader | - |
| Dynamic Entity Repres. + w2v | 72.9 | - | - | - |
| AttentiveReader + bilinear attention | 72.4 | 75.8 | A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | - |
| AS Reader (single model) | 69.5 | 73.9 | Text Understanding with the Attention Sum Reader Network | - |
| MemNNs (ensemble) | 69.4 | - | Teaching Machines to Read and Comprehend | - |
| Classifier | 67.9 | 68.3 | A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task | - |
| Impatient Reader | 63.8 | 68.0 | Teaching Machines to Read and Comprehend | - |
| Attentive Reader | 63.0 | 69.0 | Teaching Machines to Read and Comprehend | - |
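CNN / Daily Mail is a cloze-style reading-comprehension benchmark, and the CNN and Daily Mail columns above are conventionally reported as test-set accuracies: the percentage of questions for which the model predicts the correct masked entity. The sketch below illustrates that metric under this assumption; the function name, entity-id format, and example data are illustrative and not taken from any of the listed implementations.

```python
# Minimal sketch of the accuracy metric assumed above: each question masks one
# entity marker (e.g. "@entity3"), and a prediction counts as correct only if
# it exactly matches the gold entity. All names and data here are illustrative.

def cloze_accuracy(predictions, gold):
    """Percentage of questions whose predicted entity id matches the gold id."""
    if len(predictions) != len(gold):
        raise ValueError("predictions and gold must have the same length")
    correct = sum(p == g for p, g in zip(predictions, gold))
    return 100.0 * correct / len(gold)

if __name__ == "__main__":
    preds = ["@entity3", "@entity7", "@entity1", "@entity9"]
    golds = ["@entity3", "@entity7", "@entity5", "@entity9"]
    print(f"accuracy: {cloze_accuracy(preds, golds):.1f}")  # accuracy: 75.0
```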