HyperAI
Open Domain Question Answering On SearchQA
Metrics: F1

Performance results of various models on this benchmark.

| Model name | F1 | Paper Title | Repository |
| --- | --- | --- | --- |
| SpanBERT | 84.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans | - |
| Denoising QA | 64.5 | Denoising Distantly Supervised Open-Domain Question Answering | - |
| DecaProp | 63.6 | Densely Connected Attention Propagation for Reading Comprehension | - |
| DECAPROP | - | Densely Connected Attention Propagation for Reading Comprehension | - |
| Locality-Sensitive Hashing | - | Reformer: The Efficient Transformer | - |
| Sparse Attention | - | Generating Long Sequences with Sparse Transformers | - |
| Multi-passage BERT | - | Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering | - |
| Cluster-Former (#C=512) | - | Cluster-Former: Clustering-based Sparse Transformer for Long-Range Dependency Encoding | - |
| Bi-Attention + DCU-LSTM | - | Multi-Granular Sequence Encoding via Dilated Compositional Units for Reading Comprehension | - |
| Focused Hierarchical RNN | - | Focused Hierarchical RNNs for Conditional Sequence Processing | - |
| AMANDA | - | A Question-Focused Multi-Factor Attention Network for Question Answering | - |
| ASR | - | Text Understanding with the Attention Sum Reader Network | - |
| R^3 | 55.3 | R^3: Reinforced Reader-Ranker for Open-Domain Question Answering | - |
| DrQA | - | Reading Wikipedia to Answer Open-Domain Questions | - |
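The F1 column above is, in open-domain QA benchmarks of this kind, typically the SQuAD-style token-overlap F1 between the predicted answer string and the reference answer. The exact evaluation script used for these SearchQA numbers is not shown here, so the following is a minimal sketch of that conventional metric (function name and the simple whitespace/lowercase normalization are our assumptions, not taken from this page):

```python
from collections import Counter

def token_f1(prediction: str, ground_truth: str) -> float:
    """Token-overlap F1, the SQuAD-style metric conventionally reported
    for answer-string evaluation (a sketch; normalization here is just
    lowercasing and whitespace splitting)."""
    pred_tokens = prediction.lower().split()
    gold_tokens = ground_truth.lower().split()
    # Two empty answers count as a match; one empty answer scores 0.
    if not pred_tokens or not gold_tokens:
        return float(pred_tokens == gold_tokens)
    # Multiset intersection of predicted and gold tokens.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

For example, `token_f1("barack obama", "obama")` gives precision 0.5 and recall 1.0, hence F1 ≈ 0.667. Full-dataset scores like those in the table are averages of this per-example F1 (often after taking the max over multiple reference answers).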