HyperAI
Open Domain Question Answering On SearchQA
Metrics: F1

Results
Performance of the various models on this benchmark.

| Model | F1 | Paper |
| --- | --- | --- |
| SpanBERT | 84.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans |
| Denoising QA | 64.5 | Denoising Distantly Supervised Open-Domain Question Answering |
| DecaProp | 63.6 | Densely Connected Attention Propagation for Reading Comprehension |
| DECAPROP | - | Densely Connected Attention Propagation for Reading Comprehension |
| Locality-Sensitive Hashing | - | Reformer: The Efficient Transformer |
| Sparse Attention | - | Generating Long Sequences with Sparse Transformers |
| Multi-passage BERT | - | Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering |
| Cluster-Former (#C=512) | - | Cluster-Former: Clustering-based Sparse Transformer for Long-Range Dependency Encoding |
| Bi-Attention + DCU-LSTM | - | Multi-Granular Sequence Encoding via Dilated Compositional Units for Reading Comprehension |
| Focused Hierarchical RNN | - | Focused Hierarchical RNNs for Conditional Sequence Processing |
| AMANDA | - | A Question-Focused Multi-Factor Attention Network for Question Answering |
| ASR | - | Text Understanding with the Attention Sum Reader Network |
| R^3 | 55.3 | R^3: Reinforced Reader-Ranker for Open-Domain Question Answering |
| DrQA | - | Reading Wikipedia to Answer Open-Domain Questions |
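The F1 scores above are conventionally computed as token-level overlap between the predicted answer string and the gold answer (SQuAD-style evaluation). A minimal sketch, assuming simple whitespace tokenization and lowercasing; the function name is illustrative, not from any benchmark's official scorer:

```python
from collections import Counter

def qa_token_f1(prediction: str, gold: str) -> float:
    """Token-level F1 between a predicted and a gold answer.

    Assumes whitespace tokenization and lowercasing; official
    evaluation scripts typically also strip punctuation and articles.
    """
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    if not pred_tokens or not gold_tokens:
        # Both empty counts as a match; one empty counts as a miss.
        return float(pred_tokens == gold_tokens)
    # Multiset intersection of tokens shared by prediction and gold.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(qa_token_f1("barack obama", "Barack Obama"))  # 1.0
print(qa_token_f1("obama", "barack obama"))         # partial overlap
```

An exact match scores 1.0; a prediction covering half the gold tokens scores the harmonic mean of its precision and recall, so partial answers are rewarded proportionally rather than all-or-nothing.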