HyperAI
Question Answering on Quora Question Pairs
Evaluation metric: Accuracy

Evaluation results: performance of each model on this benchmark.
Model | Accuracy | Paper
DeBERTa (large) | 92.3% | DeBERTa: Decoding-enhanced BERT with Disentangled Attention
XLNet (single model) | 92.3% | XLNet: Generalized Autoregressive Pretraining for Language Understanding
ALBERT | 90.5% | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
T5-11B | 90.4% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
MLM+ subs+ del-span | 90.3% | CLEAR: Contrastive Learning for Sentence Representation
RoBERTa (ensemble) | 90.2% | RoBERTa: A Robustly Optimized BERT Pretraining Approach
ELECTRA | 90.1% | ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
ERNIE 2.0 Large | 90.1% | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
T5-Large 770M | 89.9% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
ERNIE 2.0 Base | 89.8% | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
T5-3B | 89.7% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
T5-Base | 89.4% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
RE2 | 89.2% | Simple and Effective Text Matching with Richer Alignment Features
DistilBERT 66M | 89.2% | DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
BigBird | 88.6% | Big Bird: Transformers for Longer Sequences
T5-Small | 88.0% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
SWEM-concat | 83.03% | Baseline Needs More Love: On Simple Word-Embedding-Based Models and Associated Pooling Mechanisms
SqueezeBERT | 80.3% | SqueezeBERT: What can computer vision teach NLP about efficient neural networks?
24hBERT | 70.7% | How to Train BERT with an Academic Budget
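The metric reported above is plain classification accuracy: each Quora question pair is labeled duplicate or not duplicate, and the score is the fraction of pairs the model labels correctly. A minimal sketch of that computation (the function name and the toy predictions are illustrative, not taken from any listed system):

```python
def accuracy(predictions, labels):
    """Fraction of question pairs where the predicted duplicate/non-duplicate
    label (1 = duplicate, 0 = not) matches the gold label."""
    assert len(predictions) == len(labels), "one prediction per pair"
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Toy example: 3 of 4 pairs classified correctly.
preds = [1, 0, 1, 1]
golds = [1, 0, 0, 1]
print(accuracy(preds, golds))  # 0.75
```

A leaderboard entry of 92.3% means the model makes the correct duplicate/non-duplicate call on 92.3% of the evaluation pairs.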