Question Answering On Quora Question Pairs
Evaluation metric: Accuracy
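Quora Question Pairs is a binary task (each question pair is labeled duplicate or not), so the accuracy reported here is ordinary classification accuracy over question pairs. A minimal definition, with $\hat{y}_i$ the predicted label for the $i$-th pair, $y_i$ the gold label, and $N$ the number of pairs evaluated:

```latex
\mathrm{Accuracy} \;=\; \frac{1}{N} \sum_{i=1}^{N} \mathbf{1}\!\left[\hat{y}_i = y_i\right]
```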
Evaluation results
The performance of each model on this benchmark is listed below.
| Model Name | Accuracy | Paper Title |
|---|---|---|
| T5-11B | 90.4% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| 24hBERT | 70.7% | How to Train BERT with an Academic Budget |
| MLM+ subs+ del-span | 90.3% | CLEAR: Contrastive Learning for Sentence Representation |
| ELECTRA | 90.1% | ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators |
| RoBERTa (ensemble) | 90.2% | RoBERTa: A Robustly Optimized BERT Pretraining Approach |
| T5-Small | 88.0% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| ERNIE 2.0 Large | 90.1% | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding |
| BigBird | 88.6% | Big Bird: Transformers for Longer Sequences |
| T5-Base | 89.4% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| RE2 | 89.2% | Simple and Effective Text Matching with Richer Alignment Features |
| SqueezeBERT | 80.3% | SqueezeBERT: What can computer vision teach NLP about efficient neural networks? |
| DeBERTa (large) | 92.3% | DeBERTa: Decoding-enhanced BERT with Disentangled Attention |
| ALBERT | 90.5% | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations |
| XLNet (single model) | 92.3% | XLNet: Generalized Autoregressive Pretraining for Language Understanding |
| SWEM-concat | 83.03% | Baseline Needs More Love: On Simple Word-Embedding-Based Models and Associated Pooling Mechanisms |
| T5-3B | 89.7% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| T5-Large 770M | 89.9% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| DistilBERT 66M | 89.2% | DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter |
| ERNIE 2.0 Base | 89.8% | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding |
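To give a sense of how scores like the ones above are produced, the following is a minimal evaluation sketch, assuming the GLUE `qqp` configuration from the Hugging Face `datasets` library and its `validation` split (the leaderboard figures themselves may come from the official GLUE test server); `predict_duplicate` is a hypothetical placeholder, not any of the models listed.

```python
# Minimal sketch: scoring a duplicate-question classifier on QQP.
# Assumes the GLUE "qqp" configuration from the Hugging Face `datasets` library.
from datasets import load_dataset


def predict_duplicate(q1: str, q2: str) -> int:
    """Hypothetical classifier: return 1 if the two questions are duplicates, else 0."""
    # Trivial baseline used only to make the sketch runnable: exact string match.
    return int(q1.strip().lower() == q2.strip().lower())


# Each example has the fields: question1, question2, label (1 = duplicate).
val = load_dataset("glue", "qqp", split="validation")

correct = sum(
    predict_duplicate(ex["question1"], ex["question2"]) == ex["label"]
    for ex in val
)
print(f"QQP validation accuracy: {correct / len(val):.1%}")
```

Swapping in a real model only requires replacing `predict_duplicate`; the accuracy computation itself stays the same.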