Linguistic Acceptability on CoLA
Evaluation Metrics: Accuracy, MCC (Matthews correlation coefficient)
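CoLA is a binary acceptability-classification task (1 = acceptable, 0 = unacceptable), so both metrics reduce to comparing predicted and gold labels. The sketch below is a minimal illustration of how the two scores are computed; the toy label arrays and the use of scikit-learn are assumptions for the example, not part of this leaderboard.

```python
# Minimal sketch of the two CoLA metrics, assuming binary labels
# where 1 = acceptable and 0 = unacceptable. The arrays are toy data;
# real scores come from a model's predictions on the CoLA split.
from sklearn.metrics import accuracy_score, matthews_corrcoef

y_true = [1, 1, 0, 1, 0, 0, 1, 0]   # gold acceptability judgements
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # a model's predicted labels

# Accuracy: fraction of sentences labelled correctly.
acc = accuracy_score(y_true, y_pred)

# MCC: correlation between gold and predicted labels, in [-1, 1],
# with 0 meaning chance-level performance.
# MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))
mcc = matthews_corrcoef(y_true, y_pred)

print(f"Accuracy: {acc:.1%}  MCC: {mcc:.3f}")
```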
Evaluation Results
The table below lists each model's performance on this benchmark; a minimal evaluation sketch follows the table.
| Model Name | Accuracy | MCC | Paper Title | Repository |
| --- | --- | --- | --- | --- |
| BERT+TDA | 88.2% | 0.726 | Can BERT eat RuCoLA? Topological Data Analysis to Explain | |
| RoBERTa (ensemble) | 67.8% | - | RoBERTa: A Robustly Optimized BERT Pretraining Approach | |
| T5-Base | 51.1% | - | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| LTG-BERT-base 98M | 82.7 | - | Not all layers are equally as important: Every Layer Counts BERT | - |
| En-BERT + TDA | 82.1% | 0.565 | Acceptability Judgements via Examining the Topology of Attention Maps | |
| RemBERT | - | 0.6 | RuCoLA: Russian Corpus of Linguistic Acceptability | |
| 24hBERT | 57.1 | - | How to Train BERT with an Academic Budget | |
| MLM + del-span + reorder | 64.3% | - | CLEAR: Contrastive Learning for Sentence Representation | - |
| ELECTRA | 68.2% | - | - | - |
| ERNIE 2.0 Large | 63.5% | - | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding | |
| deberta-v3-base+tasksource | 87.15% | - | tasksource: A Dataset Harmonization Framework for Streamlined NLP Multi-Task Learning and Evaluation | |
| SqueezeBERT | 46.5% | - | SqueezeBERT: What can computer vision teach NLP about efficient neural networks? | |
| T5-XL 3B | 67.1% | - | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| FLOATER-large | 69% | - | Learning to Encode Position for Transformer with Continuous Dynamical Model | |
| LM-CPPF RoBERTa-base | 14.1% | - | LM-CPPF: Paraphrasing-Guided Data Augmentation for Contrastive Prompt-Based Few-Shot Fine-Tuning | |
| StructBERT RoBERTa ensemble | 69.2% | - | StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | - |
| data2vec | 60.3% | - | data2vec: A General Framework for Self-supervised Learning in Speech, Vision and Language | |
| ERNIE | 52.3% | - | ERNIE: Enhanced Language Representation with Informative Entities | |
| Q8BERT (Zafrir et al., 2019) | 65.0 | - | Q8BERT: Quantized 8Bit BERT | |
| T5-Small | 41.0% | - | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
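For context on how an entry like those above is typically produced, here is a hedged sketch that scores a fine-tuned classifier on the CoLA validation split. The use of the Hugging Face `datasets` and `transformers` libraries is an assumption, and `your-org/bert-finetuned-cola` is a hypothetical checkpoint name standing in for whichever model is being evaluated.

```python
# Hedged sketch: score a fine-tuned binary classifier on the CoLA validation
# split. "your-org/bert-finetuned-cola" is a hypothetical checkpoint name;
# substitute the model you actually want to evaluate.
from datasets import load_dataset
from sklearn.metrics import accuracy_score, matthews_corrcoef
from transformers import pipeline

cola = load_dataset("glue", "cola", split="validation")  # fields: sentence, label, idx

clf = pipeline("text-classification", model="your-org/bert-finetuned-cola")

# Assumes the checkpoint emits generic "LABEL_0"/"LABEL_1" ids; map them to 0/1.
preds = [int(out["label"].split("_")[-1])
         for out in clf(cola["sentence"], truncation=True)]

print("Accuracy:", accuracy_score(cola["label"], preds))
print("MCC:", matthews_corrcoef(cola["label"], preds))
```

Note that published leaderboard numbers are usually computed on the held-out GLUE test set, scored via the official evaluation server, rather than on the validation split used in this sketch.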