Coreference Resolution On Winograd Schema
Evaluation Metric: Accuracy

Evaluation Results
Performance results of each model on this benchmark.
| Model Name | Accuracy (%) | Paper Title |
| --- | --- | --- |
| PaLM 540B (fine-tuned) | 100 | PaLM: Scaling Language Modeling with Pathways |
| Vega v2 6B (KD-based prompt transfer) | 98.6 | Toward Efficient Language Model Pretraining and Downstream Adaptation via Self-Evolution: A Case Study on SuperGLUE |
| UL2 20B (fine-tuned) | 98.1 | UL2: Unifying Language Learning Paradigms |
| Turing NLR v5 XXL 5.4B (fine-tuned) | 97.3 | Toward Efficient Language Model Pretraining and Downstream Adaptation via Self-Evolution: A Case Study on SuperGLUE |
| ST-MoE-32B 269B (fine-tuned) | 96.6 | ST-MoE: Designing Stable and Transferable Sparse Expert Models |
| DeBERTa-1.5B | 95.9 | DeBERTa: Decoding-enhanced BERT with Disentangled Attention |
| T5-XXL 11B (fine-tuned) | 93.8 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| ST-MoE-L 4.1B (fine-tuned) | 93.3 | ST-MoE: Designing Stable and Transferable Sparse Expert Models |
| RoBERTa-WinoGrande 355M | 90.1 | WinoGrande: An Adversarial Winograd Schema Challenge at Scale |
| Flan-T5 XXL (zero-shot) | 89.82 | Scaling Instruction-Finetuned Language Models |
| PaLM 540B (5-shot) | 89.5 | PaLM: Scaling Language Modeling with Pathways |
| PaLM 540B (0-shot) | 89.1 | PaLM: Scaling Language Modeling with Pathways |
| PaLM 2-M (1-shot) | 88.1 | PaLM 2 Technical Report |
| PaLM 2-L (1-shot) | 86.9 | PaLM 2 Technical Report |
| FLAN 137B (prompt-tuned) | 86.5 | Finetuned Language Models Are Zero-Shot Learners |
| PaLM 540B (1-shot) | 86.3 | PaLM: Scaling Language Modeling with Pathways |
| PaLM 2-S (1-shot) | 84.6 | PaLM 2 Technical Report |
| TTTTT 3B (fine-tuned) | 84.6 | TTTTTackling WinoGrande Schemas |
| RoBERTa-DPR 355M | 83.1 | WinoGrande: An Adversarial Winograd Schema Challenge at Scale |
| FLAN 137B (zero-shot) | 80.8 | Finetuned Language Models Are Zero-Shot Learners |
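For reference, the accuracy figures above are simply the fraction of Winograd Schema instances for which a model selects the correct referent of the ambiguous pronoun. The following minimal Python sketch illustrates that scoring procedure; the `resolve_pronoun` callable is a hypothetical stand-in for any of the listed models, and the two example schemas are illustrative rather than drawn from the official test set.

```python
from typing import Callable, List, Tuple

# One Winograd schema instance: a sentence containing an ambiguous pronoun,
# two candidate antecedents, and the index of the correct one.
Schema = Tuple[str, List[str], int]

# Illustrative examples (not from the official evaluation set).
EXAMPLES: List[Schema] = [
    ("The trophy doesn't fit in the suitcase because it is too big.",
     ["the trophy", "the suitcase"], 0),
    ("The trophy doesn't fit in the suitcase because it is too small.",
     ["the trophy", "the suitcase"], 1),
]


def accuracy(resolve_pronoun: Callable[[str, List[str]], int],
             examples: List[Schema]) -> float:
    """Return the fraction of schemas where the predicted antecedent matches the gold one."""
    correct = sum(
        1 for sentence, candidates, gold in examples
        if resolve_pronoun(sentence, candidates) == gold
    )
    return correct / len(examples)


if __name__ == "__main__":
    # Hypothetical baseline: always pick the first candidate (chance is ~50%).
    first_candidate = lambda sentence, candidates: 0
    print(f"Accuracy: {accuracy(first_candidate, EXAMPLES):.1%}")
```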