Learning with Noisy Labels on CIFAR-100N
Evaluation Metric: Accuracy (mean)
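The reported figure is the top-1 test accuracy on CIFAR-100, trained with the human-annotated noisy labels of CIFAR-100N; the "(mean)" presumably denotes an average over repeated training runs. A minimal illustration of that aggregation, assuming per-run accuracies are already available (the values below are made up):

```python
import statistics

# Hypothetical top-1 test accuracies (%) from repeated runs of one method.
run_accuracies = [56.4, 57.1, 56.9]

# "Accuracy (mean)" as reported on the leaderboard: the average across runs.
print(f"Accuracy (mean): {statistics.mean(run_accuracies):.2f}")
```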
Evaluation Results
Performance of each model on this benchmark.
| Model | Accuracy (mean) | Paper Title | Repository |
| --- | --- | --- | --- |
| Peer Loss | 57.59 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| ILL | 65.84 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| F-div | 57.10 | When Optimizing $f$-divergence is Robust with Label Noise | |
| PGDF | 74.08 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | |
| T-Revision | 51.55 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
| VolMinNet | 57.80 | Provably End-to-end Label-Noise Learning without Anchor Points | |
| CE | 55.50 | - | - |
| Backward-T | 57.14 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | |
| Co-Teaching | 60.37 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| Positive-LS | 55.84 | Does label smoothing mitigate label noise? | - |
| CAL | 61.73 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| ELR+ | 66.72 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| Divide-Mix | 71.13 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| Negative-LS | 58.59 | To Smooth or Not? When Label Smoothing Meets Noisy Labels | |
| CORES* | 55.72 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR | 58.94 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| Forward-T | 57.01 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | |
| JoCoR | 59.97 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| CORES | 61.15 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| GCE | 56.73 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |
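Several of the entries above (GCE, Forward-T/Backward-T, the label-smoothing variants) are loss-level modifications of standard cross entropy rather than full training pipelines. As one concrete example, here is a minimal sketch of the generalized cross entropy (GCE) loss referenced in the GCE row, written in PyTorch; the choice of q = 0.7 and the toy batch are illustrative assumptions, not the settings behind the leaderboard numbers.

```python
import torch
import torch.nn.functional as F

def gce_loss(logits, targets, q=0.7):
    """Generalized cross entropy: L_q = (1 - p_y^q) / q."""
    probs = F.softmax(logits, dim=1)
    # Probability assigned to the (possibly noisy) labelled class.
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    # q -> 0 recovers standard cross entropy; q = 1 behaves like MAE,
    # which is what makes the loss robust to label noise.
    return ((1.0 - p_y.clamp_min(1e-7) ** q) / q).mean()

# Illustrative usage with random data (shapes only, not real CIFAR-100N).
logits = torch.randn(8, 100)            # batch of 8, 100 classes
targets = torch.randint(0, 100, (8,))   # noisy integer labels
loss = gce_loss(logits, targets)
```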