Learning with Noisy Labels on CIFAR-100N
Evaluation Metric: Accuracy (mean)
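Accuracy (mean) here is top-1 classification accuracy on the clean CIFAR-100 test set for models trained on the CIFAR-100N human-annotated noisy training labels, with the mean presumably taken over repeated training runs. The sketch below shows how such a figure is typically aggregated; the three simulated seeds and the roughly-60%-correct synthetic predictions are placeholders for illustration, not data from the leaderboard.

```python
import numpy as np

def top1_accuracy(preds: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of test images whose predicted class matches the clean label."""
    return float((preds == labels).mean())

# Illustrative stand-in for per-seed predictions on the 10,000-image CIFAR-100
# test set; in practice these would come from models trained on CIFAR-100N.
rng = np.random.default_rng(0)
clean_test_labels = rng.integers(0, 100, size=10_000)
runs = [
    np.where(rng.random(10_000) < 0.60, clean_test_labels,
             rng.integers(0, 100, size=10_000))
    for _ in range(3)  # three simulated seeds, each about 60% correct
]

per_run = [100.0 * top1_accuracy(p, clean_test_labels) for p in runs]
print(f"Accuracy (mean): {np.mean(per_run):.2f} ± {np.std(per_run):.2f}")
```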
Benchmark Results
Performance of each model on this benchmark:

| Model Name | Accuracy (mean) | Paper Title | Repository |
|---|---|---|---|
| Peer Loss | 57.59 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
| ILL | 65.84 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | |
| F-div | 57.10 | When Optimizing $f$-divergence is Robust with Label Noise | |
| PGDF | 74.08 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | |
| T-Revision | 51.55 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
| VolMinNet | 57.80 | Provably End-to-end Label-Noise Learning without Anchor Points | |
| CE | 55.50 | - | - |
| Backward-T | 57.14 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | |
| Co-Teaching | 60.37 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| Positive-LS | 55.84 | Does label smoothing mitigate label noise? | - |
| CAL | 61.73 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| ELR+ | 66.72 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| Divide-Mix | 71.13 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| Negative-LS | 58.59 | To Smooth or Not? When Label Smoothing Meets Noisy Labels | |
| CORES* | 55.72 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR | 58.94 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| Forward-T | 57.01 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | |
| JoCoR | 59.97 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| CORES | 61.15 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| GCE | 56.73 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |