Learning With Noisy Labels
Learning With Noisy Labels on CIFAR-10N (Worst)
Evaluation metric: Accuracy (mean), reported in percent
Evaluation results: performance of each model on this benchmark
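Accuracy (mean) is presumably the clean test-set accuracy averaged over several independent training runs (e.g. different random seeds); the benchmark page does not state this explicitly. The snippet below is a minimal sketch of that aggregation under this assumption, with `mean_accuracy` and the dummy arrays being illustrative names rather than part of the benchmark's tooling.

```python
import numpy as np

def mean_accuracy(per_run_predictions, true_labels):
    """Average test accuracy (in %) over several independent training runs.

    per_run_predictions: list of 1-D integer arrays, one per run/seed,
                         each holding predicted class ids for the test set.
    true_labels:         1-D integer array of ground-truth test labels.
    """
    accs = [np.mean(preds == true_labels) for preds in per_run_predictions]
    return 100.0 * float(np.mean(accs)), 100.0 * float(np.std(accs))

# Example with dummy data: three "runs" on a 4-example test set.
truth = np.array([0, 1, 2, 3])
runs = [np.array([0, 1, 2, 3]),   # 100% accurate
        np.array([0, 1, 2, 0]),   # 75%
        np.array([0, 0, 2, 3])]   # 75%
mean_acc, std_acc = mean_accuracy(runs, truth)
print(f"Accuracy (mean): {mean_acc:.2f} ± {std_acc:.2f}")  # 83.33 ± 11.79
```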
| Model name | Accuracy (mean) | Paper Title | Repository |
| --- | --- | --- | --- |
| Co-Teaching | 83.83 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | |
| ELR | 83.58 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| GNL | 86.99 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | |
| T-Revision | 80.48 | Are Anchor Points Really Indispensable in Label-Noise Learning? | |
| JoCoR | 83.37 | Combating noisy labels by agreement: A joint training method with co-regularization | |
| PSSCL | 95.12 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | |
| PGDF | 93.65 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | |
| CE | 77.69 | - | - |
| Negative-LS | 82.99 | Understanding Generalized Label Smoothing when Learning with Noisy Labels | - |
| Forward-T | 79.79 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | |
| Co-Teaching+ | 83.26 | How does Disagreement Help Generalization against Label Corruption? | |
| F-div | 82.53 | When Optimizing $f$-divergence is Robust with Label Noise | |
| ProMix | 96.16 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility | |
| GCE | 80.66 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | |
| Divide-Mix | 92.56 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | |
| CAL | 85.36 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | |
| VolMinNet | 80.53 | Provably End-to-end Label-Noise Learning without Anchor Points | |
| CORES | 83.60 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | |
| ELR+ | 91.09 | Early-Learning Regularization Prevents Memorization of Noisy Labels | |
| Peer Loss | 82.53 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | |
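For context, CIFAR-10N replaces the CIFAR-10 training labels with human-annotated noisy labels; the "Worst" setting uses the most error-prone annotation per image, and methods in the table are trained on these noisy labels and evaluated on the clean CIFAR-10 test set. The sketch below shows one plausible way to attach such labels to a standard torchvision CIFAR-10 loader. The file name `CIFAR-10_human.pt` and the key `worse_label` follow the public CIFAR-10N release, but treat the exact paths and keys as assumptions to verify against that release.

```python
import torch
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Assumption: the CIFAR-10N label file (CIFAR-10_human.pt) from the official
# release sits next to this script and contains a dict with keys such as
# 'clean_label' and 'worse_label' (one label per training image).
noise_file = torch.load("CIFAR-10_human.pt", map_location="cpu")
worst_labels = noise_file["worse_label"]

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])

# Standard CIFAR-10 images, with the clean training targets swapped for the
# CIFAR-10N "Worst" annotations; the test set keeps its clean labels.
train_set = datasets.CIFAR10(root="./data", train=True, download=True, transform=transform)
train_set.targets = list(map(int, worst_labels))
test_set = datasets.CIFAR10(root="./data", train=False, download=True, transform=transform)

train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=2)
test_loader = DataLoader(test_set, batch_size=256, shuffle=False, num_workers=2)

# A model trained with plain cross-entropy on train_loader and scored on
# test_loader corresponds to the "CE" baseline row in the table above.
```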