Learning with Noisy Labels on CIFAR-100N

Metrics

Accuracy (mean)

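The reported metric is top-1 classification accuracy on the clean CIFAR-100 test split, and "mean" refers to averaging this accuracy over several independent training runs. Below is a minimal sketch of how such a number could be computed with PyTorch; the model, the number of runs, and the helper functions in the comments are illustrative assumptions, not part of the benchmark definition.

```python
import torch

def test_accuracy(model, test_loader, device="cuda"):
    """Top-1 accuracy (%) of `model` on the clean CIFAR-100 test split."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for images, labels in test_loader:
            images, labels = images.to(device), labels.to(device)
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    return 100.0 * correct / total

# "Accuracy (mean)": average the per-run test accuracy over independent seeds.
# `train_one_run` and `make_test_loader` are hypothetical helpers.
# accuracies = [test_accuracy(train_one_run(seed), make_test_loader())
#               for seed in range(3)]
# mean_accuracy = sum(accuracies) / len(accuracies)
```
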
Results

Performance results of various models on this benchmark

| Model name | Accuracy (mean) | Paper Title |
|---|---|---|
| Peer Loss | 57.59 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| ILL | 65.84 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations |
| F-div | 57.10 | When Optimizing $f$-divergence is Robust with Label Noise |
| PGDF | 74.08 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels |
| T-Revision | 51.55 | Are Anchor Points Really Indispensable in Label-Noise Learning? |
| VolMinNet | 57.80 | Provably End-to-end Label-Noise Learning without Anchor Points |
| CE | 55.50 | - |
| Backward-T | 57.14 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| Co-Teaching | 60.37 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| Positive-LS | 55.84 | Does label smoothing mitigate label noise? |
| CAL | 61.73 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels |
| ELR+ | 66.72 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| Divide-Mix | 71.13 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning |
| Negative-LS | 58.59 | To Smooth or Not? When Label Smoothing Meets Noisy Labels |
| CORES* | 55.72 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| ELR | 58.94 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| Forward-T | 57.01 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| JoCoR | 59.97 | Combating noisy labels by agreement: A joint training method with co-regularization |
| CORES | 61.15 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| GCE | 56.73 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels |
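
All entries above are trained on the CIFAR-100 training images paired with the human-annotated noisy labels from the CIFAR-N release and are evaluated against the original clean test labels. The sketch below shows one way to attach those noisy labels to the standard torchvision dataset; the file name `CIFAR-100_human.pt` and the `noisy_label` key follow the public CIFAR-N label files but should be treated as assumptions and checked against the release you download.

```python
import torch
from torchvision import datasets, transforms

# Assumed layout of the CIFAR-N label file: a dict with 'clean_label' and
# 'noisy_label' arrays aligned with the CIFAR-100 training split order.
noise = torch.load("CIFAR-100_human.pt")   # path/file name is an assumption
noisy_labels = noise["noisy_label"]

train_set = datasets.CIFAR100(root="./data", train=True, download=True,
                              transform=transforms.ToTensor())
# Swap the clean training labels for the human-annotated noisy ones;
# the clean test split (train=False) stays untouched for evaluation.
train_set.targets = list(noisy_labels)
```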