HyperAI

Learning with Noisy Labels on CIFAR-10N

Metrics

Accuracy (mean)
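The reported metric is test-set accuracy averaged over repeated runs (e.g. different random seeds). As a minimal sketch of how such a number is computed, assuming hypothetical per-run accuracies for illustration:

```python
from statistics import mean

def accuracy(preds, labels):
    """Fraction of predictions that match the clean test labels."""
    assert len(preds) == len(labels)
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Hypothetical test accuracies from three training runs of one method.
run_accuracies = [0.9561, 0.9548, 0.9574]

# The leaderboard-style "Accuracy (mean)" figure, as a percentage.
print(round(mean(run_accuracies) * 100, 2))
```

The values above are placeholders, not results from the table; only the averaging procedure is being illustrated.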

Results

Performance results of various models on this benchmark

| Model Name | Accuracy (mean) | Paper Title |
|---|---|---|
| SOP+ | 95.61 | Robust Training under Label Noise by Over-parameterization |
| ILL | 95.47 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations |
| CORES* | 95.25 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| CE | 87.77 | - |
| CAL | 91.97 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels |
| Forward-T | 88.24 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| PGDF | 96.11 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels |
| Co-Teaching | 91.20 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| ELR+ | 94.83 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| GNL | 92.57 | Partial Label Supervision for Agnostic Generative Noisy Label Learning |
| F-div | 91.64 | When Optimizing $f$-divergence is Robust with Label Noise |
| ProMix | 97.39 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility |
| ELR | 92.38 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| Backward-T | 88.13 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| Peer Loss | 90.75 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| Positive-LS | 91.57 | Does label smoothing mitigate label noise? |
| GCE | 87.85 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels |
| T-Revision | 88.52 | Are Anchor Points Really Indispensable in Label-Noise Learning? |
| JoCoR | 91.44 | Combating noisy labels by agreement: A joint training method with co-regularization |
| CORES | 91.23 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |