Learning With Noisy Labels On Cifar 10N Worst
Metric
Accuracy (mean)
Results
Performance results of various models on this benchmark
| Model Name | Accuracy (mean) | Paper Title | Repository |
|---|---|---|---|
| Co-Teaching | 83.83 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | - |
| ELR | 83.58 | Early-Learning Regularization Prevents Memorization of Noisy Labels | - |
| GNL | 86.99 | Partial Label Supervision for Agnostic Generative Noisy Label Learning | - |
| T-Revision | 80.48 | Are Anchor Points Really Indispensable in Label-Noise Learning? | - |
| JoCoR | 83.37 | Combating noisy labels by agreement: A joint training method with co-regularization | - |
| PSSCL | 95.12 | PSSCL: A progressive sample selection framework with contrastive loss designed for noisy labels | - |
| PGDF | 93.65 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | - |
| CE | 77.69 | - | - |
| Negative-LS | 82.99 | Understanding Generalized Label Smoothing when Learning with Noisy Labels | - |
| Forward-T | 79.79 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach | - |
| Co-Teaching+ | 83.26 | How does Disagreement Help Generalization against Label Corruption? | - |
| F-div | 82.53 | When Optimizing $f$-divergence is Robust with Label Noise | - |
| ProMix | 96.16 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility | - |
| GCE | 80.66 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | - |
| Divide-Mix | 92.56 | DivideMix: Learning with Noisy Labels as Semi-supervised Learning | - |
| CAL | 85.36 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | - |
| VolMinNet | 80.53 | Provably End-to-end Label-Noise Learning without Anchor Points | - |
| CORES | 83.60 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | - |
| ELR+ | 91.09 | Early-Learning Regularization Prevents Memorization of Noisy Labels | - |
| Peer Loss | 82.53 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates | - |
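On this benchmark, models are trained on CIFAR-10 images whose training labels are replaced by the human-annotated "worst" noisy labels from CIFAR-10N, and Accuracy (mean) is the average classification accuracy on the clean CIFAR-10 test set across runs. The following is a minimal PyTorch sketch of that evaluation step, not any listed method's implementation; the label file name `CIFAR-10_human.pt`, the key `worse_label`, and the `model` variable are assumptions based on the public CIFAR-10N release rather than details given on this page.

```python
# Minimal sketch (with assumptions noted below): evaluate a trained classifier
# on the clean CIFAR-10 test set, which is what "Accuracy (mean)" averages
# over training runs. Training itself would use the CIFAR-10N "worst" labels.
import torch
import torchvision
import torchvision.transforms as T

transform = T.Compose([
    T.ToTensor(),
    # Commonly used CIFAR-10 normalization statistics.
    T.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])

# Evaluation always uses the original, clean CIFAR-10 test labels.
test_set = torchvision.datasets.CIFAR10(root="./data", train=False,
                                         transform=transform, download=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=256)

# Noisy training labels: file name and key are assumptions taken from the
# CIFAR-10N release; they would replace train_set.targets during training.
noisy_labels = torch.load("./data/CIFAR-10_human.pt")
worst_labels = noisy_labels["worse_label"]

@torch.no_grad()
def test_accuracy(model, loader, device="cpu"):
    """Return clean-test accuracy (%) of a trained classifier."""
    model.eval()
    correct = total = 0
    for images, labels in loader:
        logits = model(images.to(device))
        correct += (logits.argmax(dim=1).cpu() == labels).sum().item()
        total += labels.size(0)
    return 100.0 * correct / total

# "Accuracy (mean)" would then be the average of test_accuracy(...) over
# several independently trained models (e.g. different random seeds).
```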