Learning with Noisy Labels on CIFAR-10N (Worst)
Metrics
Accuracy (mean)
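For reference, a minimal sketch of how a figure like "Accuracy (mean)" is typically obtained: top-1 accuracy on the clean CIFAR-10 test set, averaged over repeated training runs and reported as a percentage. The averaging protocol and the toy data below are assumptions for illustration, not taken from this table.

```python
import numpy as np

def accuracy(predictions, labels):
    """Fraction of samples whose predicted class matches the clean test label."""
    return float(np.mean(np.asarray(predictions) == np.asarray(labels)))

def mean_accuracy(per_run_predictions, test_labels):
    """Assumed protocol: average test accuracy over independent runs, as a percentage."""
    per_run = [accuracy(p, test_labels) for p in per_run_predictions]
    return 100.0 * float(np.mean(per_run))

# Toy usage with hypothetical data (10 CIFAR-10 classes, 3 runs):
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=10_000)
runs = [
    np.where(rng.random(10_000) < 0.85, labels, rng.integers(0, 10, size=10_000))
    for _ in range(3)
]
print(f"Accuracy (mean): {mean_accuracy(runs, labels):.2f}")
```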
Results
Performance results of different models on this benchmark
Comparison table
Model name | Accuracy (mean) |
---|---|
co-teaching-robust-training-of-deep-neural | 83.83 |
early-learning-regularization-prevents | 83.58 |
generative-noisy-label-learning-by-implicit | 86.99 |
190600189 | 80.48 |
combating-noisy-labels-by-agreement-a-joint | 83.37 |
psscl-a-progressive-sample-selection | 95.12 |
sample-prior-guided-robust-model-learning-to | 93.65 |
Model 8 | 77.69 |
understanding-generalized-label-smoothing-1 | 82.99 |
making-deep-neural-networks-robust-to-label | 79.79 |
how-does-disagreement-help-generalization | 83.26 |
when-optimizing-f-divergence-is-robust-with-1 | 82.53 |
promix-combating-label-noise-via-maximizing | 96.16 |
generalized-cross-entropy-loss-for-training | 80.66 |
dividemix-learning-with-noisy-labels-as-semi-1 | 92.56 |
clusterability-as-an-alternative-to-anchor | 85.36 |
provably-end-to-end-label-noise-learning | 80.53 |
learning-with-instance-dependent-label-noise-1 | 83.60 |
early-learning-regularization-prevents | 91.09 |
peer-loss-functions-learning-from-noisy | 82.53 |
robust-training-under-label-noise-by-over | 93.24 |
does-label-smoothing-mitigate-label-noise | 82.76 |
making-deep-neural-networks-robust-to-label | 77.61 |
learning-with-instance-dependent-label-noise-1 | 91.66 |
imprecise-label-learning-a-unified-framework | 93.58 |