Learning with Noisy Labels on CIFAR-10N-1
Evaluation Metric
Accuracy (mean)
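The reported score is the mean test-set accuracy, which is typically averaged over several independent training runs (e.g., different random seeds). A minimal sketch of how such a score could be computed is shown below; the run count, predictions, and labels are illustrative assumptions, not data from this benchmark.

```python
# Minimal sketch (assumption): mean test accuracy averaged over several runs/seeds.
# The per-run predictions and labels below are illustrative placeholders.
import numpy as np


def accuracy(preds: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of correctly classified test examples, as a percentage."""
    return 100.0 * float(np.mean(preds == labels))


def mean_accuracy(per_run_preds: list[np.ndarray], labels: np.ndarray) -> float:
    """Average test accuracy across independent training runs."""
    return float(np.mean([accuracy(p, labels) for p in per_run_preds]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 10, size=10_000)  # CIFAR-10 test set size
    # Three hypothetical runs; real predictions would come from trained models.
    runs = [
        np.where(rng.random(10_000) < 0.9, labels, rng.integers(0, 10, size=10_000))
        for _ in range(3)
    ]
    print(f"Accuracy (mean): {mean_accuracy(runs, labels):.2f}%")
```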
Evaluation Results
Performance of each model on this benchmark.
Comparison Table
Model Name | Accuracy (mean) |
---|---|
understanding-generalized-label-smoothing | 90.29 |
when-optimizing-f-divergence-is-robust-with-1 | 89.70 |
provably-end-to-end-label-noise-learning | 88.30 |
combating-noisy-labels-by-agreement-a-joint | 90.30 |
generative-noisy-label-learning-by-implicit | 91.97 |
190600189 | 88.33 |
making-deep-neural-networks-robust-to-label | 87.14 |
making-deep-neural-networks-robust-to-label | 86.88 |
sample-prior-guided-robust-model-learning-to | 96.01 |
how-does-disagreement-help-generalization | 89.70 |
peer-loss-functions-learning-from-noisy | 89.06 |
psscl-a-progressive-sample-selection | 96.17 |
learning-with-instance-dependent-label-noise-1 | 94.45 |
co-teaching-robust-training-of-deep-neural | 90.33 |
early-learning-regularization-prevents | 91.46 |
early-learning-regularization-prevents | 94.43 |
generalized-cross-entropy-loss-for-training | 87.61 |
dividemix-learning-with-noisy-labels-as-semi-1 | 90.18 |
imprecise-label-learning-a-unified-framework | 94.85 |
does-label-smoothing-mitigate-label-noise | 89.80 |
learning-with-instance-dependent-label-noise-1 | 89.66 |
promix-combating-label-noise-via-maximizing | 96.97 |
clusterability-as-an-alternative-to-anchor | 90.93 |
robust-training-under-label-noise-by-over | 95.28 |