Learning With Noisy Labels on CIFAR-10N
Evaluation Metric
Accuracy (mean)

Evaluation Results
Performance of each model on this benchmark.
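Accuracy (mean) is presumably the top-1 accuracy on the CIFAR-10 test set averaged over repeated training runs; the exact protocol (number of runs, choice of CIFAR-10N noise split) is not stated on this page. The sketch below is only a minimal illustration of how such a score could be computed under those assumptions; the `accuracy` and `mean_accuracy` helpers and the placeholder predictions are hypothetical.

```python
import numpy as np

def accuracy(preds: np.ndarray, labels: np.ndarray) -> float:
    """Top-1 accuracy: fraction of test images whose predicted class matches the label."""
    return float((preds == labels).mean())

def mean_accuracy(runs: list[tuple[np.ndarray, np.ndarray]]) -> float:
    """Average the per-run accuracies, e.g. over training runs with different seeds."""
    return float(np.mean([accuracy(preds, labels) for preds, labels in runs]))

# Hypothetical usage with placeholder predictions for a 10,000-image test set.
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=10_000)
runs = [(rng.integers(0, 10, size=10_000), labels) for _ in range(3)]
print(f"Accuracy (mean): {100 * mean_accuracy(runs):.2f}%")
```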
| Model Name | Accuracy (mean) | Paper Title |
| --- | --- | --- |
| SOP+ | 95.61 | Robust Training under Label Noise by Over-parameterization |
| ILL | 95.47 | Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations |
| CORES* | 95.25 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
| CE | 87.77 | - |
| CAL | 91.97 | Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels |
| Forward-T | 88.24 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| PGDF | 96.11 | Sample Prior Guided Robust Model Learning to Suppress Noisy Labels |
| Co-Teaching | 91.20 | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| ELR+ | 94.83 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| GNL | 92.57 | Partial Label Supervision for Agnostic Generative Noisy Label Learning |
| F-div | 91.64 | When Optimizing $f$-divergence is Robust with Label Noise |
| ProMix | 97.39 | ProMix: Combating Label Noise via Maximizing Clean Sample Utility |
| ELR | 92.38 | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| Backward-T | 88.13 | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| Peer Loss | 90.75 | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| Positive-LS | 91.57 | Does label smoothing mitigate label noise? |
| GCE | 87.85 | Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels |
| T-Revision | 88.52 | Are Anchor Points Really Indispensable in Label-Noise Learning? |
| JoCoR | 91.44 | Combating noisy labels by agreement: A joint training method with co-regularization |
| CORES | 91.23 | Learning with Instance-Dependent Label Noise: A Sample Sieve Approach |
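The CE entry in the table is the plain cross-entropy baseline: a classifier trained directly on the noisy CIFAR-10N labels with no noise-handling mechanism. Below is a minimal sketch of such a baseline, assuming the CIFAR-10N training labels have already been loaded into `noisy_labels`; the architecture, hyperparameters, and single-epoch loop are illustrative placeholders, not the setup behind the 87.77 figure.

```python
import numpy as np
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T

# Standard CIFAR-10 preprocessing (statistics are the usual published values).
transform = T.Compose([
    T.ToTensor(),
    T.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])
train_set = torchvision.datasets.CIFAR10("./data", train=True, download=True,
                                         transform=transform)

# Assumption: `noisy_labels` holds the 50,000 human-annotated CIFAR-10N training
# labels, loaded elsewhere. As a runnable placeholder, the clean labels are reused here.
noisy_labels = np.asarray(train_set.targets)
train_set.targets = noisy_labels.tolist()

loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)
model = torchvision.models.resnet18(num_classes=10)   # illustrative architecture
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9,
                            weight_decay=5e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(1):            # illustrative; real baselines train far longer
    for images, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
```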