HyperAI초신경
Neural Architecture Search

Neural Architecture Search on CIFAR-10
Evaluation Metrics

Parameters
Search Time (GPU days)
Top-1 Error Rate

Evaluation Results

Performance of each model on this benchmark.
| Model Name | Parameters | Search Time (GPU days) | Top-1 Error Rate | Paper Title |
|---|---|---|---|---|
| GDAS | - | 0.21 | 3.4% | Searching for A Robust Neural Architecture in Four GPU Hours |
| Bonsai-Net | 2.9M | 0.10 | 3.35% | Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners |
| Net2 (2) | - | - | 3.3% | Efficacy of Neural Prediction-Based Zero-Shot NAS |
| μDARTS | - | 0.1 | 3.277% | μDARTS: Model Uncertainty-Aware Differentiable Architecture Search |
| NN-MASS-CIFAR-C | 3.82M | 0 | 3.18% | How does topology of neural architectures impact gradient propagation and model performance? |
| NN-MASS-CIFAR-A | 5.02M | 0 | 3.0% | How does topology of neural architectures impact gradient propagation and model performance? |
| DARTS (first order) | 3.3M | 1.5 | 3.0% | DARTS: Differentiable Architecture Search |
| NASGEP | - | 1 | 2.82% | Optimizing Neural Architecture Search using Limited GPU Time in a Dynamic Search Space: A Gene Expression Programming Approach |
| AlphaX-1 (cutout NASNet) | - | 224 | 2.82% | AlphaX: eXploring Neural Architectures with Deep Neural Networks and Monte Carlo Tree Search |
| DARTS (second order) | 3.3M | 4 | 2.76% | DARTS: Differentiable Architecture Search |
| SETN (T=1K) + CutOut | - | 1.8 | 2.69% | One-Shot Neural Architecture Search via Self-Evaluated Template Network |
| DARTS-PRIME | 3.7M | 0.5 | 2.62% | DARTS-PRIME: Regularization and Scheduling Improve Constrained Optimization in Differentiable NAS |
| NAT-M1 | 4.3M | 1.0 | 2.6% | Neural Architecture Transfer |
| PC-DARTS | 3.6M | 0.1 | 2.57% | PC-DARTS: Partial Channel Connections for Memory-Efficient Architecture Search |
| arch2vec | 3.6M | 10.5 | 2.56% | Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? |
| FairDARTS-a | 2.8M | 0.25 | 2.54% | Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search |
| MSR-DARTS | 4.0M | 0.3 | 2.54% | MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search |
| Soft Parameter Sharing | - | 0.7 | 2.53% | Learning Implicitly Recurrent CNNs Through Parameter Sharing |
| β-DARTS | - | - | 2.53% | β-DARTS: Beta-Decay Regularization for Differentiable Architecture Search |
| TNASP | 3.7M | 0.3 | 2.52% | TNASP: A Transformer-based NAS Predictor with a Self-evolution Framework |
Showing 20 of 41 entries.
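The Top-1 Error Rate column is the standard CIFAR-10 test metric: the fraction of test images for which the model's highest-scoring class differs from the true label. A minimal sketch of that computation, assuming plain Python lists of class scores and integer labels (the function name `top1_error_rate` is ours, not from any of the listed papers):

```python
def top1_error_rate(logits, labels):
    """Fraction of samples whose highest-scoring class is not the true label."""
    errors = 0
    for scores, label in zip(logits, labels):
        pred = max(range(len(scores)), key=scores.__getitem__)  # argmax over classes
        if pred != label:
            errors += 1
    return errors / len(labels)

# Toy example: 4 samples over 3 classes; the last one is misclassified.
logits = [[0.1, 0.8, 0.1],
          [0.7, 0.2, 0.1],
          [0.2, 0.3, 0.5],
          [0.6, 0.3, 0.1]]
labels = [1, 0, 2, 1]
print(top1_error_rate(logits, labels))  # 0.25
```

A value of 2.52% in the table thus means roughly 252 of the 10,000 CIFAR-10 test images were misclassified.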