Neural Architecture Search on CIFAR-100
Metrics

- FLOPs (floating-point operations per forward pass)
- Params (number of model parameters)
- Percentage Error (classification error on CIFAR-100, %)
Results

Performance results of various models on this benchmark:

| Model Name | FLOPs | Params | Percentage Error | Paper Title |
| --- | --- | --- | --- | --- |
| μDARTS | - | 602M | 19.39 | μDARTS: Model Uncertainty-Aware Differentiable Architecture Search |
| NASGEP | - | - | 18.83 | Optimizing Neural Architecture Search using Limited GPU Time in a Dynamic Search Space: A Gene Expression Programming Approach |
| DARTS-PRIME | - | 3.16M | 17.44 | DARTS-PRIME: Regularization and Scheduling Improve Constrained Optimization in Differentiable NAS |
| DU-DARTS | - | 3.1M | 16.74 | DU-DARTS: Decreasing the Uncertainty of Differentiable Architecture Search |
| β-DARTS | - | - | 16.52 | β-DARTS: Beta-Decay Regularization for Differentiable Architecture Search |
| ZenNet-2.0M | 487M | 2.0M | 15.6 | Zen-NAS: A Zero-Shot NAS for High-Performance Deep Image Recognition |
| NAT-M1 | 261M | 3.8M | 14.0 | Neural Architecture Transfer |
| MUXNet-m | 200M | 2.1M | 13.9 | MUXConv: Information Multiplexing in Convolutional Neural Networks |
| NAT-M2 | 398M | 6.4M | 12.5 | Neural Architecture Transfer |
| NAT-M3 | 492M | 7.8M | 12.3 | Neural Architecture Transfer |
| DNA-c | - | - | 11.7 | Blockwisely Supervised Neural Architecture Search with Knowledge Distillation |
| NAT-M4 | 796M | 9.0M | 11.7 | Neural Architecture Transfer |
| Balanced Mixture | - | - | - | Balanced Mixture of SuperNets for Learning the CNN Pooling Architecture |
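For convenience, below is a minimal Python sketch (not part of the original leaderboard) showing one way to encode the rows above and query them locally, e.g. to find the lowest reported error under a FLOPs budget. The 500 MFLOPs budget is an arbitrary example, and the "Balanced Mixture" entry is omitted because it reports no metrics.

```python
# Sketch only: the leaderboard rows above as plain Python records.
# Missing values are kept as None; "602M" is encoded as 602e6, etc.
rows = [
    # (model, flops, params, percentage_error)
    ("μDARTS",      None,  602e6,  19.39),
    ("NASGEP",      None,  None,   18.83),
    ("DARTS-PRIME", None,  3.16e6, 17.44),
    ("DU-DARTS",    None,  3.1e6,  16.74),
    ("β-DARTS",     None,  None,   16.52),
    ("ZenNet-2.0M", 487e6, 2.0e6,  15.6),
    ("NAT-M1",      261e6, 3.8e6,  14.0),
    ("MUXNet-m",    200e6, 2.1e6,  13.9),
    ("NAT-M2",      398e6, 6.4e6,  12.5),
    ("NAT-M3",      492e6, 7.8e6,  12.3),
    ("DNA-c",       None,  None,   11.7),
    ("NAT-M4",      796e6, 9.0e6,  11.7),
]

# Lowest reported error overall.
best = min(rows, key=lambda r: r[3])
print("Lowest error:", best[0], best[3])

# Lowest error among models that report FLOPs and fit a
# hypothetical 500 MFLOPs budget.
budget = 500e6
under_budget = [r for r in rows if r[1] is not None and r[1] <= budget]
print("Best under 500 MFLOPs:", min(under_budget, key=lambda r: r[3])[0])
```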