Architecture Search
Neural Architecture Search (NAS) automates the design of artificial neural networks (ANNs). Instead of hand-crafting a network, a search algorithm explores a space of candidate structures and optimizes over it to discover architectures that perform better on the target task. The practical value of NAS lies in cutting the time spent on manual architecture design and parameter tuning while improving the efficiency and accuracy of the resulting models.
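To make the idea concrete, the sketch below implements the simplest form of NAS, random search: sample candidate architectures from a hand-defined search space, score each one, and keep the best. The search space, the evaluate_architecture proxy, and the evaluation budget are hypothetical placeholders chosen for illustration; a real NAS run would train each candidate (or a shared supernet) and measure validation accuracy instead of computing a synthetic score.

```python
import random

# Minimal random-search NAS sketch (illustrative only).
# The search space, proxy score, and budget below are hypothetical
# stand-ins, not taken from any specific NAS paper.

SEARCH_SPACE = {
    "num_layers":  [2, 4, 6, 8],
    "width":       [64, 128, 256],
    "kernel_size": [3, 5, 7],
    "activation":  ["relu", "gelu", "swish"],
}

def sample_architecture(rng: random.Random) -> dict:
    """Draw one candidate architecture from the search space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def evaluate_architecture(arch: dict) -> float:
    """Proxy score standing in for 'train the network and measure
    validation accuracy'; a real NAS run would do that training here."""
    rng = random.Random(str(sorted(arch.items())))  # deterministic per architecture
    capacity = arch["num_layers"] * arch["width"]
    return 0.80 + 0.10 * min(capacity / 2048, 1.0) + rng.uniform(-0.02, 0.02)

def random_search(budget: int = 50, seed: int = 0):
    """Evaluate `budget` random candidates and keep the best one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(rng)
        score = evaluate_architecture(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(f"best architecture: {arch}  (proxy score {score:.3f})")
```

In practice the expensive evaluation step is the bottleneck, which is what several of the benchmarks listed below address: tabular benchmarks such as NAS-Bench-101 and NAS-Bench-201 replace per-candidate training with a precomputed lookup.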
Benchmarks used to evaluate NAS methods, together with the top model reported on each (a hyphen marks benchmarks for which no model is listed):

Benchmark                             Top-reported model
CIFAR-10                              NAT-M4
CIFAR-10 Image Classification         EEEA-Net-C (b=5)+ CO
CIFAR-100                             NAT-M4
CINIC-10                              NAT-M4
DTD                                   NAT-M4
FGVC Aircraft                         NAT-M4
Food-101                              Balanced Mixture
ImageNet                              DeepMAD-50M
LIDC-IDRI                             NASLung (ours)
MNIST                                 -
NAS-Bench-101                         FireFly
NAS-Bench-201                         Improved FireFly Algorithm
NAS-Bench-201, CIFAR-10               DiNAS
NAS-Bench-201, CIFAR-100              IS-DARTS
NAS-Bench-201, ImageNet-16-120        CR-LSO
NAS-Bench-301                         DiNAS
NATS-Bench Size, CIFAR-10             BossNAS
NATS-Bench Size, CIFAR-100            -
NATS-Bench Size, ImageNet16-120       -
NATS-Bench Topology, CIFAR-10         -
NATS-Bench Topology, CIFAR-100        -
NATS-Bench Topology, ImageNet16-120   GreenMachine-1
Oxford 102 Flowers                    NAT-M4
Oxford-IIIT Pet Dataset               NAT-M4
Stanford Cars                         NAT-M4
STL-10                                NAT-M4
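Several of the entries above (NAS-Bench-101, NAS-Bench-201, NAS-Bench-301, and the NATS-Bench Size and Topology spaces) are tabular or surrogate benchmarks: the performance of every architecture in a fixed cell-based search space has been precomputed, so a search strategy can be assessed by table lookup rather than by training each candidate from scratch. The sketch below mocks that workflow with a toy lookup table and a small mutation-only evolutionary search; the OPS list, the MockTabularBenchmark class, and all accuracies are invented for illustration and do not reproduce the real benchmark APIs.

```python
import itertools
import random

# Toy stand-in for a tabular NAS benchmark such as NAS-Bench-201: every
# architecture in a small cell-based search space maps to a precomputed
# (here: fake) test accuracy, so search costs one table lookup per candidate.

OPS = ["none", "skip_connect", "conv_1x1", "conv_3x3", "avg_pool_3x3"]
NUM_EDGES = 3  # real NAS-Bench-201 cells have 6 edges; 3 keeps the toy table small

class MockTabularBenchmark:
    """Maps an architecture (tuple of per-edge operations) to a fake accuracy."""

    def __init__(self, seed: int = 0):
        rng = random.Random(seed)
        self.table = {
            arch: rng.uniform(0.70, 0.95)
            for arch in itertools.product(OPS, repeat=NUM_EDGES)
        }

    def query(self, arch: tuple) -> float:
        """Table lookup standing in for 'train this cell and report accuracy'."""
        return self.table[arch]

def mutate(arch: tuple, rng: random.Random) -> tuple:
    """Change the operation on one randomly chosen edge."""
    edge = rng.randrange(NUM_EDGES)
    new = list(arch)
    new[edge] = rng.choice([op for op in OPS if op != arch[edge]])
    return tuple(new)

def evolutionary_search(bench, generations: int = 30, pop_size: int = 8, seed: int = 1):
    """Tiny mutation-only evolutionary search evaluated via benchmark lookups."""
    rng = random.Random(seed)
    population = rng.sample(list(bench.table), pop_size)
    for _ in range(generations):
        parent = max(rng.sample(population, 3), key=bench.query)  # tournament selection
        child = mutate(parent, rng)
        population.append(child)
        population.pop(0)  # drop the oldest member, as in regularized evolution
    best = max(population, key=bench.query)
    return best, bench.query(best)

if __name__ == "__main__":
    bench = MockTabularBenchmark()
    arch, acc = evolutionary_search(bench)
    print(f"best cell found: {arch} -> {acc:.3f}")
```

Because every method queries the same precomputed table, results reported on these benchmark spaces are directly comparable across search strategies, which is why they appear so often in the list above.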