
LayerNAS: Neural Architecture Search in Polynomial Complexity

Yicheng Fan, Dana Alon, Jingyue Shen, Daiyi Peng, Keshav Kumar, Yun Long, Xin Wang, Fotis Iliopoulos, Da-Cheng Juan, Erik Vee

Abstract

Neural Architecture Search (NAS) has become a popular method for discovering effective model architectures, especially for target hardware. As such, NAS methods that find optimal architectures under constraints are essential. In this paper, we propose LayerNAS to address the challenge of multi-objective NAS by transforming it into a combinatorial optimization problem, which effectively constrains the search complexity to be polynomial. For a model architecture with $L$ layers, we perform a layerwise search, selecting each layer's configuration from a set of search options $\mathbb{S}$. LayerNAS groups model candidates based on one objective, such as model size or latency, and searches for the optimal model based on another objective, thereby splitting the cost and reward elements of the search. This approach limits the search complexity to $O(H \cdot |\mathbb{S}| \cdot L)$, where $H$ is a constant set in LayerNAS. Our experiments show that LayerNAS is able to consistently discover superior models across a variety of search spaces in comparison to strong baselines, including search spaces derived from NATS-Bench, MobileNetV2 and MobileNetV3.
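The grouping step is what keeps the search polynomial: at each layer, partial architectures are bucketed by the constrained objective (e.g. latency or model size), and only the best candidate per bucket survives, so at most $H$ candidates are extended by the $|\mathbb{S}|$ options for the next layer. The sketch below illustrates this idea in Python; it is a minimal reading of the abstract, not the authors' implementation, and the helpers `estimate_cost` and `evaluate_quality` (which in practice would come from a latency/size model and an accuracy estimator) are hypothetical stand-ins.

```python
# Minimal sketch of a LayerNAS-style layerwise search, based only on the
# abstract above. `layer_options`, `estimate_cost`, and `evaluate_quality`
# are hypothetical helpers, not part of the published code.

from typing import Callable, Dict, List, Tuple


def layernas_search(
    num_layers: int,                                 # L: layers to search
    layer_options: List[List[str]],                  # S: candidate ops per layer
    estimate_cost: Callable[[List[str]], float],     # objective 1 (e.g. latency)
    evaluate_quality: Callable[[List[str]], float],  # objective 2 (e.g. accuracy)
    max_cost: float,
    num_buckets: int,                                # H: number of cost buckets
) -> List[str]:
    """Layerwise combinatorial search: bucket partial architectures by a
    cost objective, keep the best partial model per bucket, and extend
    layer by layer. Total evaluations are O(H * |S| * L)."""
    bucket_width = max_cost / num_buckets

    # Each bucket stores the best partial architecture seen for its cost
    # range: bucket index -> (quality, partial architecture).
    buckets: Dict[int, Tuple[float, List[str]]] = {0: (0.0, [])}

    for layer in range(num_layers):
        next_buckets: Dict[int, Tuple[float, List[str]]] = {}
        for _, partial in buckets.values():
            for option in layer_options[layer]:
                candidate = partial + [option]
                cost = estimate_cost(candidate)
                if cost > max_cost:
                    continue  # prune candidates violating the constraint
                b = min(int(cost / bucket_width), num_buckets - 1)
                # In practice a partial model might be completed with
                # default layers before its quality is estimated.
                quality = evaluate_quality(candidate)
                # Keep only the best candidate per cost bucket.
                if b not in next_buckets or quality > next_buckets[b][0]:
                    next_buckets[b] = (quality, candidate)
        buckets = next_buckets

    # Return the best complete architecture across all cost buckets.
    return max(buckets.values(), key=lambda qa: qa[0])[1]
```

Because every cost bucket retains a single best partial model, the frontier never grows beyond $H$ entries, so the total number of candidate evaluations is bounded by $H \cdot |\mathbb{S}| \cdot L$, matching the complexity stated in the abstract.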

