Generative Adversarial Neural Architecture Search with Importance Sampling

Despite the empirical success of neural architecture search (NAS) algorithms in deep learning applications, the optimality, reproducibility, and cost of NAS schemes remain hard to assess. Moreover, differences in the search spaces and experimental procedures adopted have hindered fair comparisons between specific search strategies. In this paper, we revisit search strategies in NAS and propose Generative Adversarial NAS (GA-NAS). Motivated by the fact that the search space grows exponentially with architecture size, GA-NAS is theoretically inspired by importance sampling for rare-event simulation: it iteratively refits a generator to previously discovered top architectures, thereby focusing increasingly on the important parts of the search space. GA-NAS adopts an efficient adversarial learning approach in which the generator is trained not on a large number of observed architecture performances but on the relative predictions made by a discriminator, significantly reducing the number of evaluations required. Extensive experiments show that GA-NAS outperforms a range of state-of-the-art search algorithms and beats the best published results on public benchmarks including NAS-Bench-101, NAS-Bench-201, and NAS-Bench-301. We further show that GA-NAS can handle ad-hoc search objectives and search spaces. In particular, on the EfficientNet macro search space, GA-NAS finds a new architecture with higher ImageNet accuracy and fewer parameters than EfficientNet-B0.
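
The following is a minimal sketch of the iterative importance-sampling idea described in the abstract, assuming a toy bit-vector search space, a dummy evaluator, and a simple per-bit Bernoulli sampler that is refit to the top architectures each round. All names and constants here are illustrative assumptions; the actual GA-NAS algorithm replaces this parametric sampler with a GAN generator trained adversarially against a discriminator on relative architecture rankings, which this sketch does not reproduce.

```python
# Toy illustration of "refit the sampler to previously discovered top
# architectures" (importance sampling for rare-event simulation).
# Assumptions: 20-bit architecture encodings and a dummy evaluate() proxy.
import numpy as np

rng = np.random.default_rng(0)
DIM = 20      # length of the toy architecture encoding (assumption)
POP = 200     # architectures sampled per iteration (assumption)
TOP_K = 20    # size of the "top" set the sampler is refit to
ITERS = 10

def evaluate(arch):
    """Dummy proxy for architecture accuracy (stand-in for real evaluation)."""
    return arch.sum() / DIM

# Start from a uniform sampling distribution over the search space.
probs = np.full(DIM, 0.5)

for it in range(ITERS):
    # Sample candidate architectures from the current sampling distribution.
    pop = (rng.random((POP, DIM)) < probs).astype(int)
    scores = np.array([evaluate(a) for a in pop])

    # Keep the top architectures found in this round (the rare-event region).
    elite = pop[np.argsort(scores)[-TOP_K:]]

    # Refit the sampler to the elites, shifting probability mass toward the
    # important parts of the search space (the importance-sampling step).
    probs = 0.5 * probs + 0.5 * elite.mean(axis=0)

    print(f"iter {it}: best score = {scores.max():.3f}")
```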
