IEEE Access (Jan 2023)

Neural Architecture Search Benchmarks: Insights and Survey

  • Krishna Teja Chitty-Venkata,
  • Murali Emani,
  • Venkatram Vishwanath,
  • Arun K. Somani

DOI
https://doi.org/10.1109/ACCESS.2023.3253818
Journal volume & issue
Vol. 11
pp. 25217 – 25236

Abstract


Neural Architecture Search (NAS), a promising and fast-moving research field, aims to automate the architectural design of Deep Neural Networks (DNNs) to achieve better performance on a given task and dataset. NAS methods have been very successful in discovering efficient models for various Computer Vision, Natural Language Processing, and other tasks. The major obstacles to the advancement of NAS techniques are the demand for large computational resources and the fair evaluation of different search methods: differences in training pipelines and settings make it challenging to compare the efficiency of two NAS algorithms. Over the last few years, a large number of NAS Benchmarks that simulate architecture evaluation in seconds have been released to ease the computational burden of training neural networks and to aid in the unbiased assessment of different search methods. This paper provides an extensive review of several publicly available NAS Benchmarks in the literature. We provide technical details and a deeper understanding of each benchmark and point out future directions.
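The core idea the abstract describes, that a tabular benchmark lets a search method "evaluate" an architecture in seconds by looking up precomputed training results instead of training from scratch, can be sketched as follows. This is a minimal illustrative sketch only; the architecture encodings, metric names, and numbers below are hypothetical and do not reflect the API of any real benchmark such as NAS-Bench-101.

```python
from typing import NamedTuple

class BenchResult(NamedTuple):
    val_accuracy: float   # validation accuracy after full training (precomputed)
    train_seconds: float  # the wall-clock training cost the table lets us skip

# Hypothetical tabular benchmark: each key encodes one operation per edge
# of a cell; each value stores metrics measured once, offline.
TABLE = {
    ("conv3x3", "conv1x1", "maxpool"): BenchResult(0.941, 3200.0),
    ("conv3x3", "conv3x3", "skip"):    BenchResult(0.946, 4100.0),
    ("conv1x1", "skip", "maxpool"):    BenchResult(0.912, 1800.0),
}

def query(arch):
    """Simulate evaluating an architecture in O(1) via table lookup."""
    return TABLE[arch]

# A toy "search": rank every candidate without training a single network,
# which is what makes fair, cheap comparison of search methods possible.
best = max(TABLE, key=lambda a: query(a).val_accuracy)
saved = sum(r.train_seconds for r in TABLE.values())
print(best, query(best).val_accuracy, saved)
```

Because every search method queries the same frozen table, differences in training pipelines no longer confound the comparison, which is exactly the evaluation problem the benchmarks surveyed here address.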

Keywords