IET Computer Vision (Aug 2021)

Control the number of skip‐connects to improve robustness of the NAS algorithm

  • Bao Feng Zhang,
  • Guo Qiang Zhou

DOI
https://doi.org/10.1049/cvi2.12036
Journal volume & issue
Vol. 15, no. 5
pp. 356–365

Abstract

Recently, gradient‐based neural architecture search (NAS) has made remarkable progress, characterised by high efficiency and fast convergence. However, two common problems are found in gradient‐based NAS algorithms. First, as training time increases, the NAS algorithm tends to favour the skip‐connect operation, leading to performance degradation and unstable results. Second, computing resources are not reasonably allocated to the valuable candidate network models. These two points make it difficult to search for the optimal sub‐network and lead to poor stability. To address them, the trick of pre‐training the super‐net is applied, so that each operation has an equal opportunity to develop its strength, which provides a fair competition condition for the convergence of the architecture parameters. In addition, a skip‐controller is proposed to ensure that each sampled sub‐network has an appropriate number of skip‐connects. Experiments were performed on three mainstream datasets, CIFAR‐10, CIFAR‐100 and ImageNet, on which the improved method achieves comparable results with higher accuracy and stronger robustness.
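The abstract does not specify how the skip‐controller is implemented; the following is a minimal illustrative sketch of one plausible mechanism, assuming a DARTS‐style candidate operation list. The `OPS` list, the `skip_controller` name, the cap of two skip‐connects, and the fallback to each edge's second‐best operation are all assumptions for illustration, not the authors' actual method.

```python
import torch

# Hypothetical DARTS-style candidate operations (an assumption, not from the paper).
OPS = ['none', 'max_pool_3x3', 'avg_pool_3x3', 'skip_connect',
       'sep_conv_3x3', 'sep_conv_5x5', 'dil_conv_3x3', 'dil_conv_5x5']
SKIP_IDX = OPS.index('skip_connect')

def skip_controller(alphas: torch.Tensor, max_skips: int = 2):
    """Derive one operation per edge from architecture parameters `alphas`
    (shape: [num_edges, num_ops]), keeping at most `max_skips` skip-connects.

    Edges whose strongest operation is skip-connect are ranked by that
    operation's weight; only the top `max_skips` keep it, and the rest
    fall back to their second-best operation.
    """
    weights = torch.softmax(alphas, dim=-1)
    choices = weights.argmax(dim=-1).tolist()

    # Edges currently assigned a skip-connect, strongest first.
    skip_edges = [e for e, op in enumerate(choices) if op == SKIP_IDX]
    skip_edges.sort(key=lambda e: weights[e, SKIP_IDX].item(), reverse=True)

    # Demote surplus skip-connects to each edge's next-best operation.
    for e in skip_edges[max_skips:]:
        runner_up = weights[e].argsort(descending=True)[1].item()
        choices[e] = runner_up
    return [OPS[i] for i in choices]

# Example: random architecture parameters for a cell with 14 edges.
alphas = torch.randn(14, len(OPS))
print(skip_controller(alphas, max_skips=2))
```

Capping skip‐connects per sampled sub‐network in this way directly counters the tendency of gradient‐based NAS to collapse toward parameter‐free skip operations as training proceeds, which is the instability the paper identifies.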

Keywords