IEEE Access (Jan 2024)

Method for Expanding Search Space With Hybrid Operations in DynamicNAS

  • Iksoo Shin,
  • Changsik Cho,
  • Seon-Tae Kim

DOI
https://doi.org/10.1109/ACCESS.2024.3350732
Journal volume & issue
Vol. 12
pp. 10242–10253

Abstract

Recently, a novel neural architecture search method, referred to in this paper as DynamicNAS (Dynamic Neural Architecture Search), has shown great potential. Not only can models of various sizes be trained in a single training session through DynamicNAS, but the resulting subnets also outperform subnets trained by conventional methods. Despite these strengths over conventional NAS, DynamicNAS has the drawback that different types of operations cannot be used simultaneously within a layer as a search space. In this paper, we present a method that allows DynamicNAS to use different types of operations within a layer as a search space, without undermining the benefits of DynamicNAS, such as one-time training and superior subnet performance. Our experiments show that common operation-mixing methods, such as convex combination and set sampling, are inadequate for this problem, even though their structure is similar to that of the proposed method. From a supernet of hybrid operations, the proposed method finds a superior architecture that cannot be found in a single-operation supernet.
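
To make the two baselines named in the abstract concrete, the sketch below illustrates a supernet layer that holds two different types of operation and combines them either by convex combination (a softmax-weighted sum over all candidates, DARTS-style) or by set sampling (activating one randomly chosen candidate per step). This is a hypothetical illustration under stated assumptions, not the authors' implementation: the class name HybridLayer, the choice of candidate operations, and the mode switch are all invented here for exposition.

```python
# Minimal sketch of a hybrid-operation supernet layer (illustrative only).
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridLayer(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Two different *types* of operation in the same layer: a standard
        # 3x3 convolution and a depthwise-separable convolution (assumed
        # candidates; the paper's actual operation set is not shown here).
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=1, groups=channels),     # depthwise
                nn.Conv2d(channels, channels, kernel_size=1),  # pointwise
            ),
        ])
        # Architecture weights used only by the convex-combination baseline.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor, mode: str = "sample") -> torch.Tensor:
        if mode == "convex":
            # Convex combination: softmax-weighted sum of all candidate outputs.
            w = F.softmax(self.alpha, dim=0)
            return sum(wi * op(x) for wi, op in zip(w, self.ops))
        # Set sampling: run a single randomly chosen operation, so each
        # training step updates only one candidate path.
        return random.choice(self.ops)(x)

# Usage: one layer exercised in both baseline modes.
layer = HybridLayer(channels=16)
x = torch.randn(2, 16, 8, 8)
y_sampled = layer(x, mode="sample")  # set-sampling baseline
y_mixed = layer(x, mode="convex")    # convex-combination baseline
```

Per the abstract, both of these mixing schemes prove inadequate for hybrid-operation DynamicNAS despite their structural similarity to the proposed method; the paper's contribution is a scheme that preserves one-time supernet training while searching across operation types within a layer.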

Keywords