IEEE Access (Jan 2019)

An Evolutionary Approach to Compact DAG Neural Network Optimization

  • Carter Chiu
  • Justin Zhan

DOI
https://doi.org/10.1109/ACCESS.2019.2954795
Journal volume & issue
Vol. 7
pp. 178331–178341

Abstract

Neural networks are at the cutting edge of artificial intelligence, demonstrated to reliably outperform other machine learning techniques. Within the domain of neural networks, many different classes of architectures have been developed for tasks in specific subfields, along with considerable diversity in activation functions, loss functions, and other hyperparameters. These networks are often large and computationally expensive to train and deploy, which restricts their utility. Furthermore, the fundamental theory behind the effectiveness of particular network architectures and hyperparameters is often not well understood, and as such, practitioners frequently resort to trial and error to optimize model performance. To address these concerns, we propose the use of compact directed acyclic graph neural networks (DAG-NNs) and an evolutionary approach for automating the optimization of their structure and parameters. Our experimental results demonstrate that our approach consistently outperforms conventional neural networks, even while employing fewer nodes.
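To make the two ingredients named in the abstract concrete, the Python sketch below evolves a tiny DAG-NN on XOR. It is an illustration only, not the authors' algorithm: the strictly upper-triangular adjacency mask, tanh activation, mutation-only elitist search, and negative-MSE fitness are all assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)

def random_genome(n_in, n_hidden, n_out, p_edge=0.5):
    # A genome is a strictly upper-triangular adjacency mask plus a weight
    # matrix; ordering nodes 0..n-1 and allowing only forward edges
    # guarantees the graph is acyclic.
    n = n_in + n_hidden + n_out
    adj = np.triu(rng.random((n, n)) < p_edge, 1)
    w = rng.normal(0.0, 1.0, (n, n))
    return {"n_in": n_in, "n_out": n_out, "n": n, "adj": adj, "w": w}

def forward(g, x):
    # Visit nodes in topological (index) order; each non-input node
    # applies tanh to the weighted sum of its predecessors.
    vals = np.zeros(g["n"])
    vals[:g["n_in"]] = x
    for j in range(g["n_in"], g["n"]):
        vals[j] = np.tanh(vals[:j] @ (g["w"][:j, j] * g["adj"][:j, j]))
    return vals[-g["n_out"]:]

def mutate(g, sigma=0.1, p_flip=0.02):
    # Structural mutation flips edges on or off; parametric mutation
    # perturbs weights with Gaussian noise.
    child = {k: v.copy() if isinstance(v, np.ndarray) else v for k, v in g.items()}
    child["w"] += rng.normal(0.0, sigma, child["w"].shape)
    child["adj"] ^= np.triu(rng.random((g["n"], g["n"])) < p_flip, 1)
    return child

def fitness(g, X, y):
    # Negative mean squared error, so higher is better.
    preds = np.array([forward(g, x)[0] for x in X])
    return -np.mean((preds - y) ** 2)

# Toy task: XOR, which a single linear unit cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

pop = [random_genome(2, 4, 1) for _ in range(30)]
for gen in range(300):
    pop.sort(key=lambda g: fitness(g, X, y), reverse=True)
    elites = pop[:10]
    pop = elites + [mutate(elites[rng.integers(10)]) for _ in range(20)]

best = max(pop, key=lambda g: fitness(g, X, y))
print("best fitness (negative MSE):", round(fitness(best, X, y), 4))
print("active edges:", int(best["adj"].sum()), "of", best["n"] * (best["n"] - 1) // 2)

Restricting edges to an upper-triangular mask is one simple way to keep every candidate acyclic without an explicit cycle check; the paper itself may use different representations, variation operators, or selection schemes.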

Keywords