IEEE Access (Jan 2021)

NNR-GL: A Measure to Detect Co-Nonlinearity Based on Neural Network Regression Regularized by Group Lasso

  • Miho Ohsaki,
  • Naoya Kishimoto,
  • Hayato Sasaki,
  • Ryoji Ikeura,
  • Shigeru Katagiri,
  • Kei Ohnishi,
  • Yakub Sebastian,
  • Patrick Then

DOI
https://doi.org/10.1109/ACCESS.2021.3111105
Journal volume & issue
Vol. 9
pp. 132033 – 132052

Abstract

To find keys to understanding and elucidating a phenomenon, it is essential to detect dependencies among variables, and measures for this purpose have been proposed. The correlation coefficient and its variants are the most common, but they detect only a linear dependence (co-linearity) between two variables. Some recent measures can detect a nonlinear dependence (co-nonlinearity) by means of kernelization or segmentation, but they handle only two variables and remain open to discussion regarding detection performance and difficulty of setup. There is thus room for a novel measure based on neural networks (NNs), since NNs usually aim at prediction rather than at the detection of variable dependencies. For the high-performance detection of co-nonlinearities among multiple variables, we propose a measure called NNR-GL, based on Neural Network Regression (NNR) regularized by Group Lasso (GL). NNR-GL realizes detection through multi-input single-output regression by NNR with GL regularization on the input layer. NNR-GL then quantifies how strong the detected co-nonlinearities are by unifying the regression performance and the weights on the input variables. We conducted experiments on artificial data to examine the behavior and fundamental effectiveness of NNR-GL. Performance was estimated by a comprehensive detection performance criterion (CDP-AUC for short), the mean of the areas under curves representing true positive and true negative detections. NNR-GL achieved CDP-AUC values from 0.7472 to 0.9681, where 0 means complete failure and 1 means complete success in detection. These values were consistently higher than those of the conventional measures, which ranged from 0.5972 to 0.9259, across all conditions of dependence type, data size, and noise rate. Consequently, the effectiveness and robustness of NNR-GL were clearly confirmed.
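The core idea of the abstract can be illustrated with a toy sketch: a small regression network whose input-layer weight rows (one group per input variable) are shrunk jointly by a group-lasso penalty, so that variables without a nonlinear dependence on the output are driven toward zero. This is only a minimal NumPy illustration under assumed hyperparameters (network size, penalty strength, optimizer), not the authors' NNR-GL implementation or its CDP-AUC evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends nonlinearly on x1 only; x2 is an irrelevant variable.
n = 400
X = rng.uniform(-2, 2, size=(n, 2))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.normal(size=n)

# One-hidden-layer regression network (tanh), trained by full-batch
# gradient descent on squared error plus a group-lasso penalty on the
# input-layer weights: each input variable's outgoing weight row is a group.
h = 16
W1 = 0.5 * rng.normal(size=(2, h))
b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=(h, 1))
b2 = np.zeros(1)
lam, lr = 0.01, 0.05           # assumed penalty strength and learning rate

for epoch in range(2000):
    Z = np.tanh(X @ W1 + b1)   # hidden activations
    pred = (Z @ W2 + b2).ravel()
    err = pred - y             # residuals
    # Backpropagation for the squared-error term
    gW2 = Z.T @ err[:, None] / n
    gb2 = err.mean(keepdims=True)
    dZ = err[:, None] * W2.T * (1 - Z ** 2)
    gW1 = X.T @ dZ / n
    gb1 = dZ.mean(axis=0)
    # Group-lasso subgradient: each row of W1 shrinks toward zero jointly
    norms = np.linalg.norm(W1, axis=1, keepdims=True) + 1e-12
    gW1 += lam * W1 / norms
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Group norms act as dependence scores: the relevant variable x1 should
# retain a much larger weight group than the irrelevant x2.
group_norms = np.linalg.norm(W1, axis=1)
print(group_norms)
```

In this spirit, a measure like NNR-GL would combine such per-variable weight magnitudes with the achieved regression quality; the exact unification used in the paper is not reproduced here.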

Keywords