Mathematics (Dec 2023)

First-Order Sparse TSK Nonstationary Fuzzy Neural Network Based on the Mean Shift Algorithm and the Group Lasso Regularization

  • Bingjie Zhang,
  • Jian Wang,
  • Xiaoling Gong,
  • Zhanglei Shi,
  • Chao Zhang,
  • Kai Zhang,
  • El-Sayed M. El-Alfy,
  • Sergey V. Ablameyko

DOI
https://doi.org/10.3390/math12010120
Journal volume & issue
Vol. 12, no. 1
p. 120

Abstract

Nonstationary fuzzy inference systems (NFISs) can handle uncertainty while avoiding the costly type-reduction operation required by type-2 fuzzy systems. Combining an NFIS with a neural network, this paper proposes a first-order sparse TSK nonstationary fuzzy neural network (SNFNN-1) that improves the interpretability of neural networks and the self-learning ability of fuzzy rules/sets. The architecture of SNFNN-1 can be viewed as an integrated model of multiple sub-networks whose membership functions vary in center, width, or noise; it is therefore able to model both “intra-expert” and “inter-expert” variability. The network adopts two techniques: Mean Shift-based fuzzy partitioning, which adaptively generates a suitable number of clusters, and Group Lasso-based rule selection, which retains only the important fuzzy rules. Quantitative experiments on six UCI datasets demonstrate the effectiveness and robustness of the proposed model.
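The two techniques named in the abstract can be illustrated in isolation. Below is a minimal sketch, not the authors' implementation: a naive 1-D Gaussian-kernel Mean Shift that finds cluster centers without fixing their number in advance (as the Mean Shift-based fuzzy partition does), and the Group Lasso proximal operator, which shrinks a whole parameter group to exactly zero when its norm is small (the mechanism by which Group Lasso regularization prunes entire fuzzy rules). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def mean_shift_1d(points, bandwidth=1.0, tol=1e-5, max_iter=300):
    """Naive 1-D Mean Shift: repeatedly move each point to the
    Gaussian-weighted mean of the data until convergence, then
    merge points that settled on the same mode.  The number of
    surviving modes *is* the number of clusters -- it is not
    specified in advance."""
    shifted = points.astype(float).copy()
    for _ in range(max_iter):
        new = np.empty_like(shifted)
        for i, x in enumerate(shifted):
            w = np.exp(-((points - x) ** 2) / (2 * bandwidth ** 2))
            new[i] = np.sum(w * points) / np.sum(w)
        done = np.max(np.abs(new - shifted)) < tol
        shifted = new
        if done:
            break
    # merge converged points that landed on (nearly) the same mode
    centers = []
    for x in np.sort(shifted):
        if not centers or abs(x - centers[-1]) > bandwidth / 2:
            centers.append(x)
    return np.array(centers)

def group_soft_threshold(w, lam):
    """Proximal operator of the Group Lasso penalty lam * ||w||_2:
    the whole group is shrunk toward zero and set exactly to zero
    when its norm falls below lam.  Applied per fuzzy rule, this is
    what deletes unimportant rules as a unit."""
    norm = np.linalg.norm(w)
    if norm <= lam:
        return np.zeros_like(w)
    return (1.0 - lam / norm) * w

# Two well-separated groups of samples -> Mean Shift finds 2 centers.
data = np.array([0.0, 0.1, -0.1, 5.0, 5.1, 4.9])
centers = mean_shift_1d(data, bandwidth=1.0)

# A rule's parameter group survives under a small penalty but is
# zeroed out (rule removed) under a large one.
kept = group_soft_threshold(np.array([3.0, 4.0]), lam=1.0)
removed = group_soft_threshold(np.array([3.0, 4.0]), lam=6.0)
```

The key contrast with plain Lasso is that the penalty acts on the Euclidean norm of each group, so an entire rule's consequent parameters vanish together rather than coordinate by coordinate.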

Keywords