IEEE Access (Jan 2021)

A Pruning Optimized Fast Learn++NSE Algorithm

  • Yong Chen,
  • Yuquan Zhu,
  • Haifeng Chen,
  • Yan Shen,
  • Zhao Xu

DOI
https://doi.org/10.1109/ACCESS.2021.3118568
Journal volume & issue
Vol. 9
pp. 150733–150743

Abstract

Because of its many typical applications, fast classification learning over continuously accumulated big data in nonstationary environments is an important and urgent research problem. The Learn++.NSE algorithm is one of the important results in this field, and a pruned version, Learn++.NSE-Error-based, was proposed to improve learning efficiency on accumulated big data. However, studies have found that Learn++.NSE-Error-based often prunes the newly generated base classifier before the next ensemble round, which reduces the accuracy of the ensemble classifier. Since the newly generated base classifier is crucial to the next round of ensemble learning, it should be retained. Therefore, the two most recent base classifiers are exempted from pruning, and a new pruning algorithm, NewLearn++.NSE-Error-based, is proposed. Experimental results on a generated dataset and a real-world dataset show that NewLearn++.NSE-Error-based further improves the accuracy of the ensemble classifier while keeping the same time complexity as the Learn++.NSE algorithm, making it suitable for fast classification learning of long-term accumulated big data.
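The core idea described in the abstract can be illustrated with a short sketch. This is a minimal, hypothetical illustration of error-based ensemble pruning that always protects the two most recent base classifiers; the function and parameter names (prune_ensemble, classifiers, errors, max_size) are assumptions for illustration and do not reproduce the authors' actual implementation or weighting scheme.

```python
def prune_ensemble(classifiers, errors, max_size):
    """Prune an ensemble to at most `max_size` members.

    classifiers : base classifiers, ordered oldest -> newest
    errors      : error rate of each classifier on the newest data chunk
    max_size    : maximum number of classifiers to keep
    """
    if len(classifiers) <= max_size:
        return classifiers, errors

    # The two most recent base classifiers are exempt from pruning,
    # since they carry the most information about the current concept.
    protected = list(zip(classifiers[-2:], errors[-2:]))
    candidates = list(zip(classifiers[:-2], errors[:-2]))

    # Error-based pruning: among the remaining candidates, keep those
    # with the lowest error on the most recent data chunk.
    candidates.sort(key=lambda pair: pair[1])
    kept = candidates[: max_size - len(protected)] + protected

    kept_classifiers = [c for c, _ in kept]
    kept_errors = [e for _, e in kept]
    return kept_classifiers, kept_errors
```

Because the pruning step only sorts the existing error estimates, this modification does not change the asymptotic cost of each ensemble update, which is consistent with the abstract's claim of unchanged time complexity.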

Keywords